Theory of Neural Nets Seminar: 5th July 2021


Event details

Date 05.07.2021
Hour 16:30–17:30
Category Conferences - Seminars

This seminar consists of talks about current research on the theory of neural networks. Every session lasts one hour and comprises a talk (about 30 minutes) followed by a discussion with questions from the audience.

Speaker: Matus Telgarsky (University of Illinois)

Title: Logistic regression explicitly maximizes margins; should we stop training early?

Abstract: This talk will present two perspectives on the behavior of gradient descent with the logistic loss: on the one hand, it seems we should run as long as possible, and achieve good margins; on the other, stopping early seems necessary for noisy problems.  In the first part, focused on the linear case, a new perspective of explicit bias (rather than implicit bias) yields new analyses and algorithms with margin maximization rate as fast as 1/t^2 (whereas prior work had 1/sqrt{t} at best).  The second part, focused on shallow ReLU networks, argues that the margin bias might fail to be ideal, and that stopping early can achieve consistency and calibration for arbitrary classification problems.  Moreover, this early phase is still adaptive to data simplicity, but with a different bias than the margin bias.

Joint work with Ziwei Ji, Justin D. Li, and Nati Srebro.
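The first phenomenon the abstract mentions, that running gradient descent on the logistic loss longer and longer keeps improving the margin on separable data, can be illustrated with a toy example. The sketch below is not from the talk: it uses a made-up 2-D dataset and plain gradient descent, and simply tracks the normalized margin min_i y_i⟨w, x_i⟩ / ‖w‖ over time.

```python
# Hypothetical illustration (not code from the talk): gradient descent on the
# logistic loss over a linearly separable toy dataset.  The normalized margin
# keeps growing as training continues, matching the "run as long as possible"
# behaviour described in the abstract.
import math

# Toy separable data: (x, y) pairs with labels y in {-1, +1}.
data = [((2.0, 1.0), 1), ((1.5, 2.0), 1),
        ((-1.0, -1.5), -1), ((-2.0, -0.5), -1)]

def logistic_grad(w):
    """Average gradient of log(1 + exp(-y <w, x>)) over the dataset."""
    g = [0.0, 0.0]
    for (x, y) in data:
        s = y * (w[0] * x[0] + w[1] * x[1])
        coef = -y / (1.0 + math.exp(s))  # d/ds log(1 + exp(-s)) = -1/(1 + exp(s))
        g[0] += coef * x[0]
        g[1] += coef * x[1]
    return [gi / len(data) for gi in g]

def normalized_margin(w):
    """min_i y_i <w, x_i> / ||w||; positive once w separates the data."""
    norm = math.hypot(w[0], w[1])
    if norm == 0.0:
        return 0.0
    return min(y * (w[0] * x[0] + w[1] * x[1]) for (x, y) in data) / norm

w = [0.0, 0.0]
lr = 1.0
margins = []
for t in range(2000):
    g = logistic_grad(w)
    w = [w[0] - lr * g[0], w[1] - lr * g[1]]
    if t in (9, 99, 1999):  # snapshot the margin at a few checkpoints
        margins.append(normalized_margin(w))

print(margins)  # the normalized margin grows with more training
```

On separable data the logistic loss has no finite minimizer, so ‖w‖ grows without bound while the direction w/‖w‖ drifts toward the maximum-margin separator; the rates quoted in the abstract (1/t² versus the earlier 1/√t) measure how fast this directional convergence happens.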


Practical information

  • Expert
  • Free

Contact

  • François Ged: francois.ged[at]epfl.ch
