BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:FLAIR seminar: The Mysterious Optimization Dynamics of Deep Learni
 ng
DTSTART:20231208T131500
DTEND:20231208T141500
DTSTAMP:20260407T021039Z
UID:edaebf14849f297c9600e7fe5b44c4f49ab6e8ff105d150853c938f6
CATEGORIES:Conferences - Seminars
DESCRIPTION:Fabian Pedregosa\nGradient descent with large step sizes oft
 en exhibits a regime called the Edge of Stability\, characterized by an i
 nitial increase in the largest eigenvalue of the loss Hessian (progressiv
 e sharpening)\, followed by a stabilization near the maximum value (edge o
 f stability). This behavior is inconsistent with several widespread assum
 ptions in optimization\, so understanding this phenomenon is crucial to u
 nderstanding and designing better training methods. In the first part of t
 he talk I’ll describe empirical results providing evidence for this Edg
 e of Stability phenomenon. In the second part I’ll describe a simple an
 d tractable model consisting of a quartic polynomial that provably exhibi
 ts Edge of Stability. Finally\, in the third part I’ll present empiric
 al results describing the interplay between sharpness and step-size tuner
 s. Understanding this interplay is crucial for unlocking the full potenti
 al of automatic step-size tuners.
LOCATION:GA 3 21 https://plan.epfl.ch/?room==GA%203%2021
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
