BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Accumulation points of gradient-based optimization methods: condit
 ions for point-wise convergence and saddle avoidance
DTSTART:20230707T151500
DTEND:20230707T171500
DTSTAMP:20260407T183851Z
UID:6a05e660e45432a7ead6000c47a289cf3c37c00b9122b6e9d29c806b
CATEGORIES:Conferences - Seminars
DESCRIPTION:Andreea-Alexandra Musat\nEDIC candidacy exam\nExam president: 
 Prof. Nicolas Flammarion\nThesis advisor: Prof. Nicolas Boumal\nCo-examine
 r: Prof. Lénaïc Chizat\n\nAbstract\nStandard convergence results in opti
 mization guarantee that accumulation points of the sequence of iterates ar
 e critical points of the cost function. However\, this does not imply that
  the sequence converges\, nor does it exclude convergence to bad critical 
 points\, such as saddles. We review a set of sufficient conditions for ens
 uring that the iterates converge and demonstrate how classic algorithms sa
 tisfy these conditions under their usual assumptions. Additionally\, we sk
 etch the main arguments for showing that gradient descent is unlikely to c
 onverge to saddles exhibiting a direction of negative curvature\, thus est
 ablishing almost sure convergence to a local minimum.\n\nBackground papers\
 n\n	First-order methods almost always avoid strict saddle points --- J. D
 . Lee\, I. Panageas\, G. Piliouras\, M. Simchowitz\, M. I. Jordan & B. Re
 cht --- paper link\n	Convergence of the Iterates of Descent Methods for A
 nalytic Cost Functions --- P. A. Absil\, R. Mahony\, & B. Andrews --
 - paper link\n	Global convergence of the gradient method for functions def
 inable in o-minimal structures --- C. Josz --- paper link\n
LOCATION:BC 233 https://plan.epfl.ch/?room==BC%20233
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
