BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Adaptive gradient algorithm on non-convex optimization
DTSTART:20190828T130000
DTEND:20190828T150000
DTSTAMP:20260415T235714Z
UID:4df7267001fd04e5537f0ca723909e5a718b7fb2a658a4ccd27f122d
CATEGORIES:Conferences - Seminars
DESCRIPTION:Chaehwan Song\nEDIC candidacy exam\nExam president: Prof. Patr
 ick Thiran\nThesis advisor: Prof. Volkan Cevher\nCo-examiner: Prof. Ali Sa
 yed\n\nAbstract\nFirst-order gradient methods are the most basic and power
 ful strategies for optimization problems\, and their applications lie in n
 early every field of science and technology. Adaptive gradient methods ar
 e variants of first-order gradient descent that automatically adjust the l
 earning rate using past gradients\, and numerous experiments show that th
 is yields fast and robust convergence. Adaptive gradient methods therefor
 e have many applications\, especially in neural network training. However
 \, most of their convergence analyses focus on convex optimization proble
 ms\, and the analysis of adaptive methods in the nonconvex setting has on
 ly recently begun to attract interest. Our research goal is to develop a s
 olid convergence analysis of modern adaptive gradient methods for nonconv
 ex optimization problems. In this proposal\, we discuss three related pie
 ces of research. We first introduce adaptive gradient methods and some we
 ll-known algorithms\, from the traditional Adagrad to modern methods incl
 uding AcceleGrad. We then discuss the pros and cons of these methods\, co
 mparing them with plain gradient descent (GD) and stochastic gradient des
 cent (SGD). Finally\, we present recent research on the convergence analy
 sis of adaptive gradient methods for nonconvex optimization.\n\nBackgroun
 d papers\nThe Marginal Value of Adaptive Gradient Methods in Machine Lear
 ning\, by Wilson\, A. C.\, et al.\nOnline Adaptive Methods\, Universality
  and Acceleration\, by Kfir Y. Levy\, Alp Yurtsever\, Volkan Cevher.\nOn
  the Convergence of A Class of Adam-Type Algorithms for Non-Convex Optimi
 zation\, by Chen\, X.\, et al.
LOCATION:DIA 004 https://plan.epfl.ch/?room==DIA%20004
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
