BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Mean field approach to learning dynamics in neural networks
DTSTART:20220829T100000
DTEND:20220829T120000
DTSTAMP:20260509T210112Z
UID:aa3dda90b9f0c4a8ed0a91cf4e68dcebbf7c420766097243c405e716
CATEGORIES:Conferences - Seminars
DESCRIPTION:Emanuele Troiani\nEDIC candidacy exam\nExam president: Prof.
  Florent Krzakala\nThesis advisor: Prof. Lenka Zdeborová\nCo-examiner:
  Prof. Nicolas Macris\n\nAbstract\nGradient-based optimisation algorithms
  are invaluable tools for training neural networks. Over the years\, a
  number of variants and optimisations have been proposed that empirically
  increase performance. While their advantage can be measured in practice\,
  a comprehensive theoretical description of the dynamics\, especially in
  non-convex landscapes\, is still in the making. In the following we
  review an early approach to the dynamics of linear networks\, as well
  as an effective description of mean-field dynamics in shallow neural
  networks\, which was recently proven to be rigorous. We believe that
  this procedure can be expanded to describe a much wider range of
  settings.\n\nBackground papers\nStochasticity helps to navigate rough
  landscapes: comparing gradient-descent-based algorithms in the phase
  retrieval problem\nhttps://iopscience.iop.org/article/10.1088/2632-2153
 /ac0615/pdf\n\nExact solutions to the nonlinear dynamics of learning in
  deep linear neural networks\nhttps://arxiv.org/pdf/1312.6120.pdf\n\nThe
  high-dimensional asymptotics of first order methods with random data
 \nhttps://arxiv.org/pdf/2112.07572.pdf
LOCATION:
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
