BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:AI Center - Research Seminar Series - Francis Bach
DTSTART:20250904T160000
DTEND:20250904T170000
DTSTAMP:20260501T160643Z
UID:7b8e6baab71321d3a8de3322921b8de574f4e73ca6409cbbcb9bef68
CATEGORIES:Conferences - Seminars
DESCRIPTION:Francis Bach\nThe talk is organized by the EPFL AI Center a
 s part of its main Research Seminar Series.\n\nTitle\nOptimization in Mac
 hine Learning: From Convexity to Non-Convexity\n\nAbstract\nOptimizatio
 n algorithms—such as gradient descent and its stochastic variants—are 
 fundamental tools in modern machine learning. Over the past fifteen years\
 , the research landscape has evolved significantly: the early emphasis on 
 convex optimization with strong theoretical guarantees (particularly for l
 inear models) has gradually shifted toward the challenges of non-convex op
 timization\, which underpins more complex models like neural networks and 
 often lacks such guarantees. In this talk\, I will survey key theoretical 
 insights and empirical findings from both domains\, highlighting the role 
 of convexity—whether explicit or implicit—in shaping our understanding
  of optimization. I will also discuss emerging directions for future resea
 rch\, both within machine learning and in the broader context of optimizat
 ion theory.\n\nBio\nFrancis Bach is a researcher at Inria\, where since 2
 011 he has led the machine learning team in the Computer Science depar
 tment at Ecole Normale Supérieure. He graduated from Ecole Polytechnique 
 in 1997 and completed his Ph.D. in Computer Science at U.C. Berkeley in 20
 05\, working with Professor Michael Jordan. He spent two years in the Math
 ematical Morphology group at Ecole des Mines de Paris\; then he joined the
  computer vision project-team at Inria/Ecole Normale Supérieure from 2007
  to 2010. Francis Bach is primarily interested in machine learning\, and e
 specially in sparse methods\, kernel-based learning\, neural networks\, an
 d large-scale optimization. He published the book "Learning Theory from Fi
 rst Principles" through MIT Press in 2024. He obtained in 2009 a Starting 
 Grant and in 2016 a Consolidator Grant from the European Research Council\
 , and received the Inria young researcher prize in 2012\, the ICML test-of
 -time award in 2014 and 2019\, the NeurIPS test-of-time award in 2021\, as
  well as the Lagrange prize in continuous optimization in 2018\, and the J
 ean-Jacques Moreau prize in 2019. In 2020 he was elected to the French A
 cademy of Sciences. In 2015\, he was program co-chair of the Internation
 al Conference on Machine Learning (ICML)\, general chair in 2018\, and preside
 nt of its board between 2021 and 2023\; he was co-editor-in-chief of the J
 ournal of Machine Learning Research between 2018 and 2023.\n 
LOCATION:MA B1 11 https://plan.epfl.ch/?room==MA%20B1%2011 https://epfl.zo
 om.us/j/69565733001?pwd=JsMLTp4pRGaGldbBYRzrMuzixTi7Fr.1
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
