BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:IC Colloquium: Optimization\, Learning and Systems
DTSTART:20160215T101500
DTEND:20160215T113000
DTSTAMP:20260407T115126Z
UID:5697fad9cc1e042896beb8ace92eb36987619c0fa0d409f552c038d7
CATEGORIES:Conferences - Seminars
DESCRIPTION:By: Martin Jaggi - ETHZ\nIC Faculty candidate\nAbstract:\nThe
  massive growth of available data has moved data analysis and machine learn
 ing to center stage in many industrial as well as scientific fields\, rang
 ing from astronomy to health science\, web and sensor data\, and countless
  other applications. In this talk\, we focus on the computational challeng
 es of machine learning on large datasets through the lens of mathematical 
 optimization.\nSignificant recent research aims to improve the efficiency\
 , scalability\, and theoretical understanding of iterative optimization al
 gorithms used for training machine learning models. Motivated by widely us
 ed regression and classification techniques\, we discuss new results for f
 irst-order optimization techniques\, empowering them with primal-dual cert
 ificates and convergence guarantees which are valuable for practitioners. 
 We also illustrate how Frank-Wolfe and coordinate descent algorithms can h
 elp to trade off accuracy against complexity (measured for example in spar
 sity or rank) of machine learning models.\nWhile single-machine solvers ha
 ve been highly optimized\, they cannot easily be transferred to the distr
 ibuted setting\, i.e.\, when the dataset exceeds the storage capacity of a s
 ingle computer. For the users of machine learning methods\, this lack of g
 enerality of learning algorithms is becoming increasingly frustrating\, as
  the complexity and variability of the underlying systems are increasing\,
  for example in terms of differences in communication\, computation\, and mem
 ory speeds.\nThis highlights the necessity for new learning algorithms whi
 ch are able to efficiently adapt to the available compute environment\, sp
 anning a wide range from cheap public cloud computing stacks to more tradi
 tional HPC systems. At the same time\, the theoretical efficiency guarante
 es should ideally be adaptive to the real system properties as well. Final
 ly\, open source optimization software combined with public benchmarks can
  help industrial users navigate the increasingly complex landscape of comm
 ercial big data software frameworks.\nBio:\nMartin Jaggi is a post-doctoral
  researcher in machine learning at ETH Zurich. Before that\, he was a rese
 arch fellow at the Simons Institute in Berkeley\, US\, working on the theo
 ry of big data analysis\, and a post-doctoral researcher at École Polytec
 hnique in Paris\, France. He earned his PhD in Machine Learning and Op
 timization from ETH Zurich in 2011\, and an MSc in Mathematics\, also from
  ETH Zurich\, interrupted by several shorter stints in industry (Google\, N
 etbreeze\, Avaloq). He is broadly interested in methods for the analysis o
 f large datasets\, distributed training algorithms\, open source software\,
  and machine learning applications\, for example in medicine\, computer vis
 ion\, and text analysis. He is a co-founder of the startup SpinningBytes.com\,
  and also the founder and co-organizer of the Zurich Machine Learning and 
 Data Science Meetup\, the largest technology Meetup group in Switzerland.
LOCATION:BC 420 https://plan.epfl.ch/?room=BC%20420
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
