BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Practical Byzantine-Resilient Distributed Machine Learning
DTSTART:20180625T103000
DTEND:20180625T123000
DTSTAMP:20260407T183628Z
UID:e87034f6f25d1940768cee6b60c960a084262c5b0b5bff785d43413e
CATEGORIES:Conferences - Seminars
DESCRIPTION:Sébastien Rouault\nEDIC candidacy exam\nExam president: Prof.
  Martin Jaggi\nThesis advisor: Prof. Rachid Guerraoui\nCo-examiner: Prof.
  Patrick Thiran\n\nAbstract\nThe resurgence of machine learning over the
  last two decades has opened new technological doors\, empowering indus
 tries and the people who use it.\nTo make the most of ever-growing datas
 ets and tackle more complex problems\, machine learning researchers and
  practitioners explore larger models.\nSuch endeavors demand equally sub
 stantial computational power and\, as of today\, the only realistic appr
 oach to training such models consists of distributing the computational
  burden over numerous machines.\n\nBut what if an adversarial entity cont
 rols some of these machines?\nIn the standard parameter server framework
 \, a single compromised machine can trivially stymie the learning proces
 s.\nWorse: with an estimate of the gradients computed by the "honest" ma
 chines\, this adversary could just as easily control the whole learning
  process.\nThis fact is concerning given the impact machine learning alr
 eady has on our society.\n\nI will present my research on theoretically
  sound and practical solutions to this latent issue.\n\nBackground paper
 s\nMachine Learning with Adversaries: Byzantine Tolerant Gradient Descen
 t\, by Blanchard\, P.\, et al.\nTensorFlow: A System for Large-Scale Mach
 ine Learning\, by Abadi\, M.\, et al.\nFederated Learning: Strategies fo
 r Improving Communication Efficiency\, by Konecny\, J.\, et al.
LOCATION:BC 329 https://plan.epfl.ch/?room==BC%20329
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
