BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Mock talk: Robust Distributed Learning and Robust Learning Machine
 s
DTSTART:20200312T150000
DTEND:20200312T160000
DTSTAMP:20260407T002902Z
UID:85ef751ecf3c0ee437daca10d61f62c030709e2d5bc06028427b0018
CATEGORIES:Conferences - Seminars
DESCRIPTION:El Mahdi El Mhamdi\nAbstract:\nDistributed optimization
  has been a blessing for scaling up machine learning and was enabled
  by the naturally parallelisable Stochastic Gradient Descent (SGD)
  algorithm. Distribution\, however\, creates inevitable robustness
  issues. In this talk\, we study the robustness of distributed SGD and
  begin by proving that its standard deployment is brittle\, as this
  deployment typically consists of averaging the inputs from each
  learner (or equally brittle variants of averaging). This leads to
  harmful consequences\, as the data used in machine learning comes
  from different and potentially unreliable sources. To account for the
  various types of failures (data poisoning\, malicious users\,
  software bugs\, communication delays\, hacked machines\, etc.)\, we
  adopt the general abstraction of arbitrary failures in distributed
  systems\, namely the Byzantine failures abstraction. We provide a
  sufficient condition for SGD to be Byzantine resilient and present
  three algorithms that satisfy our condition under different
  configurations.\n\nThe key algorithms introduced in this talk are (1)
  Krum\, a gradient aggregation rule (GAR) that we prove to be a robust
  alternative to averaging in synchronous settings\; (2) Bulyan\, a
  meta-algorithm that we prove to strengthen any given GAR in very
  high-dimensional situations\; and (3) Kardam\, a gradient filtering
  scheme that we prove to be Byzantine resilient in the more
  challenging asynchronous setting. For each of our algorithms\, we
  also discuss a few variants as well as their practical
  limitations.\n\nShort bio\nEl Mahdi El Mhamdi graduated from the
  École Polytechnique and obtained a PhD in Computer Science at EPFL\,
  where his thesis is currently nominated for the best thesis award. He
  established several results on the vulnerability and safety of
  distributed machine learning algorithms\, some of which are now the
  de facto standard for assessing the robustness of distributed machine
  learning. In parallel to his PhD\, he wrote a book on technical AI
  safety\, published by EDP Sciences.\nBefore his PhD\, he created
  Wandida\, a university-level library of video tutorials on computer
  science\, physics and mathematics with over 4 million views\, and
  Mamfakinch\, a news platform in Morocco that won the 2012 Google and
  Global Voices Breaking Borders Award.\n
LOCATION:BC 03 https://plan.epfl.ch/?room=BC%2003
STATUS:CANCELLED
END:VEVENT
END:VCALENDAR
