Mock talk: Robust Distributed Learning and Robust Learning Machines

Cancelled

Event details

Date 12.03.2020
Hour 15:00–16:00
Speaker El Mahdi El Mhamdi
Location
Category Conferences - Seminars
Abstract:
Distributed optimization has been a blessing for scaling up machine learning and was enabled by the naturally parallelisable Stochastic Gradient Descent (SGD) algorithm. Distribution, however, creates inevitable robustness issues. In this talk, we study the robustness of distributed SGD and begin by proving that its standard deployment is brittle, as this deployment typically consists of averaging the inputs from each learner (or equally brittle variants of averaging). This has harmful consequences, since the data used in machine learning comes from different and potentially unreliable sources. To account for the various types of failures (data poisoning, malicious users, software bugs, communication delays, hacked machines, etc.), we adopt the general abstraction of arbitrary failures in distributed systems, namely the Byzantine failure abstraction. We provide a sufficient condition for SGD to be Byzantine resilient and present three algorithms that satisfy our condition under different configurations.
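To make the brittleness claim concrete, here is a minimal sketch (not from the talk; the numbers and setup are illustrative assumptions) showing that the mean of worker gradients is a linear function of every input, so a single Byzantine worker can drag the aggregate arbitrarily far from the honest gradient:

```python
# Illustrative sketch: plain averaging of gradients is not Byzantine resilient.
import numpy as np

rng = np.random.default_rng(0)
honest = rng.normal(loc=1.0, scale=0.1, size=(9, 3))  # 9 honest gradient estimates
byzantine = np.full((1, 3), -1e6)                     # 1 adversarial gradient
grads = np.vstack([honest, byzantine])

print(np.mean(honest, axis=0))  # close to the true gradient, roughly [1, 1, 1]
print(np.mean(grads, axis=0))   # dominated by the single attacker
```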

The key algorithms introduced in this talk are (1) Krum, a gradient aggregation rule (GAR) that we prove to be a robust alternative to averaging in synchronous settings; (2) Bulyan, a meta-algorithm that we prove to strengthen any given GAR in very high-dimensional situations; and (3) Kardam, a gradient filtering scheme that we prove to be Byzantine resilient in the more challenging asynchronous setting. For each of our algorithms, we also present a few variants and discuss their practical limitations.
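The following is a hedged sketch of a Krum-style GAR, based on the published description of Krum (Blanchard et al., NeurIPS 2017); the function and parameter names are mine, not from the talk. Given n worker gradients of which at most f may be Byzantine, each gradient is scored by the sum of squared distances to its n − f − 2 nearest neighbours, and the lowest-scoring gradient is selected instead of the mean:

```python
# Sketch of a Krum-style gradient aggregation rule (assumed interface, not the talk's code).
import numpy as np

def krum(gradients: np.ndarray, f: int) -> np.ndarray:
    """Select one gradient out of n, tolerating up to f Byzantine workers."""
    n = len(gradients)
    assert n > 2 * f + 2, "Krum requires n > 2f + 2"
    # Pairwise squared Euclidean distances between all gradients.
    dists = np.sum((gradients[:, None, :] - gradients[None, :, :]) ** 2, axis=-1)
    scores = []
    for i in range(n):
        d = np.delete(dists[i], i)           # distances to the other gradients
        closest = np.sort(d)[: n - f - 2]    # keep the n - f - 2 nearest neighbours
        scores.append(closest.sum())
    return gradients[int(np.argmin(scores))]
```

In the averaging sketch above, replacing np.mean(grads, axis=0) with krum(grads, f=1) would return one of the honest gradients rather than a value dominated by the attacker.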


Short bio
El Mahdi El Mhamdi graduated from the École Polytechnique and obtained a PhD in Computer Science at EPFL, where his thesis is currently nominated for the best thesis award. He established several results on the vulnerability and safety of distributed machine learning algorithms, some of which are now the de facto standard for assessing the robustness of distributed machine learning. In parallel to his PhD, he wrote a book on technical AI safety, published by EDP Sciences.
Before his PhD, he created Wandida, a university-level library of video tutorials on computer science, physics and mathematics with over 4 million views, and Mamfakinch, a news platform in Morocco that won the 2012 Google and Global Voices Breaking Borders Award.

Practical information

  • General public
  • Free
