Optimization algorithms for heterogeneous federated learning


Event details

Date 07.12.2021
Hour 17:00–18:00
Location Online
Category Conferences - Seminars
Event Language English
Title for the talk: 
Optimization algorithms for heterogeneous federated learning
 
Abstract for the talk:
A traditional machine learning pipeline involves collecting massive amounts of data centrally on a server and training models to fit the data. However, growing concerns about the privacy and security of users' data, combined with the sheer growth in data sizes, have incentivized looking beyond such centralized approaches. Federated learning instead proposes that a network of data holders collaborate to train models without transmitting any raw data. This new paradigm minimizes data exposure, but it inherently faces fundamental optimization challenges posed by non-IID data across the users. We will discuss our understanding of, and progress in tackling, these problems.
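
To make the paradigm concrete, below is a minimal, illustrative FedAvg-style simulation in Python (not the speaker's implementation or the SCAFFOLD/Mime methods covered in the talk). Synthetic clients hold non-IID local data, run a few local gradient steps, and the server only ever averages model parameters; all names and hyperparameters here are hypothetical choices for the sketch.

import numpy as np

# Minimal FedAvg-style sketch on a least-squares objective.
# Each client holds its own (non-IID) data; only model weights
# are exchanged with the server -- raw data never leaves a client.

rng = np.random.default_rng(0)
dim, n_clients, n_local = 5, 4, 50

# Non-IID setup: each client's data is generated around a different
# local optimum, which is what causes "client drift" under local updates.
client_data = []
for i in range(n_clients):
    w_local = rng.normal(loc=i, scale=1.0, size=dim)  # distinct per-client target
    X = rng.normal(size=(n_local, dim))
    y = X @ w_local + 0.1 * rng.normal(size=n_local)
    client_data.append((X, y))

def local_update(w, X, y, lr=0.01, steps=10):
    """Run a few local gradient steps on one client's data."""
    w = w.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

w_global = np.zeros(dim)
for round_ in range(20):
    # Each client starts from the current global model and trains locally.
    local_models = [local_update(w_global, X, y) for X, y in client_data]
    # The server averages parameters only; it never sees the clients' data.
    w_global = np.mean(local_models, axis=0)

loss = np.mean([np.mean((X @ w_global - y) ** 2) for X, y in client_data])
print(f"average client loss after 20 rounds: {loss:.3f}")

Because the clients' local optima differ, plain local training plus averaging drifts away from the global optimum; correcting this drift is the focus of the SCAFFOLD and Mime papers listed below.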
 
Papers covered:
1. SCAFFOLD: Stochastic Controlled Averaging for Federated Learning https://arxiv.org/abs/1910.06378
2. Mime: Mimicking Centralized Stochastic Algorithms in Federated Learning. https://arxiv.org/abs/2008.03606
 
Bio: 
Dr. Praneeth Karimireddy recently finished his PhD at EPFL advised by Prof. Martin Jaggi. His main research interest is developing intelligence infrastructure for collaborative learning. His research has been awarded with a Dimitris N. Chorafas Foundation Prize, and a best paper award at FL-ICML 2021. Supported by an SNSF fellowship, he will be joining as a postdoc with Mike Jordan's group at UC Berkeley in Spring 2022.

Practical information

  • General public
  • Free
