LASER: Linear Compression in Wireless Distributed Optimization

Event details

Date 04.12.2023
Hour 16:15 – 17:15
Speaker Dr. Ashok Makkuva (LINX)
Category Conferences - Seminars
Event Language English
Abstract: Data-parallel SGD is the de facto algorithm for distributed optimization, especially for large-scale machine learning. Despite its merits, the communication bottleneck remains one of its persistent issues. Most compression schemes designed to alleviate it either assume noiseless communication links or fail to achieve good performance on practical tasks. In this work, we close this gap and introduce LASER: LineAr CompreSsion in WirEless DistRibuted Optimization. LASER capitalizes on the inherent low-rank structure of gradients and transmits them efficiently over noisy channels. Whilst enjoying theoretical guarantees similar to those of classical SGD, LASER shows consistent gains over baselines on a variety of practical benchmarks. In particular, it outperforms state-of-the-art compression schemes on challenging computer vision and GPT language-modeling tasks. On the latter, we obtain a 50-64% improvement in perplexity over our baselines for noisy channels. (Joint work with Marco Bondaschi, Thijs Vogels, Martin Jaggi, Hyeji Kim, and Michael Gastpar.)
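The abstract's key idea, that gradients are approximately low rank and can therefore be sent as small factor matrices rather than in full, can be illustrated with a minimal sketch. This is not the LASER algorithm itself (which also handles the noisy channel); it is a hypothetical rank-r compression of a gradient matrix via truncated SVD, showing the reduction in the number of values transmitted:

```python
import numpy as np

def low_rank_compress(grad, r):
    """Return factors (P, Q) with grad ≈ P @ Q.T, each of rank r."""
    U, S, Vt = np.linalg.svd(grad, full_matrices=False)
    P = U[:, :r] * S[:r]   # shape (m, r), singular values folded in
    Q = Vt[:r].T           # shape (n, r)
    return P, Q

def decompress(P, Q):
    """Reconstruct the (approximate) gradient from its factors."""
    return P @ Q.T

rng = np.random.default_rng(0)
# Synthetic "gradient" with exactly rank-4 structure, standing in for
# the approximately low-rank gradients the abstract refers to.
grad = rng.standard_normal((256, 4)) @ rng.standard_normal((4, 128))

P, Q = low_rank_compress(grad, r=4)
rel_err = np.linalg.norm(grad - decompress(P, Q)) / np.linalg.norm(grad)
ratio = grad.size / (P.size + Q.size)  # floats sent: full vs. factored
print(f"relative error: {rel_err:.2e}, compression ratio: {ratio:.1f}x")
```

Here a 256x128 gradient (32,768 floats) is transmitted as two rank-4 factors (1,536 floats), roughly a 21x reduction, with negligible reconstruction error because the synthetic gradient is exactly low rank. On real gradients the error is nonzero, and schemes in this family typically feed the residual back into the next round's gradient.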
Bio: Ashok is a postdoctoral researcher at EPFL with Michael Gastpar. He obtained his PhD in ECE from the University of Illinois at Urbana-Champaign in August 2022, advised by Pramod Viswanath and Sewoong Oh. He is a recipient of the Best Paper Award at ACM MobiHoc 2019. For more details, please visit https://ashokvardhan.github.io/

Practical information

  • Informed public
  • Free

Organizer

  • IPG Seminar
