BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Distributed Learning under Communication Constraints
DTSTART:20190717T161500
DTEND:20190717T171500
DTSTAMP:20260505T222806Z
UID:8c4d7abf73b697ecd894f2e2f687d92ad8fe11af24e2b2f469cf37d2
CATEGORIES:Conferences - Seminars
DESCRIPTION:Prof. Ayfer Özgur Aydin - Stanford University\nWe develop an 
 information-theoretic framework for learning high-dimensional distribution
 s and their parameters under communication constraints. We use this framew
 ork  to prove tight minimax bounds for the distributed estimation of com
 mon statistical models (including the Gaussian location model\, discrete d
 istribution estimation\, non-parametric estimation\, etc.). Our results revea
 l how the communication constraint impacts the estimation accuracy. We the
 n use our theoretical framework to devise a communication-efficient distri
 buted training algorithm for deep neural networks. Experimental evaluation
  suggests an order-of-magnitude improvement in communication efficiency ov
 er state-of-the-art gradient compression algorithms.
LOCATION:INR 113 https://plan.epfl.ch/?room=INR113
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
