BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:IC Colloquium: Second-Order Model Compression at Scale: How to Eff
 iciently Run Your 175-Billion Parameter Model on a Single GPU
DTSTART:20221103T161500
DTEND:20221103T173000
DTSTAMP:20260406T173012Z
UID:d5f2a8175c547b61bd2c25cacc42d5e875e767f64408b3abec5ada41
CATEGORIES:Conferences - Seminars
DESCRIPTION:By: Dan Alistarh - IST Austria\n\nAbstract\nA key barrier to t
 he wide deployment of highly-accurate machine learning models is their hi
 gh computational and memory overhead. Although we have the mathematical to
 ols for highly-accurate compression of such models\, for instance via the
  Optimal Brain Surgeon framework (LeCun et al.\, 1990) and its many extens
 ions\, these theoretically-elegant techniques require second-order (curvat
 ure) information of the model’s loss function\, which is hard to even ap
 proximate efficiently at scale.\n\nIn this talk\, I will describe our work
  on bridging this computational divide\, which enables for the first time 
 accurate second-order pruning and quantization of models at truly massive 
 scale. Our running example will be the 175-Billion-parameter GPT-3/OPT lan
 guage generation model: compressed using our techniques\, it can now be ru
 n efficiently on a single GPU\, with negligible accuracy loss.\n\nBio\nDa
 n Alistarh is a Professor at IST Austria\, in Vienna. Previously\, he was 
 a Researcher with Microsoft\, and a Postdoc at MIT CSAIL. He received hi
 s PhD from the EPFL\, under the brilliant guidance of Prof. Rachid Guerrao
 ui. His research is on algorithms for efficient machine learning and high-
 performance computing\, with a focus on scalable DNN inference and traini
 ng\, for which he was awarded an ERC Starting Grant in 2018. In his spar
 e time\, he leads the ML research team at Neural Magic\, a startup based
 in Boston\, MA.
LOCATION:BC 420 https://plan.epfl.ch/?room=BC%20420 https://epfl.zoom.us/
 j/64611494153
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
