BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Theory of Neural Nets Seminar: 10th May 2021
DTSTART:20210510T163000
DTEND:20210510T173000
DTSTAMP:20260427T210156Z
UID:decf2c58c4e2bf60bce70c8f9edca809af27cb1ed3f0725b1309b3a8
CATEGORIES:Conferences - Seminars
DESCRIPTION:Greg Yang (Microsoft Research)\nThis seminar consists of talks
  about current research on the theory of neural networks. Every session la
 sts one hour and comprises a talk (about 30 minutes) followed by a discuss
 ion with questions from the audience.\n\nSpeaker: Greg Yang (Microsoft Res
 earch)\n\nTitle: Feature Learning in Infinite-Width Neural Networks\n\nAb
 stract: As its width tends to infinity\, a deep neural network’s behavi
 or under gradient descent can become simplified and predictable (e.g. give
 n by the Neural Tangent Kernel (NTK))\, if it is parametrized appropriatel
 y (e.g. the NTK parametrization). However\, we show that the standard and 
 NTK parametrizations of a neural network do not admit infinite-width limit
 s that can learn representations (i.e. features)\, which is crucial for pr
 etraining and transfer learning such as with BERT. We propose simple modif
 ications to the standard parametrization to allow for feature learning in 
 the limit. Using the *Tensor Programs* technique\, we derive explicit form
 ulas for such limits. On Word2Vec and few-shot learning on Omniglot via MA
 ML\, two canonical tasks that rely crucially on feature learning\, we comp
 ute these limits exactly. We find that they outperform both NTK baselines 
 and finite-width networks\, with the latter approaching the infinite-width
  feature learning performance as width increases.\nMore generally\, we
  classify a natural space of neural network parametrizations that generali
 zes standard\, NTK\, and Mean Field parametrizations. We show 1) any param
 etrization in this space either admits feature learning or has an infinite
 -width training dynamics given by kernel gradient descent\, but not both\;
  2) any such infinite-width limit can be computed using the Tensor Program
 s technique.
LOCATION:https://epfl.zoom.us/j/62277760614?pwd=cGhIUlNqSGhYNU1QVVVKZWYrc0
 tLZz09
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
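
Note (appended after the calendar object, not part of the event data): the abstract's central dichotomy — NTK-parametrized networks stop moving their features as width grows, while a rescaled parametrization keeps feature learning alive — can be illustrated numerically on a toy model. The sketch below is not the speaker's code and does not use the Tensor Programs machinery; it is a minimal numpy experiment under assumed details: a two-layer tanh network f(x) = n^(-a) * v . phi(U x), squared loss, one SGD step, with the textbook scalings a = 1/2 and Theta(1) learning rate for the NTK parametrization versus a = 1 and Theta(n) learning rate for a mean-field / muP-style parametrization.

    # Toy illustration (assumed setup, not the paper's derivation): measure the
    # mean per-coordinate change of the hidden features phi(U x) after one SGD
    # step, as width n grows, under two parametrizations of f(x) = n^-a * v.phi(Ux).
    import numpy as np

    rng = np.random.default_rng(0)

    def feature_shift(width, a, lr, d=16):
        """Mean |delta phi(U x)| per coordinate after one SGD step on (x, y)."""
        x = rng.standard_normal(d) / np.sqrt(d)   # input with ||x|| ~ 1
        y = 1.0                                   # scalar regression target
        U = rng.standard_normal((width, d))       # hidden weights, Theta(1) entries
        v = rng.standard_normal(width)            # output weights, Theta(1) entries
        pre = U @ x                               # preactivations
        f = width ** (-a) * (v @ np.tanh(pre))    # network output
        # Gradient of 0.5*(f - y)^2 w.r.t. the hidden weights U
        grad_U = width ** (-a) * (f - y) * np.outer(v * (1 - np.tanh(pre) ** 2), x)
        U_new = U - lr * grad_U
        return np.mean(np.abs(np.tanh(U_new @ x) - np.tanh(pre)))

    for n in [256, 1024, 4096, 16384]:
        ntk = feature_shift(n, a=0.5, lr=1.0)      # NTK scaling: a=1/2, lr Theta(1)
        mup = feature_shift(n, a=1.0, lr=1.0 * n)  # mean-field/muP-style: a=1, lr Theta(n)
        print(f"width {n:6d}:  NTK shift ~ {ntk:.5f}   mean-field shift ~ {mup:.5f}")

With these scalings the NTK column should shrink roughly like 1/sqrt(n) while the mean-field column stays roughly constant as width grows, mirroring the kernel-regime versus feature-learning dichotomy the abstract describes.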
