BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Understanding Width and Depth in Neural Networks: A Signal Propaga
 tion Approach
DTSTART:20230623T131500
DTEND:20230623T141500
DTSTAMP:20260414T032059Z
UID:8acea5b0c466466fe6ab6c2d42a2c988b775a19fd372c42d75ad489c
CATEGORIES:Conferences - Seminars
DESCRIPTION:Soufiane Hayou\nThe study of signal propagation in deep neural
  networks has yielded a number of interesting discoveries\, both theoretic
 al and practical. These include insights into neural network scaling\, ini
 tialization schemes\, and efficient feature learning. By default\, this fr
 amework considers the infinite width limit of the covariance kernel within
  a network\, while maintaining a fixed depth. However\, recent research in
 dicates that this framework might not accurately capture numerous practic
 al scenarios where\, for example\, the width is comparable to the depth. I
 n this prese
 ntation\, I will discuss various scaling regimes in deep networks and show
  that when the width and depth of a (properly scaled) deep neural network 
 with skip connections are taken to infinity\, the resulting covariance str
 ucture remains invariant regardless of how this limit is taken. This resul
 t has both theoretical and practical implications\, which I will cover in 
 this talk.
LOCATION:GA 3 21 https://plan.epfl.ch/?room=GA%203%2021
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
