Understanding Width and Depth in Neural Networks: A Signal Propagation Approach


Event details

Date 23.06.2023
Hour 13:15 – 14:15
Speaker Soufiane Hayou
Category Conferences - Seminars
Event Language English

The study of signal propagation in deep neural networks has yielded a number of interesting discoveries, both theoretical and practical. These include insights into neural network scaling, initialization schemes, and efficient feature learning. By default, this framework considers the infinite-width limit of the covariance kernel within a network, while keeping the depth fixed. However, recent research indicates that this framework might not accurately capture numerous practical scenarios, for example when the width is comparable to the depth. In this presentation, I will discuss various scaling regimes in deep networks and show that when the width and depth of a (properly scaled) deep neural network with skip connections are taken to infinity, the resulting covariance structure remains invariant regardless of how this limit is taken. I will cover both the theoretical and practical implications of this result.
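As an illustration of the kind of invariance the abstract describes, the sketch below simulates a residual network at random initialization with its residual branch scaled by 1/sqrt(depth), one common "properly scaled" choice, and tracks the empirical output covariance as width and depth grow together. The specific architecture, activation, and scaling here are illustrative assumptions, not the speaker's exact setup.

```python
import numpy as np

def forward_cov(n, L, num_inputs=2, seed=0):
    """Empirical output covariance of a width-n, depth-L residual net
    at random init, with the residual branch scaled by 1/sqrt(L).
    (Illustrative setup; the talk's exact scaling is not specified here.)
    """
    rng = np.random.default_rng(seed)
    # fixed inputs, normalized so that each column has squared norm n,
    # i.e. the initial covariance diagonal is exactly 1
    x = rng.standard_normal((n, num_inputs))
    x = x / np.linalg.norm(x, axis=0, keepdims=True) * np.sqrt(n)
    h = x.copy()
    for _ in range(L):
        W = rng.standard_normal((n, n)) / np.sqrt(n)  # variance-1/n init
        h = h + np.tanh(W @ h) / np.sqrt(L)           # scaled skip connection
    return (h.T @ h) / n  # (num_inputs x num_inputs) covariance estimate

# the diagonal of the covariance stays O(1) as width and depth
# are scaled up together, rather than blowing up or collapsing
for n, L in [(128, 16), (256, 32), (512, 64)]:
    C = forward_cov(n, L)
    print(n, L, round(float(C[0, 0]), 3))
```

Without the 1/sqrt(L) factor on the residual branch, the same loop makes the covariance grow with depth, which is one way to see why the abstract emphasizes a *properly scaled* network.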

Practical information

  • Informed public
  • Free

Organizer

  • Lénaïc Chizat, François Ged

Tags

Theory of deep learning
