Towards a complete theory of representation learning and generalization in linear Bayesian neural networks


Event details

Date 26.01.2024
Hour 13:15 - 14:15
Speaker Jacob Zavatone-Veth (Harvard) 
Location
Category Conferences - Seminars
Event Language English

Understanding how representation learning affects generalization is among the foremost goals of modern deep learning theory. In this talk, I will discuss the significant recent progress that has been made towards understanding perhaps the simplest toy model for deep representation learning: deep linear Bayesian neural networks. For these models, we can obtain a precise asymptotic characterization of generalization and representation learning, and in some cases even obtain closed-form solutions at finite size. I will conclude by commenting on the remaining gaps in our understanding and on the transferability of these insights to nonlinear models.
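For readers unfamiliar with the model class named in the abstract, the sketch below illustrates a deep linear Bayesian neural network: a map f(x) = W_L ... W_1 x with independent Gaussian priors on the weights. The depth, widths, and prior scale are illustrative assumptions, not the speaker's setup; the example only shows that the prior over outputs becomes heavier-tailed (non-Gaussian) with depth, one reason the Bayesian analysis of these models is nontrivial.

```python
# Minimal sketch (illustrative assumptions, not from the talk):
# a deep linear Bayesian neural network f(x) = W_L ... W_1 x with
# independent Gaussian priors on the weights.
import numpy as np

rng = np.random.default_rng(0)

def sample_prior_network(depth, width, d_in, d_out, prior_var=1.0):
    """Draw one weight configuration from the Gaussian prior; return f(x) = W_L ... W_1 x."""
    dims = [d_in] + [width] * (depth - 1) + [d_out]
    # Scale each layer's prior variance by 1/fan_in so the output variance stays O(1).
    weights = [
        rng.normal(0.0, np.sqrt(prior_var / dims[l]), size=(dims[l + 1], dims[l]))
        for l in range(depth)
    ]
    def f(x):
        h = x
        for W in weights:
            h = W @ h
        return h
    return f

# Prior predictive at a fixed input: deeper linear networks yield heavier-tailed
# output distributions (kurtosis grows with depth), even though each layer is Gaussian.
x = rng.normal(size=3)
for depth in (1, 3, 6):
    samples = np.array([
        sample_prior_network(depth, width=50, d_in=3, d_out=1)(x)[0]
        for _ in range(2000)
    ])
    kurt = np.mean(samples**4) / np.mean(samples**2) ** 2
    print(f"depth {depth}: prior-predictive kurtosis ~ {kurt:.2f}")
```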

Practical information

  • Informed public
  • Free

Organizer

  • Lénaïc Chizat

Contact

  • lenaic.chizat@epfl.ch
