Statistical mechanics of transfer learning in fully-connected networks in the proportional limit

Event details

Date 21.01.2025
Hour 10:15 – 11:15
Speaker Federica Gerace (SISSA)
Location
Category Conferences - Seminars
Event Language English

Transfer learning (TL) is a well-established machine learning technique for boosting generalization performance on a specific (target) task using information gained from a related (source) task, and its success crucially depends on the network's ability to learn useful features. Leveraging recent analytical progress in the proportional regime of deep learning theory (i.e., the limit where both the training-set size P and the hidden-layer size N are taken to infinity while their ratio α = P/N is kept finite), in this work we develop a novel single-instance Franz-Parisi formalism that yields an effective theory for TL in fully-connected neural networks. Unlike the (lazy-training) infinite-width limit, where TL is ineffective, we demonstrate that TL does occur in the proportional limit, driven by a renormalized source-target kernel that quantifies the relatedness of the two tasks and determines whether TL is beneficial for generalization.
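As a rough illustration of the setup the abstract describes (not the speaker's formalism), the sketch below pretrains a one-hidden-layer fully-connected network on a source task, transfers its hidden layer to a related target task, and compares against training from scratch, with the data sized so that α = P/N stays finite. All concrete choices here (the teacher vectors, the perturbation strength relating the tasks, α, the optimizer) are illustrative assumptions, not details from the talk or the underlying paper.

```python
# Minimal transfer-learning sketch in the proportional regime (P/N fixed).
# Task construction and hyperparameters are hypothetical placeholders.
import torch

torch.manual_seed(0)

D, N = 50, 200          # input dimension, hidden-layer width
alpha = 0.5             # proportional regime: alpha = P/N held finite
P = int(alpha * N)      # training-set size

# Related source/target teachers: target = source + small perturbation
w_source = torch.randn(D)
w_target = w_source + 0.3 * torch.randn(D)

def make_task(teacher, n):
    X = torch.randn(n, D) / D**0.5
    y = torch.tanh(X @ teacher)
    return X, y

def make_net():
    return torch.nn.Sequential(
        torch.nn.Linear(D, N), torch.nn.Tanh(), torch.nn.Linear(N, 1)
    )

def train(net, X, y, steps=2000, lr=1e-2):
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = ((net(X).squeeze() - y) ** 2).mean()
        loss.backward()
        opt.step()
    return net

# 1) Train on the source task.
Xs, ys = make_task(w_source, P)
source_net = train(make_net(), Xs, ys)

# 2) Transfer: copy the learned hidden layer, then retrain on target data.
Xt, yt = make_task(w_target, P)
tl_net = make_net()
tl_net[0].load_state_dict(source_net[0].state_dict())
tl_net = train(tl_net, Xt, yt)

# 3) Baseline: train from scratch on the target task alone.
scratch_net = train(make_net(), Xt, yt)

# Compare generalization on fresh target data.
Xtest, ytest = make_task(w_target, 5000)
with torch.no_grad():
    for name, net in [("transfer", tl_net), ("scratch", scratch_net)]:
        err = ((net(Xtest).squeeze() - ytest) ** 2).mean().item()
        print(f"{name:>8s} test MSE: {err:.4f}")
```

In this toy setting, whether the transferred network beats the scratch baseline depends on how closely the two teachers are aligned, which loosely mirrors the role the abstract assigns to the renormalized source-target kernel as a measure of task relatedness.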

Practical information

  • Informed public
  • Free

Organizer

  • Lénaïc Chizat
