BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Entropy and mutual information in models of deep neural networks
DTSTART:20181102T140000
DTEND:20181102T150000
DTSTAMP:20260506T084527Z
UID:1c43809f49724c2a7763da13a7490c9e6fc0f276e31e6d5d6f552065
CATEGORIES:Conferences - Seminars
DESCRIPTION:Marylou Gabrié\nThe successes and the multitude of applicatio
 ns of deep learning methods have spurred efforts towards quantitative mode
 ling of the performance of deep neural networks. In particular\, an inform
 ation-theoretic approach linking generalization capabilities to compressio
 n has been receiving increasing interest. Nevertheless\, it is in practice
  computationally intractable to compute entropies and mutual informations 
 in industry-sized neural networks. In this talk\, we will consider instead
  a class of models of deep neural networks\, for which an expression for t
 hese information-theoretic quantities can be derived from the replica meth
 od. We will examine how mutual informations between hidden and input varia
 bles can be reported along the training of such neural networks on synthet
 ic datasets. Finally we will discuss the numerical results of a few traini
 ng experiments.\n\nThis work was done in collaboration with Andre Manoel (
 Owkin)\, Clément Luneau (EPFL)\, Jean Barbier (EPFL)\, Nicolas Macris (EP
 FL)\, Florent Krzakala (LPS ENS) and Lenka Zdeborova (IPHT CEA).
LOCATION:BSP 727 https://plan.epfl.ch/?room=BSP%20727
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
