Information theory with kernel methods

Event details

Date 07.02.2024
Hour 13:15 – 15:15
Speaker Francis Bach
Location
Category Conferences - Seminars
Event Language English

Estimating and computing entropies of probability distributions are key computational tasks throughout data science. In many situations, the underlying distributions are only known through the expectation of some feature vector, which has led to a series of works within kernel methods, with applications to generative modeling and probabilistic inference. In this talk, I will explore the particular situation where the feature vector is a rank-one positive definite matrix, and show how the associated expectations (a covariance matrix) can be used with information divergences from quantum information theory to draw direct links with the classical notions of Shannon entropies.
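The construction described in the abstract can be sketched numerically: average rank-one matrices φ(x)φ(x)ᵀ over samples to form an empirical covariance matrix, normalize it to unit trace so it behaves like a density matrix, and compute its von Neumann entropy −tr(Σ log Σ). The random-Fourier-feature map and all parameter choices below are illustrative assumptions, not the talk's actual construction.

```python
import numpy as np

def von_neumann_entropy(sigma):
    """Quantum (von Neumann) entropy -tr(Sigma log Sigma) of a PSD matrix."""
    eigvals = np.linalg.eigvalsh(sigma)
    eigvals = eigvals[eigvals > 1e-12]   # drop numerically zero eigenvalues
    return -np.sum(eigvals * np.log(eigvals))

# Hypothetical feature map: random Fourier features (assumed for illustration)
rng = np.random.default_rng(0)
d, D, n = 1, 200, 5000
w = rng.normal(size=(D, d))              # random frequencies
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def phi(x):
    """Feature vector; phi(x) phi(x)^T is a rank-one PSD matrix."""
    return np.sqrt(2.0 / D) * np.cos(x @ w.T + b)

x = rng.normal(size=(n, d))              # samples from a standard Gaussian
features = phi(x)
sigma = features.T @ features / n        # empirical E[phi(x) phi(x)^T]
sigma /= np.trace(sigma)                 # unit trace: a valid density matrix
print(von_neumann_entropy(sigma))        # a nonnegative entropy estimate
```

For the maximally mixed matrix I/D, this entropy equals log D, the expected analogue of the uniform distribution's Shannon entropy; how such quantities relate to classical entropies of the sampling distribution is the subject of the talk.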

Practical information

  • Informed public
  • Free

Organizer

  • Lénaïc Chizat
  • Matthieu Wyart
