FLAIR external seminar: New statistical phenomena for entropic optimal transport

Event details
Date: 24.04.2023
Hour: 13:15
Speaker: Austin J. Stromme
Location:
Category: Conferences - Seminars
Title: New statistical phenomena for entropic optimal transport
Speaker: Austin J. Stromme (MIT)
Abstract: Optimal transport (OT) is a popular framework for comparing and interpolating probability measures, which has recently been used in diverse application areas throughout science, including generative modeling, cellular biology, graphics, and beyond. Unfortunately, recent work has shown that OT suffers from a severe statistical curse of dimensionality. In practice, however, the unregularized OT problem is less common than its entropically regularized approximation, known as entropic OT, which affords the use of simpler and more scalable algorithms.
Motivated by its ubiquity in practice, as well as the curse of dimensionality for its unregularized counterpart, in this talk we identify two new statistical phenomena for entropic OT, in the form of bounds on the rate at which empirical quantities converge to their population counterparts. Our first set of bounds applies to high-dimensional settings and gives fully dimension-free convergence, albeit with exponential dependence on the regularization parameter. Our second set of bounds applies to data distributions with potentially small intrinsic dimension, in which case we show that the dimension dependence is not only intrinsic to the data distributions, but is in fact automatically the minimum of the intrinsic dimensions of the two distributions involved. We show that these phenomena hold for entropic OT value estimation and, more generally, for entropic OT map and density estimation. We conclude with applications to transfer learning and trajectory reconstruction. Our simple proof techniques are inspired by convex optimization, and notably avoid empirical process theory almost entirely.
Based on joint work with Philippe Rigollet.
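The "simpler and more scalable algorithms" that the abstract attributes to entropic OT are, in practice, Sinkhorn-type matrix-scaling iterations. As background for the talk, here is a minimal NumPy sketch (not from the talk itself; the sample sizes, the regularization parameter `eps`, and all variable names are illustrative assumptions) of computing an empirical entropic OT plan between two sampled measures:

```python
import numpy as np

# Illustrative setup: empirical measures from two point clouds.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))   # samples from the source measure
Y = rng.normal(size=(60, 2))   # samples from the target measure
a = np.full(50, 1 / 50)        # uniform weights on the source samples
b = np.full(60, 1 / 60)        # uniform weights on the target samples

# Squared-Euclidean cost matrix between all pairs of samples.
C = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)

eps = 1.0                      # entropic regularization parameter (assumed)
K = np.exp(-C / eps)           # Gibbs kernel

# Sinkhorn iterations: alternately rescale rows and columns so the
# plan's marginals match a and b.
u = np.ones(50)
for _ in range(1000):
    v = b / (K.T @ u)
    u = a / (K @ v)

P = u[:, None] * K * v[None, :]   # entropic OT plan
cost = (P * C).sum()              # linear (transport) part of the entropic OT value
```

The empirical quantities studied in the talk (entropic OT value, map, and density estimates) are all functionals of such a plan computed from samples rather than from the population measures.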
Practical information
- Informed public
- Free
Organizer
- Lénaïc Chizat
- François Ged