Various Perspectives on the Contraction of Markov Kernels

Event details

Date 02.09.2024
Hour 13:00 – 15:00
Speaker Adrien Vandenbroucque
Category Conferences - Seminars
EDIC candidacy exam
Exam president: Prof. Emre Telatar
Thesis advisor: Prof. Michael Gastpar
Co-examiner: Prof. Nicolas Macris

Abstract
In recent years, understanding the speed at which Markov chains converge to their stationary distributions has attracted much attention, owing to its relevance for sampling from complex probability measures. In this proposal, we study the contraction of Markov kernels from multiple angles. First, we consider the evolution of Markov processes under various functional inequalities, such as (modified) log-Sobolev inequalities and Poincaré inequalities. We then explore a link with information theory through the well-known data processing inequality. Specifically, we highlight the connection between the best constant in the log-Sobolev inequality and the so-called strong data-processing constant of discrete channels. Finally, we turn to non-linear Sobolev-type inequalities and present the improvements they yield in the strength of contraction of Markov kernels.
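For total variation distance, the contraction of a discrete Markov kernel mentioned in the abstract can be quantified by the Dobrushin coefficient, which coincides with the strong data-processing constant for that distance. The sketch below (an illustrative example, not from the talk itself; the function name and the binary-symmetric-channel instance are my own choices) computes it for a row-stochastic matrix:

```python
def dobrushin_coefficient(K):
    """Dobrushin contraction coefficient of a row-stochastic kernel K,
    given as a list of rows (each row a probability vector):

        delta(K) = 0.5 * max_{x, x'} || K(x, .) - K(x', .) ||_1.

    For total variation it acts as a strong data-processing constant:
    || pK - qK ||_TV <= delta(K) * || p - q ||_TV for all p, q.
    """
    n = len(K)
    return max(
        0.5 * sum(abs(a - b) for a, b in zip(K[i], K[j]))
        for i in range(n) for j in range(n)
    )

# Binary symmetric channel with crossover probability eps:
# the coefficient evaluates to |1 - 2*eps|.
eps = 0.1
K = [[1 - eps, eps], [eps, 1 - eps]]
delta = dobrushin_coefficient(K)  # 0.8 here
```

A coefficient strictly below 1 guarantees exponential mixing in total variation; the functional inequalities discussed in the abstract refine this picture for other divergences, such as relative entropy.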

Background papers
- Modified Log-Sobolev Inequalities, Mixing and Hypercontractivity (https://dl.acm.org/doi/10.1145/780542.780586)
- Logarithmic Sobolev Inequalities and Strong Data Processing Theorems for Discrete Channels (https://ieeexplore.ieee.org/document/6620260)
- Improved Log-Sobolev Inequalities, Hypercontractivity and Uncertainty Principle on the Hypercube (https://www.sciencedirect.com/science/article/pii/S0022123619302435)

Practical information

  • General public
  • Free
