About $f$-divergences, the Data Processing Inequality and their implications for statistics


Event details

Date 13.02.2020
Hour 14:00 - 16:00
Speaker Pierre Quinton
Category Conferences - Seminars
EDIC candidacy exam
Exam president: Prof. Michael Gastpar
Thesis Advisor: Prof. Emre Telatar
Co-examiner: Dr. Nicolas Macris

Abstract
It is well known that some information measures satisfy the Data Processing Inequality (DPI): processing
random variables through a Markov kernel cannot increase information.
Typically, estimation is carried out under the assumption that the phenomenon, the observation, and the estimator form a Markov chain.
This assumption can be relaxed by requiring instead that the information between the phenomenon and the estimator be no greater than the information
between the phenomenon and the observation. The optimization then depends on the observation only through its mutual information with the phenomenon. This makes estimation easier, but the solution to the minimization under an information constraint is not, strictly speaking, an estimator, and this creates a gap in distortion.
Sometimes this gap can be reduced, or even closed, by choosing the right information measure.
In this regard, we will restrict our attention to a specific class of information measures known as $f$-divergences.
In cases where no information measure closes the gap, another approach is to explore the tightness of the DPI associated with a chosen measure.
This, in turn, amounts to analyzing the contraction parameter of what is known in the literature as the strong DPI.
This document addresses three works on these topics: the first concerns relationships among $f$-divergences,
the second explores the strong data processing inequality, and the last shows a way of doing estimation under mutual information constraints.
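The DPI for the KL divergence can be illustrated numerically: pushing two distributions through the same Markov kernel (channel) cannot increase the divergence between them. A minimal sketch, using arbitrary example distributions and an arbitrary channel (none of these numbers come from the talk):

```python
import numpy as np

def kl(p, q):
    # KL divergence D(p || q) for discrete distributions with full support
    return float(np.sum(p * np.log(p / q)))

# Two example input distributions (arbitrary choices for illustration)
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])

# An example Markov kernel: each row is a conditional output distribution
K = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.8, 0.1],
              [0.2, 0.2, 0.6]])

# Push both input distributions through the same channel
pK = p @ K
qK = q @ K

# Data Processing Inequality: divergence cannot increase after processing
assert kl(pK, qK) <= kl(p, q)
print(kl(p, q), kl(pK, qK))
```

The ratio kl(pK, qK) / kl(p, q), maximized over input pairs, is the contraction parameter studied in the strong DPI.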

Background papers
On pairs of $f$-divergences and their joint range, by P. Harremoës and I. Vajda, arXiv preprint arXiv:1007.0097, 2010.
On hypercontractivity and a data processing inequality, by V. Anantharam, A. Gohari, S. Kamath, and C. Nair, 2014 IEEE International Symposium on Information Theory, pp. 3022–3026, IEEE, 2014.
The information bottleneck method, by N. Tishby, F. C. Pereira, and W. Bialek, arXiv preprint physics/0004057, 2000.

Practical information

  • General public
  • Free
