BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:About $f$-divergences\, the Data Processing Inequality and their i
 mplications for statistics
DTSTART:20200213T140000
DTEND:20200213T160000
DTSTAMP:20260406T185423Z
UID:311bb911daa345d1ada871eae9de57f452b0a3c90889bf932acfef1e
CATEGORIES:Conferences - Seminars
DESCRIPTION:Pierre Quinton\nEDIC candidacy exam\nExam president: Prof
 . Michael Gastpar\nThesis advisor: Prof. Emre Telatar\nCo-examiner: D
 r. Nicolas Macris\n\nAbstract\nIt is well known that some informatio
 n measures satisfy the Data Processing Inequality (DPI)\, meaning tha
 t processing random variables through a Markov kernel cannot increas
 e information. Typically\, estimation is carried out under the assump
 tion that the phenomenon\, the observation and the estimator form a M
 arkov chain. This can be relaxed by instead assuming that the informat
 ion between the phenomenon and the estimator is lower than the informa
 tion between the phenomenon and the observation. The optimization the
 n depends on the observation only through its mutual information wit
 h the phenomenon. This makes estimation easier\, but the solution to t
 he minimization under an information constraint is not technically a
 n estimator\, which creates a gap in distortion. Sometimes one can re
 duce or even close this gap by choosing the right information measure
 . In this regard\, we restrict our attention to a specific class of i
 nformation measures known as $f$-divergences. In cases where no infor
 mation measure closes the gap\, another approach is to explore the ti
 ghtness of the DPI associated with some chosen measure. In turn\, thi
 s is related to analyzing the contraction parameter of what is known i
 n the literature as the Strong DPI. This document addresses three wor
 ks on these topics: the first concerns relationships among $f$-diverg
 ences\, the second explores the strong data processing inequality\, a
 nd the last shows a way of doing estimation under mutual information c
 onstraints.\n\nBackground papers\nOn pairs of f-divergences and thei
 r joint range\, by P. Harremoës and I. Vajda\, arXiv preprint arXiv:1
 007.0097\, 2010.\nOn hypercontractivity and a data processing inequal
 ity\, by V. Anantharam\, A. Gohari\, S. Kamath\, and C. Nair\, 2014 I
 EEE International Symposium on Information Theory\, pp. 3022–3026\, I
 EEE\, 2014.\nThe information bottleneck method\, by N. Tishby\, F. C
 . Pereira\, and W. Bialek\, arXiv preprint physics/0004057\, 2000.
LOCATION:INR 113 https://plan.epfl.ch/?room=INR%20113
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
