Information theory, universal prediction and machine learning

Event details

Date 08.07.2021
Hour 10:00 – 12:00
Speaker Marco Bondaschi
Category Conferences - Seminars
EDIC candidacy exam
Exam president: Prof. Michael Kapralov
Thesis advisor: Prof. Michael Gastpar
Co-examiner: Prof. Emre Telatar

Abstract
We describe the problem of universal prediction of sequences under logarithmic loss, and we review some applications of its results to lossless compression and machine learning. Furthermore, we discuss relations between universal prediction metrics and well-known information-theoretic quantities such as the Rényi divergence and Sibson's α-mutual information. We also discuss possible future research directions, such as the proposal of alternative estimators, as well as alternative regret measures under logarithmic loss.
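
For orientation, the central quantities can be sketched as follows (these are the standard textbook definitions, not formulas taken from the talk itself). Under logarithmic loss, a predictor assigning probability q(x^n) to a sequence x^n incurs cumulative loss −log q(x^n), and the minimax regret against a class of sources \mathcal{P} is

\min_{q} \; \max_{p \in \mathcal{P}} \; \max_{x^n} \; \log \frac{p(x^n)}{q(x^n)} .

The Rényi divergence of order \alpha between distributions P and Q is

D_\alpha(P \| Q) = \frac{1}{\alpha - 1} \log \sum_{x} P(x)^{\alpha} Q(x)^{1 - \alpha} ,

and Sibson's \alpha-mutual information is its minimax counterpart over marginals,

I_\alpha(X; Y) = \min_{Q_Y} D_\alpha\!\left(P_{XY} \,\middle\|\, P_X \times Q_Y\right) .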

Background papers
"The Context-Tree Weighting Method: Basic Properties” by F. M. J. Willems, Y. M. Shtarkov and T. J. Tjalkens: https://ieeexplore.ieee.org/document/382012
 "Minimax Rényi Redundancy” by S. Yagli, Y. Altuğ and S. Verdú: https://ieeexplore.ieee.org/document/8283574
 "Learning, compression, and leakage: Minimising classification error via meta-universal compression principles” by F. E. Rosas, P. A. M. Mediano and M. Gastpar: https://ieeexplore.ieee.org/document/9457579

Practical information

  • General public
  • Free

Tags

EDIC candidacy exam
