BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Information theory\, universal prediction and machine learning
DTSTART:20210708T100000
DTEND:20210708T120000
DTSTAMP:20260510T221555Z
UID:97954ebbfdfcc82424af093898788853d23949c0e5a9123f7542eb85
CATEGORIES:Conferences - Seminars
DESCRIPTION:Marco Bondaschi\nEDIC candidacy exam\nexam president:
  Prof. Michael Kapralov\nthesis advisor: Prof. Michael Gastpar\n
 co-examiner: Prof. Emre Telatar\n\nAbstract\nWe describe the problem of
  universal prediction of sequences under logarithmic loss\, and we review
  some applications of its results to lossless compression and machine
  learning. Furthermore\, we discuss relations between universal
  prediction metrics and well-known information-theoretic quantities such
  as the Rényi divergence and Sibson's alpha-mutual information. We also
  discuss some possible future research directions\, such as the proposal
  of alternative estimators\, as well as alternative regret measures under
  logarithmic loss.\n\nBackground papers\n"The Context-Tree Weighting
  Method: Basic Properties" by F. M. J. Willems\, Y. M. Shtarkov and
  T. J. Tjalkens: https://ieeexplore.ieee.org/document/382012\n"Minimax
  Rényi Redundancy" by S. Yagli\, Y. Altuğ and S. Verdú:
  https://ieeexplore.ieee.org/document/8283574\n"Learning\, compression\,
  and leakage: Minimising classification error via meta-universal
  compression principles" by F. E. Rosas\, P. A. M. Mediano and
  M. Gastpar: https://ieeexplore.ieee.org/document/9457579
LOCATION:
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
