Robustness, Stability and the pNML method

Event details

Date 19.07.2019
Hour 13:00-15:00
Speaker Andreas Maggiori
Category Conferences - Seminars
EDIC candidacy exam
Exam president: Prof. Michael Kapralov
Thesis advisor: Prof. Ruediger Urbanke
Thesis co-advisor: Prof. Ola Svensson
Co-examiner: Dr. Olivier Lévêque


Abstract
Traditional approaches fail to explain the recent success story of machine learning, and common practices in ML lack a deep theoretical understanding. In this proposal, we investigate the relation between the generalization of a learning algorithm, its robustness, and its stability. Moreover, we present an alternative supervised learning framework whose goal is to obtain worst-case guarantees. The objective is to explore the connections between these notions and to apply similar techniques to further analyze and extend common ML algorithms.

Background papers
Robustness and Generalization, by Huan Xu and Shie Mannor.
Train faster, generalize better: Stability of stochastic gradient descent, by Moritz Hardt, Benjamin Recht, and Yoram Singer.
Universal Supervised Learning for Individual Data, by Yaniv Fogel and Meir Feder.


Practical information

  • General public
  • Free
