BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Generalization in Neural Networks
DTSTART:20190829T103000
DTEND:20190829T123000
DTSTAMP:20260501T211322Z
UID:882da97557f79ba191dc1e92a5a78f52b3ea2da033f4a396afb66549
CATEGORIES:Conferences - Seminars
DESCRIPTION:Berfin Simsek\nEDIC candidacy exam\nExam president: Prof. Rued
 iger Urbanke\nThesis advisor: Prof. Clement Hongler\nThesis co-advisor: Pr
 of. Wulfram Gerstner\nCo-examiner: Prof. Matthieu Wyart\n\nAbstract\nNeu
 ral networks have been found to violate the bias-variance trade-off: inc
 reasing the network size yields better generalization performance. As st
 andard tools in statistical learning theory fail to explain this general
 ization trend\, there has been a surge of interest in finding a capacity 
 measure that estimates the generalization performance. We will present t
 hree works that search for a valid capacity measure for neural networks 
 using different techniques and discuss the strengths and weaknesses of t
 heir approaches. We will point to underexplored directions in the last s
 ection.\n\nBackground papers\nExploring Generalization in Deep Learning
 \, by Neyshabur\, B.\, et al.\nFine-Grained Analysis of Optimization and 
 Generalization for Overparameterized Two-Layer Neural Networks\, by Aror
 a\, S.\, et al.\nSpectrally-normalized margin bounds for neural networks
 \, by Bartlett\, P.\, et al.
LOCATION:MA A1 12 https://plan.epfl.ch/?room==MA%20A1%2012
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
