Generalization in Neural Networks

Event details

Date 29.08.2019
Hour 10:30–12:30
Speaker Berfin Simsek
Category Conferences - Seminars
EDIC candidacy exam
Exam president: Prof. Ruediger Urbanke
Thesis advisor: Prof. Clement Hongler
Thesis co-advisor: Prof. Wulfram Gerstner
Co-examiner: Prof. Matthieu Wyart

Abstract
Neural networks are found to violate the classical bias-variance trade-off: increasing the network size yields better generalization performance. As standard tools from statistical learning theory fail to explain this trend, there has been a surge of interest in finding a capacity measure that estimates generalization performance. We will present three works that search for a valid capacity measure for neural networks using different techniques, and discuss the strengths and weaknesses of each approach. In the last section, we will point to underexplored directions.

Background papers
Exploring Generalization in Deep Learning, by Neyshabur, B., et al.
Fine-Grained Analysis of Optimization and Generalization for Overparameterized Two-Layer Neural Networks, by Arora, S., et al.
Spectrally-normalized margin bounds for neural networks, by Bartlett, P., et al.
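As a rough illustration of the capacity measures studied in these papers, the bound of Bartlett et al. scales with the product of the layers' spectral norms. A minimal sketch of that quantity, using hypothetical randomly drawn weights (not from any trained network):

```python
import numpy as np

# Hypothetical weight matrices of a small 3-layer fully connected network
rng = np.random.default_rng(0)
weights = [
    rng.standard_normal((20, 10)),   # layer 1: 10 -> 20
    rng.standard_normal((20, 20)),   # layer 2: 20 -> 20
    rng.standard_normal((1, 20)),    # layer 3: 20 -> 1
]

# Spectral norm = largest singular value of each weight matrix
spectral_norms = [np.linalg.norm(W, 2) for W in weights]

# The spectrally-normalized bound grows with the product of these norms
# (additional margin and width-dependent factors are omitted in this sketch)
capacity_proxy = float(np.prod(spectral_norms))
print(capacity_proxy)
```

This product is only one factor of the full bound; the papers above combine it with margin normalization and per-layer correction terms.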

Practical information

  • General public
  • Free
