Optimal Sampling for Approximate Gradient Descent

Event details

Date 17.04.2024
Hour 16:15–17:00
Speaker Philipp Trunschke, Centrale Nantes & Nantes Université.
Location Online
Category Conferences - Seminars
Event Language English

Title: Optimal sampling for stochastic gradient descent
Abstract: Approximating high-dimensional functions often requires optimising a loss functional that can be represented as an expected value. When computing this expectation is infeasible, a common approach is to replace the exact loss with a Monte Carlo estimate before employing a standard gradient descent scheme. This results in the well-known stochastic gradient descent method. However, using an estimated loss instead of the true loss can result in a "generalisation error". Rigorous bounds for this error usually require strong compactness and Lipschitz continuity assumptions while providing only a very slow decay with increasing sample size. This slow decay is unfavourable in settings where high accuracy is required or sample creation is costly. To address this issue, we propose a new approach that involves empirically (quasi-)projecting the gradient of the true loss onto local linearisations of the model class through an optimal weighted least squares method. The resulting optimisation scheme converges almost surely to a stationary point of the true loss, and we investigate its convergence rate.
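The following is a minimal, self-contained sketch (in Python/NumPy) of the general idea behind the scheme described in the abstract, under strong simplifying assumptions: the model class is linear, so its "local linearisation" is the class itself, and the optimal sampling density is taken to be the classical inverse Christoffel function of an orthonormal basis. Every name, constant, and recipe below is an illustrative assumption, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target function and reference measure rho = Uniform(-1, 1).
# (Illustrative choices, not from the talk.)
def f(x):
    return np.exp(x) * np.sin(3.0 * x)

# Linear model class: span of the first m Legendre polynomials,
# rescaled to be orthonormal with respect to rho.
m = 6
def basis(x):
    B = np.polynomial.legendre.legvander(x, m - 1)   # shape (len(x), m)
    return B * np.sqrt(2.0 * np.arange(m) + 1.0)

def inv_christoffel(x):
    # k(x) = (1/m) * sum_j b_j(x)^2: the classical optimal sampling
    # density relative to rho for this basis (bounded above by m).
    return (basis(x) ** 2).sum(axis=1) / m

def sample_optimal(n):
    # Rejection sampling from the optimal measure k(x) rho(dx).
    out = np.empty(0)
    while out.size < n:
        x = rng.uniform(-1.0, 1.0, size=4 * n)
        accept = rng.uniform(0.0, m, size=4 * n) < inv_christoffel(x)
        out = np.concatenate([out, x[accept]])
    return out[:n]

# Descent on L(theta) = E_rho[(v_theta - f)^2] / 2, v_theta = basis @ theta.
# Each step estimates the true gradient by an optimal weighted
# least-squares projection onto the model class.
theta = np.zeros(m)
n_samples, step = 40, 1.0
for _ in range(50):
    x = sample_optimal(n_samples)
    w = 1.0 / inv_christoffel(x)               # optimal weights w = 1/k
    B = basis(x)
    r = B @ theta - f(x)                       # pointwise residual
    G = (B * w[:, None]).T @ B / n_samples     # empirical Gram, close to I
    g = (B * w[:, None]).T @ r / n_samples     # estimates grad L = E_rho[r b]
    theta -= step * np.linalg.solve(G, g)

xs = np.linspace(-1.0, 1.0, 2001)
print("L2(rho) error:", np.sqrt(np.mean((basis(xs) @ theta - f(xs)) ** 2)))
```

For this linear class, a full step of size one lands exactly on the empirical weighted least-squares fit of f, so the sketch converges almost immediately; the talk concerns the harder nonlinear case, where each projection only uses a local linearisation of the model class and the convergence rate of the iteration is the object of study.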

The speaker of this talk is Philipp Trunschke (Centrale Nantes & Nantes Université).

Philipp Trunschke studied Mathematics at the Humboldt University of Berlin, specialising in statistical learning theory.
He completed his doctoral studies, focusing on tensor product approximation, at the Technical University of Berlin in 2018.
Currently, he is working with Anthony Nouy in Nantes on compositional function networks and optimal sampling.

This online talk is part of the EPFL SIAM Student Chapter's seminar series.

The online seminar will also be screened in room CM 1 517.

Practical information

  • General public
  • Free
  • This event is internal

Organizer

  • EPFL SIAM Student Chapter

Tags

  • Optimal sampling
  • Optimization on manifolds
  • Active learning
