BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Memento EPFL//
BEGIN:VEVENT
SUMMARY:Promises and Challenges of Machine Learning in Scientific Computin
 g
DTSTART:20210415T161500
DTEND:20210415T171500
DTSTAMP:20260405T113902Z
UID:831f05efdd9ca93b7649ef66d9d0691b7be803143fee42ad0d0c8947
CATEGORIES:Conferences - Seminars
DESCRIPTION:Prof. Éric Vanden-Eijnden (Courant Institute\, NYU)\nThe
  recent success of machine learning suggests that neural networks may
  be capable of approximating high-dimensional functions with
  controllably small errors. As a result\, they could outperform
  standard function interpolation methods that have been the workhorses
  of scientific computing but do not scale well with dimension. In
  support of this prospect\, I will first discuss recent advances on
  the trainability and accuracy of shallow neural networks — these
  offer the simplest instance of nonlinear learning in functional
  spaces that are fundamentally different from classic approximation
  spaces. The dynamics of training in these spaces can be analyzed
  using tools from optimal transport and statistical mechanics\, which
  reveal when and how shallow neural networks can overcome the curse of
  dimensionality and hint at ways to also analyze deep
  architectures.\n\nI will then discuss how high-dimensional scientific
  computing problems once thought intractable can be revisited through
  the lens of these results\, focusing on two applications: the
  solution of partial differential equations in variational form\, and
  the sampling of high-dimensional distributions. In these scientific
  applications\, the main issue to address is one that does not arise
  in traditional supervised learning: how to acquire the data needed to
  estimate the loss and ensure that the trained networks generalize
  well.
LOCATION:https://epfl.zoom.us/j/84030108577?pwd=bHh2Z3J2YllvTWdteHA3MHhVcn
 IyUT09
STATUS:CONFIRMED
END:VEVENT
END:VCALENDAR
