Talk by Dr Alessandro Rudi (INRIA and École Normale Supérieure)

Event details

Date 22.11.2019
Hour 11:15-13:00
Speaker Dr Alessandro Rudi
Location
Category Conferences - Seminars
Title:
Scaling up optimal kernel methods for large scale machine learning

Abstract:
Kernel methods provide a principled way to perform nonlinear, nonparametric learning. They rely on solid functional-analytic foundations and enjoy optimal statistical properties. However, at least in their basic form, they have limited applicability in large-scale scenarios because of stringent computational requirements in terms of time and especially memory. In this talk we take a substantial step in scaling up kernel methods by analyzing novel algorithmic techniques that allow millions of points to be processed efficiently. The algorithms are derived by combining several algorithmic principles, namely stochastic subsampling, iterative solvers, and preconditioning. Our theoretical analysis shows that optimal statistical accuracy is achieved while requiring essentially O(n) memory and O(n sqrt(n)) time. An extensive experimental analysis on large-scale datasets shows that, even on a single machine, the analyzed approach outperforms previous state-of-the-art solutions that exploit parallel/distributed architectures.
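To make the ingredients in the abstract concrete, below is a minimal sketch (not the speaker's implementation) of how stochastic subsampling and an iterative solver can be combined for kernel ridge regression; preconditioning is omitted for brevity, and all function names, parameters, and constants are illustrative assumptions.

    # Minimal Nystrom kernel ridge regression sketch: subsample m landmark points,
    # then solve the reduced linear system with conjugate gradient (an iterative solver).
    import numpy as np
    from scipy.sparse.linalg import cg

    def gaussian_kernel(A, B, sigma=1.0):
        # Pairwise Gaussian (RBF) kernel between rows of A and rows of B.
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-d2 / (2 * sigma**2))

    def nystrom_krr(X, y, m=500, lam=1e-3, sigma=1.0, seed=0):
        # Nystrom approximation: keep only m randomly subsampled landmark points,
        # so memory is O(n*m) instead of O(n^2).
        n = X.shape[0]
        rng = np.random.default_rng(seed)
        idx = rng.choice(n, size=min(m, n), replace=False)
        Xm = X[idx]
        Knm = gaussian_kernel(X, Xm, sigma)    # n x m
        Kmm = gaussian_kernel(Xm, Xm, sigma)   # m x m
        # Normal equations for the Nystrom estimator:
        # (Knm^T Knm + n * lam * Kmm) alpha = Knm^T y
        A = Knm.T @ Knm + n * lam * Kmm
        b = Knm.T @ y
        # Solve iteratively with conjugate gradient instead of a direct factorization.
        alpha, _ = cg(A, b)
        return Xm, alpha, sigma

    def predict(Xm, alpha, sigma, Xtest):
        return gaussian_kernel(Xtest, Xm, sigma) @ alpha

    # Illustrative usage on synthetic data.
    X = np.random.default_rng(1).normal(size=(2000, 5))
    y = np.sin(X[:, 0]) + 0.1 * np.random.default_rng(2).normal(size=2000)
    Xm, alpha, sigma = nystrom_krr(X, y, m=200, lam=1e-4, sigma=1.5)
    yhat = predict(Xm, alpha, sigma, X)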

Bio:
Alessandro Rudi is a researcher at INRIA and École Normale Supérieure, Paris. He received his PhD in 2014 from the University of Genova, after being a visiting student at the Center for Biological and Computational Learning at MIT. From 2014 to 2017 he was a postdoctoral fellow at the Laboratory for Computational and Statistical Learning of the Italian Institute of Technology and the University of Genova.

Practical information

  • Expert
  • Registration required

Organizer

  • Professor Volkan Cevher

Contact

  • Gosia Baltaian
