Inaugural Lectures - Prof. Martin Jaggi and Prof. Michael Kapralov

Event details

Date 24.05.2023
Hour 17:00-18:30
Speaker Prof. Martin Jaggi, Prof. Michael Kapralov
Location SG1
Category Inaugural lectures - Honorary Lecture
Event Language English
Date: 24 May 2023

Program: 
  • 17:00-17:05: Introduction by Prof. Rüdiger Urbanke, Dean of the IC School
  • 17:05-17:35: Inaugural Lecture Prof. Martin Jaggi
  • 17:35-17:45: Q & A
  • 17:45-17:50: Introduction by Prof. Rüdiger Urbanke, Dean of the IC School
  • 17:50-18:20: Inaugural Lecture Prof. Michael Kapralov
  • 18:20-18:30: Q & A
  • 18:30-20:00: Apéritif in the hall outside SG1 (bottom floor)
Location: SG1

Registration: Click here

****************************************************************

Prof. Martin Jaggi

A Brief Journey through Machine Learning and Artificial Intelligence

Abstract
Algorithms that learn from data are fascinating and have become increasingly capable, not only since the advent of ChatGPT and image generation. We will discuss how such machine learning algorithms work and how this research topic has evolved over recent years. When learning from increasingly large datasets, efficient training becomes difficult, and the use of personal data poses privacy risks. New and improved algorithms are required to address these challenges, which is the focus of our research group at EPFL.
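
As a purely illustrative sketch of what "learning from data" means in practice, the following Python snippet fits a simple linear model by gradient descent. The function name, synthetic data, and hyperparameters are invented for this example and are not taken from the speaker's work.

    import random

    def train_linear_model(data, lr=0.05, epochs=500):
        """Fit y ~ w*x + b by minimizing mean squared error with gradient descent."""
        w, b = 0.0, 0.0
        n = len(data)
        for _ in range(epochs):
            # Average gradients of the squared error over the dataset.
            grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / n
            grad_b = sum(2 * (w * x + b - y) for x, y in data) / n
            w -= lr * grad_w
            b -= lr * grad_b
        return w, b

    # Synthetic data generated from y = 3x + 1 plus a little noise.
    data = [(i / 10, 3 * (i / 10) + 1 + random.gauss(0, 0.1)) for i in range(50)]
    w, b = train_linear_model(data)
    print(f"learned w={w:.2f}, b={b:.2f}")  # should be close to w=3, b=1
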
 
About the speaker
Martin Jaggi is an Associate Professor at EPFL, heading the Machine Learning and Optimization Laboratory. Before joining EPFL, he was a post-doctoral researcher at ETH Zurich, at the Simons Institute in Berkeley, and at École Polytechnique in Paris. He earned his PhD in Machine Learning and Optimization from ETH Zurich in 2011, and an MSc in Mathematics, also from ETH Zurich. He is a co-founder of EPFL's Applied Machine Learning Days and a Fellow of the European ELLIS network.

****************************************************************

Prof. Michael Kapralov

Sublinear Algorithms

Abstract
As the sizes of modern datasets grow, many classical polynomial-time, and sometimes even linear-time, algorithms become prohibitively expensive: the input is often too large to be stored in the memory of a single compute node, is hard to partition among nodes in a cluster without creating communication bottlenecks, or is very expensive to acquire in the first place. Processing such datasets therefore requires a new set of algorithmic tools for computing with extremely constrained resources. I will talk about sublinear algorithms, a class of algorithms whose resource requirements are substantially smaller than the size of the input they operate on, making them a perfect fit for large-scale data analysis. I will focus on some highlights of our recent results on computing in the sublinear regime, and then discuss some exciting directions for future work.
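
To make the idea concrete, here is a small illustrative Python sketch of a sublinear-space streaming computation: reservoir sampling maintains a uniform random sample of a stream using memory independent of the stream's length, and a statistic is then estimated from that sample. The names and numbers are assumptions chosen for this example, not taken from the speaker's results.

    import random

    def reservoir_sample(stream, k):
        """One-pass uniform sample of k items from a stream of unknown length."""
        sample = []
        for i, item in enumerate(stream):
            if i < k:
                sample.append(item)
            else:
                j = random.randint(0, i)  # item i+1 replaces an entry with probability k/(i+1)
                if j < k:
                    sample[j] = item
        return sample

    # Estimate the mean of a long stream while storing only k = 1,000 items.
    stream = (x * x % 1000 for x in range(10_000_000))
    sample = reservoir_sample(stream, k=1000)
    print("estimated mean:", sum(sample) / len(sample))
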

About the speaker
Michael Kapralov is an Associate Professor at EPFL’s School of Computer and Communication Sciences. He completed his PhD at Stanford, then spent two years as a postdoc at MIT and a year at IBM as a Goldstine Postdoctoral Fellow. Michael is broadly interested in theoretical computer science, with an emphasis on the theoretical foundations of big data analysis. Most of his algorithmic work is in sublinear algorithms, where specific directions include streaming, sketching, sparse recovery, and Fourier sampling.

Practical information

  • Informed public
  • Registration required

Tags

inaugural lecture Martin Jaggi Michael Kapralov computer science IC School