
  Thursday 28 February 2019 10:15 - 11:15 BC 420

IC Colloquium: The Adaptive Complexity of Submodular Optimization: Exponentially Faster Algorithms for Machine Learning and Beyond

By: Eric Balkanski - Harvard University
IC Faculty candidate

Many methods in optimization are inherently sequential and consequently cannot be efficiently parallelized. In this talk, I will describe a novel approach called adaptive sampling that yields algorithms whose parallel running time is exponentially faster than that of any previous algorithm for submodular optimization. Maximizing a submodular function is the algorithmic engine behind a growing number of machine learning applications, such as speech and document summarization, recommendation systems, clustering, feature selection, and network analysis.
The speedups are in the adaptive complexity model, which is an information-theoretic abstraction for parallelism. The concept of adaptivity plays an important role in a broader agenda of understanding the limitations and possibilities of data-driven decision making. In this talk, I will first introduce adaptivity, then describe the adaptive sampling algorithms, and finally present experimental results from various application domains.
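To illustrate why classical submodular maximization is "inherently sequential", here is a minimal sketch of the standard greedy algorithm on a max-coverage instance (a canonical submodular objective). Every one of the k iterations queries the function based on the outcome of the previous iteration, so greedy requires k adaptive rounds; the talk's adaptive sampling approach reduces the number of such rounds exponentially. The function names and data below are illustrative, not from the talk.

```python
def coverage(sets, chosen):
    """Submodular objective: number of ground elements covered by the chosen sets."""
    covered = set()
    for i in chosen:
        covered |= sets[i]
    return len(covered)

def greedy_max_coverage(sets, k):
    """Classic greedy: k sequential rounds, each depending on the previous choice.

    Each round is one round of adaptivity in the adaptive complexity model.
    """
    chosen = []
    for _ in range(k):
        best_index, best_gain = None, -1
        base = coverage(sets, chosen)
        for i in range(len(sets)):
            if i in chosen:
                continue
            gain = coverage(sets, chosen + [i]) - base  # marginal gain of set i
            if gain > best_gain:
                best_index, best_gain = i, gain
        chosen.append(best_index)
    return chosen

# Tiny illustrative instance.
sets = [{1, 2, 3}, {3, 4, 5}, {6}]
print(greedy_max_coverage(sets, 2))  # picks the two sets with largest marginal gains
```

An adaptive-sampling algorithm would instead evaluate the marginal gains of many candidate sets in parallel within each round, achieving the same approximation guarantee in far fewer adaptive rounds.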

Eric is a fifth-year PhD student in Computer Science at Harvard University, where he is advised by Yaron Singer. Before his PhD, he received his B.S. from Carnegie Mellon University. He is the recipient of a Google PhD Fellowship, a Smith Family Graduate Science and Engineering Fellowship, and a Best Paper Award at the 18th Conference on Implementation and Application of Automata in 2013, and is also an Andrew Carnegie Society Scholar. His research interests include machine learning, optimization, algorithms, networks, and mechanism design.


Contact Host: Ola Svensson

Accessibility: General public

Admittance: Free

This event is internal