IC Colloquium: Kernel distances for distinguishing and sampling from probability distributions


Event details

Date 18.02.2019
Hour 10:15 - 11:15
Location
Category Conferences - Seminars
By: Dougal Sutherland - University College London
IC Faculty candidate

Abstract:
Probability distributions are the core object of statistical machine learning, and one of the core quantities we can consider is the distance between them. In this talk, we consider using these distances for two important tasks, and show how to design distances that are useful for each task. First, we study the problem of two-sample testing, where we wish to determine whether two datasets meaningfully differ and, if so, how they differ. We then apply this framework to training generative models, such as generative adversarial networks (GANs), which learn to sample from complex distributions such as those of natural images.
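
To make the notion of a kernel distance concrete: the quantity underlying such kernel two-sample tests is the maximum mean discrepancy (MMD). Below is a minimal sketch of its standard unbiased estimator, assuming a Gaussian kernel with a fixed bandwidth; the talk's actual kernel choices and test statistics may differ.

import numpy as np

def gaussian_kernel(X, Y, bandwidth=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    sq_dists = (
        np.sum(X**2, axis=1)[:, None]
        + np.sum(Y**2, axis=1)[None, :]
        - 2 * X @ Y.T
    )
    return np.exp(-sq_dists / (2 * bandwidth**2))

def mmd2_unbiased(X, Y, bandwidth=1.0):
    # Unbiased estimate of the squared MMD between the distributions
    # that generated the samples X (n x d) and Y (m x d).
    K_XX = gaussian_kernel(X, X, bandwidth)
    K_YY = gaussian_kernel(Y, Y, bandwidth)
    K_XY = gaussian_kernel(X, Y, bandwidth)
    n, m = X.shape[0], Y.shape[0]
    # Drop diagonal terms so the within-sample averages are unbiased.
    term_xx = (K_XX.sum() - np.trace(K_XX)) / (n * (n - 1))
    term_yy = (K_YY.sum() - np.trace(K_YY)) / (m * (m - 1))
    return term_xx + term_yy - 2 * K_XY.mean()

A two-sample test then compares this statistic against its distribution under the null hypothesis that both samples come from the same distribution, for instance via a permutation test.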
 
The distances used are defined in terms of kernels, but we parameterize these kernels as deep networks for flexibility. This combination gives both theoretical and practical benefits over staying purely in either framework, and we obtain state-of-the-art results for unsupervised image generation on CelebA and ImageNet with our novel Scaled MMD GAN.
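
As a rough sketch of what parameterizing a kernel as a deep network can look like, one common construction applies a base kernel to learned features, k(x, y) = k_base(phi(x), phi(y)). The small fully connected network and Gaussian base kernel below are illustrative assumptions only, not the architecture or training objective of the Scaled MMD GAN.

import torch
import torch.nn as nn

class DeepKernelFeatures(nn.Module):
    # Illustrative feature network phi; real applications (e.g. images)
    # would use a convolutional architecture instead.
    def __init__(self, in_dim, feat_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, feat_dim),
        )

    def forward(self, x):
        return self.net(x)

def deep_kernel(phi, X, Y, bandwidth=1.0):
    # "Deep kernel": a Gaussian base kernel applied to learned features,
    # k(x, y) = exp(-||phi(x) - phi(y)||^2 / (2 * bandwidth^2)).
    fX, fY = phi(X), phi(Y)
    sq_dists = torch.cdist(fX, fY) ** 2
    return torch.exp(-sq_dists / (2 * bandwidth**2))

In an MMD GAN, the feature network plays the role of the critic: it is trained to make the kernel distance between real and generated samples large, while the generator is trained to make it small.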

Bio:
Dougal Sutherland is a postdoctoral researcher at the Gatsby Computational Neuroscience Unit, University College London, working with Arthur Gretton. He received his PhD in 2016 from Carnegie Mellon University, advised by Jeff Schneider. His research focuses on problems of learning about distributions from samples, including training implicit generative models, density estimation, two-sample testing, and distribution regression. His work combines kernel frameworks with deep learning, and aims for theoretical grounding of practical results.


Practical information

  • General public
  • Free
  • This event is internal

Contact

  • Host: Martin Jaggi
