IC Colloquium : Bayesian methodologies for efficient data analysis

Event details
Date | 01.02.2016
Hour | 10:15 – 11:30
Category | Conferences - Seminars
By : Mijung Park - University of Amsterdam
IC Faculty candidate
Abstract :
Machine learning and data science can greatly benefit from Bayesian methodologies, not only because they improve generalisation performance compared to point estimates that are prone to overfitting, but also because they provide efficient and principled ways to solve a broad range of statistical problems. In this talk, I will describe several concrete examples where Bayesian approaches greatly help in tackling problems that occur in many areas of science. These examples include (a) designing priors using domain knowledge for structurally sparse high-dimensional parameters, with applications to functional neuroimaging data and neural spike data; (b) Bayesian manifold learning, which enables evaluating the quality of the estimated latent manifold as well as learning the latent dimension from statistical evidence; and (c) approximate Bayesian computation (ABC) for models with intractable likelihoods, where we employ kernel mean embeddings to measure data similarities, an essential step in ABC.
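As a rough illustration of point (c), the sketch below shows a generic rejection-ABC loop in which the distance between observed and simulated data is an empirical maximum mean discrepancy (a kernel mean embedding based similarity) computed with a Gaussian kernel. This is not the speaker's specific algorithm; the toy model, kernel bandwidth, tolerance, and function names are hypothetical choices made purely for illustration.
```python
# Minimal, generic sketch of rejection ABC with a kernel-based discrepancy
# (empirical MMD with a Gaussian kernel). Illustrative only; not the
# speaker's method, and all settings below are arbitrary.
import numpy as np

def gaussian_kernel(x, y, bandwidth=1.0):
    # Pairwise k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2)); x: (n, d), y: (m, d)
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def mmd2(x, y, bandwidth=1.0):
    # Biased estimate of the squared MMD between the two samples
    kxx = gaussian_kernel(x, x, bandwidth).mean()
    kyy = gaussian_kernel(y, y, bandwidth).mean()
    kxy = gaussian_kernel(x, y, bandwidth).mean()
    return kxx + kyy - 2.0 * kxy

def rejection_abc(observed, simulate, sample_prior, n_draws=2000, tol=0.05):
    # Keep prior draws whose simulated data fall within `tol` of the
    # observations under the kernel-based discrepancy.
    accepted = []
    for _ in range(n_draws):
        theta = sample_prior()
        fake = simulate(theta)
        if mmd2(observed, fake) < tol:
            accepted.append(theta)
    return np.array(accepted)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy stand-in for an intractable-likelihood model: infer a Gaussian mean
    true_mean = 1.5
    observed = rng.normal(true_mean, 1.0, size=(100, 1))
    simulate = lambda th: rng.normal(th, 1.0, size=(100, 1))
    sample_prior = lambda: rng.normal(0.0, 3.0)
    draws = rejection_abc(observed, simulate, sample_prior)
    print("accepted:", len(draws), "approximate posterior mean:", draws.mean())
```
In this sketch the kernel-based discrepancy replaces the usual hand-crafted summary statistics, which is the general idea behind using kernel mean embeddings to compare datasets within ABC.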
Bio :
Mijung Park completed her PhD in the Department of Electrical and Computer Engineering at The University of Texas at Austin, under the supervision of Prof. Jonathan Pillow and Prof. Al Bovik. She was a postdoctoral research fellow working with Prof. Maneesh Sahani at the Gatsby Computational Neuroscience Unit, University College London. Currently, she is a postdoctoral research fellow working with Prof. Max Welling in the Informatics Institute at the University of Amsterdam.
Practical information
- General public
- Free
- This event is internal
Contact
- Host : Katerina Argyraki