Zoltán Szabó - Consistency of Orlicz Random Fourier Features
Event details
| Date | 23.09.2019 |
| Hour | 11:15 – 12:00 |
| Speaker | Zoltán Szabó (http://www.cmap.polytechnique.fr/~zoltan.szabo/) is a Research Associate Professor at the Center for Applied Mathematics (CMAP), École Polytechnique, France. His main research interests are kernel methods, information theory, randomized algorithms and their applications. He has served as an Area Chair at ICML (2019, 2018, 2017), AISTATS (2020, 2019, 2018, 2017), NeurIPS (2018), UAI (2020, 2017, 2016) and IJCAI (2019), and as a Senior Area Chair at NeurIPS (2019); he was the Program Chair of the Data Science Summer School (DS^3-2019, 2018, 2017), and he is the moderator of statistical machine learning (stat.ML) on arXiv. |
| Location | |
| Category | Conferences - Seminars |
Kernel techniques provide highly flexible tools with successful applications in virtually all sub-fields of machine learning and statistics. The random Fourier feature approach (RFF) is probably the most widely applied and popular idea for combining this representational power of kernels with computational efficiency; it won the 10-year test-of-time award at NIPS-2017. While the RFF technique is typically used for tasks expressed via function values (such as kernel ridge regression), in numerous applications taking high-order derivatives into account turns out to be beneficial; examples include nonlinear feature selection or fitting infinite-dimensional exponential family distributions. Despite its practical success, the theoretical understanding of RFFs in the case of derivatives is rather limited. In this talk, I will show how a finite alpha-exponential Orlicz norm assumption allows one to obtain consistent RFF approximations of high-order derivatives, covering for example the popular inverse multiquadric (alpha = 1) or the Gaussian kernel (alpha = 2). This is joint work with Linda Chamakh and Emmanuel Gobet.
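To make the RFF idea concrete, here is a minimal sketch (not from the talk) of the classical construction for the Gaussian kernel: frequencies are drawn from the kernel's spectral measure, and the inner product of the resulting random cosine features approximates the kernel value. The dimensions, bandwidth, and feature count below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d, D, sigma = 5, 5000, 1.0  # input dimension, number of random features, bandwidth

# For the Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2)),
# the spectral measure is N(0, sigma^{-2} I); b gives random phases.
W = rng.normal(scale=1.0 / sigma, size=(D, d))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def rff(x):
    """Random Fourier feature map z(x), so that z(x) @ z(y) ≈ k(x, y)."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x, y = rng.normal(size=d), rng.normal(size=d)
exact = np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))
approx = rff(x) @ rff(y)
```

The approximation error decays at the Monte Carlo rate O(1/sqrt(D)); the talk's question is how such guarantees extend to derivatives of the feature map under Orlicz-norm assumptions on the spectral measure.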
Practical information
- General public
- Free
Organizer
- Martin Jaggi