Recent Developments in Linear Regression
By Prof. Martin Jaggi (EPFL, IC, INFCOM, MLO)
We will give an overview of recent advances in optimization algorithms for training generalized linear models (regression, classification). While stochastic gradient descent is the main workhorse algorithm in machine learning, its lesser-known cousin, coordinate descent, also has many interesting properties. After a brief introduction, we will discuss distributed algorithm variants as well as faster convergence by means of (constant or adaptive) importance sampling.
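To illustrate the kind of method discussed (not taken from the talk itself), here is a minimal sketch of cyclic coordinate descent for ridge regression, one of the simplest generalized linear models. Each step minimizes the objective exactly along a single coordinate while keeping the others fixed; the function name and all parameters are hypothetical choices for this example.

```python
import numpy as np

def coordinate_descent_ridge(X, y, lam=0.1, n_epochs=500):
    """Illustrative cyclic coordinate descent for
    min_w ||Xw - y||^2 + lam * ||w||^2."""
    n, d = X.shape
    w = np.zeros(d)
    residual = y - X @ w          # maintained incrementally
    col_sq = (X ** 2).sum(axis=0) # precomputed ||X[:, j]||^2
    for _ in range(n_epochs):
        for j in range(d):
            # Exact 1-D minimizer in coordinate j, others fixed:
            rho = X[:, j] @ residual + col_sq[j] * w[j]
            w_new = rho / (col_sq[j] + lam)
            residual += X[:, j] * (w[j] - w_new)  # cheap residual update
            w[j] = w_new
    return w

# Small synthetic check against the closed-form ridge solution
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + 0.01 * rng.standard_normal(50)
w = coordinate_descent_ridge(X, y, lam=0.1)
w_star = np.linalg.solve(X.T @ X + 0.1 * np.eye(5), X.T @ y)
print(np.max(np.abs(w - w_star)))  # should be very small
```

Maintaining the residual incrementally keeps each coordinate update at cost O(n) rather than recomputing the full prediction, which is one reason coordinate methods are attractive for sparse or high-dimensional problems. Importance sampling, mentioned in the abstract, would replace the cyclic sweep with a randomized choice of coordinate j weighted, e.g., by `col_sq[j]`.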
Organization Prof. Daniel Kressner
Contact Prof. Daniel Kressner
Accessibility Informed public