Coordinate descent algorithms for convex optimization

Event details

Date 09.08.2017
Hour 10:00-12:00
Speaker Sai Praneeth Karimireddy 
Category Conferences - Seminars

EDIC candidacy exam
Exam president: Prof. Volkan Cevher
Thesis advisor: Prof. Martin Jaggi
Co-examiner: Prof. Daniel Kuhn

Abstract
Coordinate descent algorithms are the workhorse of convex optimization in practice: they are easy to implement, fast, and hard to beat. We aim to design even faster coordinate descent algorithms that take geometry, communication costs, and related factors into account.
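As context for the abstract, a minimal sketch of randomized coordinate descent with exact coordinate minimization on a convex quadratic; the function names and the small test problem are illustrative, not taken from the talk or the background papers:

```python
import random

def coordinate_descent(A, b, iters=2000, seed=0):
    """Minimize f(x) = 1/2 x^T A x - b^T x for symmetric positive definite A
    by repeatedly picking a coordinate at random and minimizing f exactly
    along that coordinate (illustrative sketch, pure-Python)."""
    n = len(b)
    x = [0.0] * n
    rng = random.Random(seed)
    for _ in range(iters):
        i = rng.randrange(n)  # sample a coordinate uniformly at random
        # i-th partial derivative: (A x)_i - b_i
        grad_i = sum(A[i][j] * x[j] for j in range(n)) - b[i]
        # exact line search along e_i for a quadratic: step = grad_i / A_ii
        x[i] -= grad_i / A[i][i]
    return x

# Example: the minimizer of f solves A x = b for this small SPD system.
A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = coordinate_descent(A, b)
```

Each update touches a single coordinate and costs one row-vector product, which is why such methods scale to the huge problem sizes discussed in the background papers.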

Background papers
Efficiency of coordinate descent methods on huge-scale optimization problems, by Nesterov, Yurii - SIAM Journal on Optimization 22.2 (2012): 341-362.
SDNA: Stochastic Dual Newton Ascent for Empirical Risk Minimization, by Qu, Zheng, et al. - Proceedings of The 33rd International Conference on Machine Learning, 2016.
Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems, by Nesterov, Yurii, and Stich, Sebastian U. - SIAM Journal on Optimization 27.1 (2017): 110-123.

Practical information

  • General public
  • Free
