Proximal Envelopes

Event details
Date | 11.12.2015
Hour | 10:15 – 11:15
Speaker | Panos Patrinos
Category | Conferences - Seminars
Quasi-Newton operator splitting methods for solving nonsmooth optimization problems
We show that operator splitting techniques for solving optimization problems, such as Forward-Backward Splitting (FBS), Douglas-Rachford Splitting (DRS) and ADMM, can be interpreted as scaled gradient methods applied to the unconstrained minimization of a continuously differentiable function. Inspired by the connection between the proximal minimization algorithm and the Moreau envelope, we call these functions the Forward-Backward and Douglas-Rachford envelopes. The new interpretation paves the way for devising new algorithms for composite and separable nonsmooth optimization problems, using ideas from Newton-like methods for unconstrained smooth optimization.
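For reference, the following is a standard formulation of these envelopes from the literature; the exact form used in the talk may differ. For a composite problem min_x f(x) + g(x), with f smooth and g admitting an inexpensive proximal mapping, the Moreau envelope of g and the Forward-Backward envelope are

  g^{\gamma}(x) = \min_{z} \Big\{ g(z) + \tfrac{1}{2\gamma}\|z - x\|^2 \Big\},
  \qquad
  \varphi_{\gamma}(x) = f(x) - \tfrac{\gamma}{2}\|\nabla f(x)\|^2 + g^{\gamma}\big(x - \gamma \nabla f(x)\big),

and for a sufficiently small step size \gamma > 0 the minimizers of \varphi_{\gamma} coincide with those of f + g, while one FBS iteration can be read as a scaled gradient step on \varphi_{\gamma}.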
We present applications of the proposed theory:
First, a Forward-Backward Quasi-Newton method with an asymptotically superlinear convergence rate, suitable for medium-scale applications such as Model Predictive Control (a baseline forward-backward iteration is sketched after this list).
Second, Forward-Backward L-BFGS algorithms with complexity guarantees for large-scale nonsmooth optimization problems.
Finally, we derive complexity estimates for DRS and for an accelerated version of DRS and ADMM.
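As background for the items above, here is a minimal Python sketch of the plain forward-backward (proximal gradient) iteration for an l1-regularized least-squares problem. The problem data, step size, and function names are illustrative assumptions, not the talk's implementation; broadly, the quasi-Newton and L-BFGS methods above aim to accelerate this basic scheme by exploiting the Forward-Backward envelope.

import numpy as np

def soft_threshold(z, t):
    # proximal operator of t * ||.||_1 (soft-thresholding)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def forward_backward(A, b, lam, gamma, iters=500):
    # minimize 0.5*||A x - b||^2 + lam*||x||_1 by alternating a gradient
    # (forward) step on the smooth term and a prox (backward) step on the
    # nonsmooth term
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                            # forward step
        x = soft_threshold(x - gamma * grad, gamma * lam)   # backward step
    return x

# Illustrative usage with random data; gamma is chosen below 1/L, L = ||A||_2^2:
# A = np.random.randn(50, 100); b = np.random.randn(50)
# x_hat = forward_backward(A, b, lam=0.1, gamma=1.0 / np.linalg.norm(A, 2) ** 2)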
Bio: Panagiotis (Panos) Patrinos is currently an assistant professor at the Department of Electrical Engineering of KU Leuven. During fall/winter 2014 he held a visiting professor position in the Department of Electrical Engineering at Stanford University. He received his Ph.D. in Control and Optimization, M.Sc. in Applied Mathematics, and M.Eng., all from the National Technical University of Athens. After receiving his Ph.D. he was a postdoctoral fellow at the University of Trento. In spring 2012 he became an assistant professor at the IMT Institute for Advanced Studies Lucca, Italy.
He is the author of more than 40 papers in journals and refereed conference proceedings. His current research interests focus on efficient algorithms and modeling environments for large-scale distributed and embedded optimization, with applications in control of dynamical systems, high-dimensional statistics, machine learning, and data mining. He is also interested in stochastic and risk-averse optimization with applications in the energy and power systems domain.
Practical information
- General public
- Free