Gradient-Based Optimization Algorithms

Event details

Date 04.02.2014
Hour 11:15-12:15
Speaker Prof. Marc Teboulle, Tel-Aviv University
Bio: Marc Teboulle is a Professor at the School of Mathematical Sciences of Tel-Aviv University. He received his D.Sc. from the Technion, Israel Institute of Technology, in 1985. He has held a position as an applied mathematician at the Israel Aircraft Industries, and academic appointments at Dalhousie University and the University of Maryland. He has also held visiting appointments at several institutions, including IMPA (Rio de Janeiro), the University of Montpellier, the University of Paris, the University of Lyon, the University of Michigan, the University of Texas at Austin, the National Sun Yat-sen University (Kaohsiung, Taiwan, Republic of China), and the University of California, Los Angeles.

Teboulle's research interests are in the area of continuous optimization, including theory, algorithmic analysis, and applications. He has published numerous papers and two books, and has given invited lectures at many international conferences. His research has been supported by various funding agencies, including the National Science Foundation, the French-Israeli Ministry of Sciences, the Bi-National Israel-United States Science Foundation, and the Israel Science Foundation. He currently serves as the Area Editor for Continuous Optimization of Mathematics of Operations Research, and on the editorial boards of the European Series in Applied and Industrial Mathematics: Control, Optimisation and Calculus of Variations; the Journal of Optimization Theory and Applications; and Science China Mathematics. From 1999 to 2002, he served as chairman of the Department of Statistics and Operations Research at the School of Mathematical Sciences of Tel-Aviv University.
Category Conferences - Seminars
Optimization methods applied to problems in image processing, machine learning, and other fields give rise to very large-scale, often nonsmooth and even nonconvex optimization models. This leads to challenging theoretical and computational difficulties that rule out most sophisticated algorithms, such as interior-point methods, since the number of arithmetic operations required in a single iteration is already prohibitively large. Elementary first-order methods (using function values and gradient/subgradient information) then often remain our best alternative for tackling such problems.
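As a hypothetical illustration of the kind of first-order method the abstract refers to (not an algorithm from the talk itself), the sketch below runs plain fixed-step gradient descent using only gradient evaluations; the objective and step size are chosen purely for demonstration.

```python
# Minimal sketch of a first-order method: fixed-step gradient descent.
# Each iteration uses only a single gradient evaluation, which is what
# makes such methods viable at very large scale.

def gradient_descent(grad, x0, step=0.1, iters=100):
    """Iterate x <- x - step * grad(x) starting from x0."""
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Illustrative objective: f(x) = (x - 3)^2, with gradient 2 * (x - 3).
x_star = gradient_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
# x_star converges to the minimizer x = 3
```

For smooth convex objectives, a constant step size of at most 1/L (with L the gradient's Lipschitz constant) suffices for convergence; nonsmooth problems replace the gradient with a subgradient and require diminishing steps.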

This talk surveys some of our recent results on the design and analysis of first-order algorithms for various generic optimization models arising in a wide variety of applications, highlighting the ways in which problem structures and data information can be beneficially exploited to devise simple and efficient algorithms.

Practical information

  • General public
  • Free

Organizer

  • Laboratory for Information and Inference Systems (LIONS)

Contact

  • Gosia Baltaian
