Conditions for linear convergence of (stochastic) first order methods


Date and Time: 

Tuesday, October 10, 2017 - 4:00pm

Event Location: 

Mohler Lab, Room 453

For convex optimization problems, deterministic first order methods achieve linear convergence provided that the objective function is smooth (Lipschitz continuous gradient) and strongly convex. Under those same conditions of smoothness and strong convexity, stochastic first order methods attain only sublinear convergence rates. However, in many applications (machine learning, statistics, control, signal processing) the smoothness/strong convexity conditions do not hold, yet the objective function still has special structure (e.g., the composition of a strongly convex function with a linear map). In this talk we replace the smoothness/strong convexity assumptions with several less conservative conditions under which we prove that several (stochastic) first order methods converge linearly. We also provide necessary conditions for linear convergence of the (stochastic) gradient method. Finally, we give examples of functional classes satisfying our new conditions and discuss several applications of these results (the Lasso problem, linear systems, linear programming, convex feasibility, etc.).
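The structured case mentioned in the abstract can be illustrated numerically. Below is a minimal sketch (not taken from the talk; the matrix A, right-hand side b, and iteration count are illustrative choices): gradient descent on a least-squares objective built from a rank-deficient linear map. The objective is the composition of a strongly convex function with a linear map, so it is smooth but not strongly convex, yet the function values still decay geometrically.

```python
import numpy as np

# f(x) = 0.5 * ||A x - b||^2 with rank-deficient A: smooth, convex,
# but NOT strongly convex (A has a nontrivial null space).
# A, b, and the iteration count are illustrative, not from the talk.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])    # rank 2, three variables
b = A @ np.array([1.0, 1.0, 0.0])  # b in range(A), so the optimal value is 0

L = np.linalg.eigvalsh(A.T @ A).max()  # Lipschitz constant of the gradient
x = np.zeros(3)
f_vals = []
for _ in range(100):
    f_vals.append(0.5 * np.linalg.norm(A @ x - b) ** 2)
    x -= (1.0 / L) * (A.T @ (A @ x - b))  # gradient step, step size 1/L

# Despite the lack of strong convexity, f(x_k) - f* shrinks geometrically:
# the successive ratios f_{k+1} / f_k stay bounded below 1.
ratios = [f_vals[k + 1] / f_vals[k] for k in range(5, 15)]
```

In this sketch the linear rate comes from the structure of the problem (a quadratic-growth-type property on the range of A) rather than from strong convexity of f itself, which is the kind of relaxed condition the talk investigates.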

Bio Sketch: 

Ion Necoara received the MSc degree in optimization and statistics from the University of Bucharest, Romania, in 2002 and the PhD degree in applied sciences (cum laude) from Delft University of Technology, The Netherlands, in 2006. From 2007 to 2009 he held a postdoctoral fellowship at the Katholieke Universiteit Leuven, Belgium. He is currently a professor in the Department of Automatic Control and Systems Engineering, University Politehnica Bucharest, Romania. He is the author of numerous research papers in optimization and control, for which he has received several awards: the Best Paper Award from the Journal of Global Optimization (2015), the Romanian Academy Award (2015), and the Excellence in Research Award from Ad Astra (2016). His research interests cover the development of new optimization algorithms, with a focus on structure exploitation, and the application of optimization techniques to the design of advanced algorithms for complex systems. More details can be found on his webpage.