Abstract
This paper presents an algorithm for approximately minimizing a convex function over simple (not necessarily bounded) convex, finite-dimensional domains, assuming only that function values and subgradients are available. No global information about the objective function is needed apart from a strong convexity parameter (which can be set to zero if only convexity is known). The worst-case number of iterations needed to achieve a given accuracy is independent of the dimension and, apart from a constant factor, best possible under a variety of smoothness assumptions on the objective function.
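For orientation, the sketch below shows the classical projected subgradient baseline in the same black-box setting the abstract describes (only function values and subgradients, plus an optional strong convexity parameter). This is not the paper's optimal algorithm; the step-size rules, the bounds `G` (subgradient norm) and `R` (domain radius), and all names are illustrative assumptions.

```python
import numpy as np

def subgradient_method(f, subgrad, project, x0, mu=0.0, G=1.0, R=1.0, iters=1000):
    """Plain projected subgradient baseline (illustrative sketch only;
    NOT the optimal method of the paper).

    f       : callable returning f(x)                 (function values)
    subgrad : callable returning some g in df(x)      (subgradients)
    project : projection onto the simple domain C
    mu      : strong convexity parameter (0 if only convexity is known)
    G, R    : assumed subgradient bound / domain radius (hypothetical)
    """
    x = project(np.asarray(x0, dtype=float))
    best_x, best_f = x.copy(), f(x)
    for k in range(1, iters + 1):
        g = subgrad(x)
        # Classical step sizes: rate O(1/(mu*k)) if mu > 0,
        # rate O(G*R/sqrt(k)) if only convexity is known.
        alpha = 2.0 / (mu * (k + 1)) if mu > 0 else R / (G * np.sqrt(k))
        x = project(x - alpha * g)
        fx = f(x)
        if fx < best_f:  # subgradient steps need not decrease f monotonically
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# Example: minimize the nonsmooth convex function |x - 3| over x >= 0.
x_star, f_star = subgradient_method(
    f=lambda x: abs(x[0] - 3.0),
    subgrad=lambda x: np.array([np.sign(x[0] - 3.0)]),
    project=lambda x: np.maximum(x, 0.0),
    x0=[0.0], G=1.0, R=5.0, iters=2000)
```

The dimension-independent but slower rates of this baseline are exactly what the paper improves on: its method matches the optimal first-order complexity bounds up to a constant factor.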
Original language | English
---|---
Pages (from-to) | 1–21
Number of pages | 21
Journal | Mathematical Programming
Volume | 158
Issue number | 1–2
Early online date | 16 May 2015
DOIs |
Publication status | Published - Jul 2016
Austrian Fields of Science 2012
- 101016 Optimisation
Keywords
- Complexity bound
- Convex optimization
- Optimal subgradient method
- Large-scale optimization
- Nesterov’s optimal method
- Nonsmooth optimization
- Optimal first-order method
- Smooth optimization
- Strongly convex
- Gradient methods
- Sparse
- Minimization
- Convex