OSGA: A fast subgradient algorithm with optimal complexity

Arnold Neumaier

Publications: Contribution to journal › Article › Peer Reviewed

Abstract

This paper presents an algorithm for approximately minimizing a convex function over simple (not necessarily bounded) convex, finite-dimensional domains, assuming only that function values and subgradients are available. No global information about the objective function is needed apart from a strong convexity parameter (which can be set to zero if only convexity is known). The worst-case number of iterations needed to achieve a given accuracy is independent of the dimension and, apart from a constant factor, best possible under a variety of smoothness assumptions on the objective function.
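To make the black-box setting of the abstract concrete: the algorithm may query only function values f(x) and one subgradient g ∈ ∂f(x) per point. The sketch below is a minimal classical projected subgradient method in Python under that same oracle model. It is illustrative only — it is not the OSGA algorithm, whose construction and optimal complexity analysis are the subject of the paper — and all names in it are hypothetical.

```python
# Illustrative sketch only: a generic projected subgradient method under the
# oracle model described in the abstract (function values and subgradients
# only). This is NOT OSGA; all names here are hypothetical.
import numpy as np

def subgradient_method(f, subgrad, project, x0, iterations=1000):
    """Approximately minimize a convex f given only f(x) and a subgradient.

    f       : callable returning the function value f(x)
    subgrad : callable returning any subgradient g in the subdifferential at x
    project : projection onto the simple (possibly unbounded) convex domain
    x0      : starting point inside the domain
    """
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(1, iterations + 1):
        g = subgrad(x)
        step = 1.0 / np.sqrt(k)              # classical O(1/sqrt(k)) step size
        x = project(x - step * g)
        fx = f(x)
        if fx < best_f:                      # subgradient steps need not descend,
            best_x, best_f = x.copy(), fx    # so track the best iterate seen
    return best_x, best_f

# Example: minimize |x - 3| over x >= 0, using the subgradient sign(x - 3)
if __name__ == "__main__":
    x_opt, f_opt = subgradient_method(
        f=lambda x: abs(x[0] - 3.0),
        subgrad=lambda x: np.array([np.sign(x[0] - 3.0)]),
        project=lambda x: np.maximum(x, 0.0),
        x0=[0.0],
    )
    print(x_opt, f_opt)  # approaches x = 3, f = 0
```

This plain scheme attains only the classical O(1/ε²) nonsmooth worst-case iteration count; the abstract's claim is that OSGA instead matches the dimension-independent, best-possible rates (up to a constant factor) across a range of smoothness assumptions.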
Original language: English
Pages (from-to): 1-21
Number of pages: 21
Journal: Mathematical Programming
Volume: 158
Issue number: 1-2
Early online date: 16 May 2015
Publication status: Published - Jul 2016

Austrian Fields of Science 2012

  • 101016 Optimisation

Keywords

  • Complexity bound
  • Convex optimization
  • Optimal subgradient method
  • Large-scale optimization
  • Nesterov’s optimal method
  • Nonsmooth optimization
  • Optimal first-order method
  • Smooth optimization
  • Strongly convex
