Optimal subgradient algorithms for large-scale convex optimization in simple domains

Masoud Ahookhosh (corresponding author), Arnold Neumaier

Publication type: Journal article, peer-reviewed

Abstract

This paper describes two optimal subgradient algorithms for solving structured large-scale constrained convex optimization problems. More specifically, the first algorithm is optimal for smooth problems with Lipschitz continuous gradients and for Lipschitz continuous nonsmooth problems, while the second algorithm is optimal for Lipschitz continuous nonsmooth problems. In addition, we consider two classes of problems: (i) a convex objective over a simple closed convex domain, where the orthogonal projection onto this feasible domain is efficiently available; and (ii) a convex objective with a simple convex functional constraint. If we equip our algorithms with an appropriate prox-function, the associated subproblem can be solved either in closed form or by a simple iterative scheme, which is especially important for large-scale problems. We report numerical results for several applications to demonstrate the efficiency of the proposed schemes.
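To illustrate the projection-based setting in (i), the sketch below implements a generic projected subgradient iteration, not the paper's optimal subgradient scheme itself. The domain is the Euclidean unit ball, one example of a "simple" set whose orthogonal projection is available in closed form; the objective, step-size rule, and iteration count are illustrative assumptions only.

```python
import numpy as np

def project_ball(x, radius=1.0):
    # Euclidean projection onto {x : ||x||_2 <= radius} -- an example of a
    # "simple" domain whose projection has a closed form.
    norm = np.linalg.norm(x)
    return x if norm <= radius else (radius / norm) * x

def projected_subgradient(subgrad, x0, steps=500, radius=1.0):
    # Generic projected subgradient iteration with a 1/sqrt(k) step size;
    # illustrative only, not the optimal method analyzed in the paper.
    x = x0
    for k in range(1, steps + 1):
        g = subgrad(x)
        x = project_ball(x - g / np.sqrt(k), radius)
    return x

# Minimize the nonsmooth objective f(x) = ||x - c||_1 over the unit ball.
c = np.array([2.0, 0.0])
subgrad = lambda x: np.sign(x - c)  # a valid subgradient of the l1 objective
x_star = projected_subgradient(subgrad, np.zeros(2))
# x_star approaches (1, 0), the point of the ball closest to c in l1 distance.
```

For this toy problem the iterates reach the boundary point (1, 0) and stay there, since every subsequent step moves outside the ball and is projected back.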
Original language: English
Pages (from-to): 1071-1097
Number of pages: 27
Journal: Numerical Algorithms
Volume: 76
Issue number: 4
Early online date: 14 March 2017
DOIs
Publication status: Published - Dec. 2017

ÖFOS 2012

  • 101016 Optimization
