An accelerated minimax algorithm for convex-concave saddle point problems with nonsmooth coupling function

Publications: Contribution to journal › Article › Peer Reviewed

Abstract

In this work, we aim to solve a convex-concave saddle point problem, where the convex-concave coupling function is smooth in one variable and nonsmooth in the other, and is not assumed to be linear in either. The problem is augmented by a nonsmooth regulariser in the smooth component. We propose and investigate a novel algorithm under the name of OGAProx, consisting of an optimistic gradient ascent step in the smooth variable coupled with a proximal step of the regulariser, which is alternated with a proximal step in the nonsmooth component of the coupling function. We consider the convex-concave, convex-strongly concave and strongly convex-strongly concave settings of the saddle point problem under investigation. For the iterates we obtain (weak) convergence, a convergence rate of order O(1/K) and linear convergence of order O(θ^K) with θ < 1, respectively. In terms of function values we obtain ergodic convergence rates of order O(1/K), O(1/K^2) and O(θ^K) with θ < 1, respectively. We validate our theoretical considerations on a nonsmooth-linear saddle point problem, the training of multi-kernel support vector machines and a classification problem incorporating minimax group fairness.
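To illustrate the iteration structure described above (optimistic gradient ascent plus a proximal step of the regulariser in the smooth variable, alternated with a proximal step in the nonsmooth component of the coupling function), the following is a minimal Python sketch on a hypothetical toy instance: a saddle point reformulation of the lasso problem. The instance, the constant step sizes and the extrapolation parameter are illustrative assumptions and are not the problems or step-size rules analysed in the paper.

```python
import numpy as np

def soft_threshold(v, lam):
    """Proximal map of lam * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def ogaprox_sketch(A, b, mu=1.0, theta=1.0, iters=1000):
    """OGAProx-style iterations for the illustrative saddle point problem

        min_x max_y  ||x||_1 + <y, A x - b> - (mu/2) ||y||^2,

    whose coupling term Phi(x, y) = ||x||_1 + <y, A x - b> is nonsmooth
    (convex) in x and smooth (concave) in y, while g(y) = (mu/2)||y||^2
    plays the role of the regulariser in the smooth variable.  The constant
    step sizes below follow a primal-dual-style heuristic and are only for
    illustration.
    """
    m, n = A.shape
    L = np.linalg.norm(A, 2)          # operator norm of the linear coupling
    tau = sigma = 0.9 / L             # heuristic: tau * sigma * L**2 < 1
    x, y = np.zeros(n), np.zeros(m)
    grad_prev = A @ x - b             # gradient of Phi(., y) in y at the previous iterate
    for _ in range(iters):
        # Optimistic (extrapolated) gradient ascent step in the smooth
        # variable y, combined with the proximal step of the regulariser g.
        grad = A @ x - b
        y = (y + sigma * ((1 + theta) * grad - theta * grad_prev)) / (1 + sigma * mu)
        grad_prev = grad
        # Proximal step in the nonsmooth component of the coupling function,
        # i.e. prox of tau * ( ||.||_1 + <y, A . - b> ) applied to the current x.
        x = soft_threshold(x - tau * (A.T @ y), tau)
    return x, y

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((30, 60))
    b = rng.standard_normal(30)
    x, y = ogaprox_sketch(A, b)
    # Maximising over y in closed form recovers the equivalent lasso objective.
    print("lasso value:", np.abs(x).sum() + 0.5 * np.sum((A @ x - b) ** 2))
```

In this toy instance the inner maximisation over y can be carried out in closed form, so the printed lasso value gives a simple sanity check that the alternating proximal/optimistic-gradient iterations make progress.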
Original language: English
Pages (from-to): 925-966
Number of pages: 42
Journal: Computational Optimization and Applications
Volume: 86
Issue number: 3
Early online date: 4 Jun 2022
DOIs
Publication status: Published - 2023

Austrian Fields of Science 2012

  • 101016 Optimisation

Keywords

  • Acceleration
  • Convergence rate
  • Convex-concave
  • Linear convergence
  • Minimax algorithm
  • Saddle point problem
