TY - JOUR
T1 - An accelerated minimax algorithm for convex-concave saddle point problems with nonsmooth coupling function
AU - Bot, Radu Ioan
AU - Csetnek, Ernö Robert
AU - Sedlmayer, Michael
PY - 2023
Y1 - 2023
N2 - In this work we aim to solve a convex-concave saddle point problem, where the convex-concave coupling function is smooth in one variable and nonsmooth in the other and not assumed to be linear in either. The problem is augmented by a nonsmooth regulariser in the smooth component. We propose and investigate a novel algorithm under the name of OGAProx, consisting of an optimistic gradient ascent step in the smooth variable coupled with a proximal step of the regulariser, and which is alternated with a proximal step in the nonsmooth component of the coupling function. We consider the situations convex-concave, convex-strongly concave and strongly convex-strongly concave related to the saddle point problem under investigation. Regarding iterates we obtain (weak) convergence, a convergence rate of order O(1/K) and linear convergence like O(θ^K) with θ < 1, respectively. In terms of function values we obtain ergodic convergence rates of order O(1/K), O(1/K^2) and O(θ^K) with θ < 1, respectively. We validate our theoretical considerations on a nonsmooth-linear saddle point problem, the training of multi kernel support vector machines and a classification problem incorporating minimax group fairness.
AB - In this work we aim to solve a convex-concave saddle point problem, where the convex-concave coupling function is smooth in one variable and nonsmooth in the other and not assumed to be linear in either. The problem is augmented by a nonsmooth regulariser in the smooth component. We propose and investigate a novel algorithm under the name of OGAProx, consisting of an optimistic gradient ascent step in the smooth variable coupled with a proximal step of the regulariser, and which is alternated with a proximal step in the nonsmooth component of the coupling function. We consider the situations convex-concave, convex-strongly concave and strongly convex-strongly concave related to the saddle point problem under investigation. Regarding iterates we obtain (weak) convergence, a convergence rate of order O(1/K) and linear convergence like O(θ^K) with θ < 1, respectively. In terms of function values we obtain ergodic convergence rates of order O(1/K), O(1/K^2) and O(θ^K) with θ < 1, respectively. We validate our theoretical considerations on a nonsmooth-linear saddle point problem, the training of multi kernel support vector machines and a classification problem incorporating minimax group fairness.
KW - Acceleration
KW - Convergence rate
KW - Convex-concave
KW - Linear convergence
KW - Minimax algorithm
KW - Saddle point problem
UR - https://arxiv.org/abs/2104.06206
UR - http://www.scopus.com/inward/record.url?scp=85131581793&partnerID=8YFLogxK
U2 - 10.1007/s10589-022-00378-8
DO - 10.1007/s10589-022-00378-8
M3 - Article
SN - 0926-6003
VL - 86
SP - 925
EP - 966
JO - Computational Optimization and Applications
JF - Computational Optimization and Applications
IS - 3
ER -