Abstract
We consider the minimization of a convex objective function over the set of minima of another convex function, under the assumption that both functions are twice continuously differentiable. We approach this optimization problem from a continuous perspective by means of a second-order dynamical system with Hessian-driven damping and a penalty term corresponding to the constraint function. By constructing appropriate energy functionals, we prove weak convergence of the trajectories generated by this differential equation to a minimizer of the optimization problem, as well as convergence of the objective function values along the trajectories. The analysis relies on Lyapunov arguments in combination with the continuous version of the Opial Lemma. If the objective function is strongly convex, we additionally obtain strong convergence of the trajectories.
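For orientation, a system of this type typically takes the following form for the problem of minimizing $f$ over $\operatorname{argmin} g$. This is only a notational sketch: the damping parameters $\gamma, \beta$, the penalization function $\lambda(\cdot)$, and the conditions imposed on them are placeholders and may differ from the exact system analysed in the paper.

```latex
% Notational sketch only: a prototypical second-order dynamical system with
% Hessian-driven damping and a penalty term for  min f(x)  s.t.  x in argmin g.
% The coefficients gamma, beta > 0 and the penalization function lambda(t)
% are placeholders and may differ from the system studied in the paper.
\begin{equation*}
  \ddot{x}(t) + \gamma\,\dot{x}(t)
  + \beta\,\nabla^{2} f\bigl(x(t)\bigr)\,\dot{x}(t)
  + \nabla f\bigl(x(t)\bigr)
  + \lambda(t)\,\nabla g\bigl(x(t)\bigr) = 0,
  \qquad x(0) = x_{0},\quad \dot{x}(0) = v_{0}.
\end{equation*}
```

Here $\beta\,\nabla^{2} f(x(t))\,\dot{x}(t)$ is the Hessian-driven damping term and $\lambda(t)\,\nabla g(x(t))$ is the penalty term attached to the constraint function $g$; in penalty-type dynamics of this kind one typically requires $\lambda(t)\to+\infty$ so that the trajectories approach $\operatorname{argmin} g$.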
| Original language | English |
| --- | --- |
| Pages (from-to) | 1265-1277 |
| Number of pages | 13 |
| Journal | Optimization |
| Volume | 68 |
| Issue number | 7 |
| Early online date | 21 Mar 2018 |
| DOIs | |
| Publication status | Published - 2019 |
Austrian Fields of Science 2012
- 101002 Analysis
- 101016 Optimisation
- 101027 Dynamical systems
Keywords
- Asymptotic behavior
- Convex minimization
- Convex optimization
- Dynamical systems
- Equations
- Forward-backward
- Lyapunov analysis
- Maximal monotone operators
- Newton dynamics
- Nonautonomous systems
- Optimization
- Penalization
- Schemes