We consider nonconvex constrained optimization problems and propose a new approach to the convergence analysis based on penalty functions. We make use of classical penalty functions in an unconventional way, in that penalty functions only enter in the theoretical analysis of convergence while the algorithm itself is penalty free. Based on this idea, we are able to establish several new results, including the first general analysis for diminishing stepsize methods in nonconvex, constrained optimization, showing convergence to generalized stationary points, and a complexity study for sequential quadratic programming–type algorithms.
Mathematics of Operations Research, Vol. 46 (2021), pp. 595–627
Ghost Penalties in Nonconvex Constrained Optimization: Diminishing Stepsizes and Iteration Complexity (01a Journal article)
Facchinei Francisco, Kungurtsev Vyacheslav, Lampariello Lorenzo, Scutari Gesualdo