
Publication details

2020, OPTIMIZATION METHODS & SOFTWARE, vol. 35, pp. 502-520

An Alternating Augmented Lagrangian method for constrained nonconvex optimization (01a Journal article)

Galvan G., Lapucci M., Levato T., Sciandrone M.

We consider the problem of minimizing a smooth nonconvex function over a structured convex feasible set, that is, a set defined by two groups of constraints that are easy to treat when considered separately. In order to exploit the structure of the problem, we define an equivalent formulation by duplicating the variables, and we consider the augmented Lagrangian of this reformulation. Following the idea of the Alternating Direction Method of Multipliers (ADMM), we propose an algorithm in which a two-block decomposition method is embedded within an augmented Lagrangian framework. The distinctive features of the proposed algorithm are the following: (1) the computation of the exact solution of a possibly nonconvex subproblem is not required; (2) the penalty parameter is updated only once an approximate stationary point of the augmented Lagrangian has been determined. Global convergence results are established under mild assumptions and without requiring convexity of the objective function. Although the primary aim of the paper is theoretical, we perform numerical experiments on a nonconvex problem arising in machine learning, and the obtained results show the practical advantages of the proposed approach with respect to the classical ADMM.
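A minimal sketch of a variable-splitting augmented Lagrangian scheme of the kind described in the abstract is given below. It is not the authors' exact algorithm: the oracles grad_f, proj_X, proj_Y, the inexact projected-gradient inner steps, the inner stopping test, and the penalty update rule are all illustrative assumptions introduced here to show the overall structure (duplicate the variables, alternate over the two blocks, then update the multiplier and penalty).

```python
import numpy as np

def alternating_al_sketch(grad_f, proj_X, proj_Y, x0,
                          rho=1.0, gamma=10.0, step=1e-2,
                          inner_tol=1e-4, outer_iters=20, inner_iters=500):
    """Hypothetical sketch of a variable-splitting augmented Lagrangian scheme.

    Target problem:  min f(x)  s.t.  x in X ∩ Y,
    split as:        min f(x)  s.t.  x in X, y in Y, x = y,
    with augmented Lagrangian
        L_rho(x, y, lam) = f(x) + lam^T (x - y) + (rho / 2) * ||x - y||^2.
    """
    x = proj_X(np.array(x0, dtype=float))
    y = proj_Y(np.array(x0, dtype=float))
    lam = np.zeros_like(x)

    for _ in range(outer_iters):
        # Inner loop: two-block alternating (inexact) minimization of L_rho.
        for _ in range(inner_iters):
            # x-block: projected-gradient step on L_rho w.r.t. x
            # (X is assumed easy to project onto).
            gx = grad_f(x) + lam + rho * (x - y)
            x = proj_X(x - step * gx)
            # y-block: exact minimizer of L_rho w.r.t. y is a projection onto Y.
            y = proj_Y(x + lam / rho)
            # Placeholder stopping test standing in for "approximate
            # stationary point of the augmented Lagrangian".
            if np.linalg.norm(x - y) <= inner_tol:
                break
        # Outer updates in classical augmented Lagrangian fashion:
        # multiplier step, then enlarge the penalty parameter.
        lam = lam + rho * (x - y)
        rho = gamma * rho
    return x
```

In the sketch, X and Y are assumed to admit cheap projections, mirroring the assumption that the two groups of constraints are easy to treat when considered separately; unlike the classical ADMM, the multiplier and penalty are updated only after the inner alternating loop has reached (approximate) stationarity.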