This paper reports two proposals for possible preconditioners for the Nonlinear Conjugate Gradient (NCG) method in large scale unconstrained optimization. On one hand, the common idea behind our preconditioners is inspired by L-BFGS quasi–Newton updates; on the other hand, we aim at explicitly approximating, in some sense, the inverse of the Hessian matrix. Since we deal with large scale optimization problems, we propose matrix–free approaches where the preconditioners are built using symmetric low–rank updating formulae. Our distinctive new contribution relies on using information on the objective function collected, as a by-product of the NCG, at previous iterations. Broadly speaking, our first approach exploits the secant equation in order to impose interpolation conditions on the objective function. In the second proposal we adopt an ad hoc modified–secant approach, in order to possibly guarantee some additional theoretical properties.
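To make the abstract concrete, the following is a minimal sketch of a preconditioned NCG iteration in which the preconditioner is applied matrix–free via an L-BFGS-style two-loop recursion over secant pairs collected at previous NCG iterations. All names (`precond_ncg`, `lbfgs_apply`), the Armijo line search, and the Fletcher–Reeves choice of the conjugacy coefficient are illustrative assumptions on our part, not the specific formulae proposed in the paper.

```python
import numpy as np

def lbfgs_apply(pairs, g):
    """Apply an L-BFGS inverse-Hessian approximation, built from the
    stored secant pairs (s, y), to the vector g (two-loop recursion).
    With no pairs stored this is the identity preconditioner."""
    q = g.copy()
    alphas = []                      # newest pair first
    for s, y in reversed(pairs):
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    if pairs:                        # initial scaling H0 = gamma * I
        s, y = pairs[-1]
        q *= (s @ y) / (y @ y)
    for (s, y), a in zip(pairs, reversed(alphas)):
        b = (y @ q) / (y @ s)
        q += (a - b) * s
    return q

def precond_ncg(f, grad, x0, iters=500, mem=5, tol=1e-8):
    """NCG with an L-BFGS-style preconditioner refreshed from
    information gathered as a by-product of the NCG iterations."""
    x, g = x0.copy(), grad(x0)
    pairs = []
    d = -lbfgs_apply(pairs, g)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (illustrative choice)
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:            # keep only curvature-safe pairs
            pairs.append((s, y))
            if len(pairs) > mem:
                pairs.pop(0)         # limited memory: drop oldest
        pg, pg_new = lbfgs_apply(pairs, g), lbfgs_apply(pairs, g_new)
        beta = (g_new @ pg_new) / (g @ pg)   # preconditioned FR coefficient
        d = -pg_new + beta * d
        if g_new @ d >= 0:           # safeguard: restart on non-descent
            d = -pg_new
        x, g = x_new, g_new
    return x

# Demo on an ill-conditioned convex quadratic f(x) = 0.5 x'Ax - b'x
n = 10
A = np.diag(np.arange(1.0, n + 1.0))
b = np.ones(n)
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x = precond_ncg(f, grad, np.zeros(n))
```

The curvature check `s @ y > 1e-12` keeps the low-rank update symmetric positive definite, so the preconditioned directions remain descent directions; the paper's modified–secant variant is aimed at guaranteeing properties of this kind by construction.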
2016, Numerical Computations: Theory and Algorithms (NUMTA–2016). Proceedings of the 2nd International Conference "Numerical Computations: Theory and Algorithms", Pages - (volume: 1776)
Preconditioning strategies for nonlinear conjugate gradient methods, based on quasi-Newton updates (04b Conference paper in volume)
Caliciotti Andrea, Fasano Giovanni, Roma Massimo
Research group: Continuous Optimization