We propose a novel parallel asynchronous algorithmic framework for the minimization of the sum of a smooth (nonconvex) function and a convex (nonsmooth) regularizer. The framework hinges on Successive Convex Approximation (SCA) techniques and on a novel probabilistic model that describes a variety of asynchronous settings in a unified way, more faithfully and comprehensively than state-of-the-art models. Key features of our framework are: i) it accommodates inconsistent reads, meaning that components of the variables may be written by some cores while being simultaneously read by others; ii) it covers several existing methods in a unified way; and iii) it accommodates a variety of parallel computing architectures. Almost sure convergence to stationary solutions is proved for the general case, and an iteration complexity analysis is given for a specific version of our model. Numerical results show that our scheme outperforms existing asynchronous ones.
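The abstract's setting can be illustrated with a hedged toy sketch (not the paper's actual algorithm): several threads minimize a smooth term plus an L1 regularizer by block proximal-gradient steps, each reading the shared iterate without locking, in the spirit of the "inconsistent read" model. The problem instance, block partition, and step size below are illustrative assumptions; the SCA surrogate is simplified to a plain proximal-gradient step.

```python
import threading
import numpy as np

# Hypothetical toy instance: minimize f(x) + g(x), with
# f(x) = 0.5 * ||A x - b||^2 (smooth) and g(x) = lam * ||x||_1 (convex, nonsmooth).
# This is a convex stand-in; the paper targets nonconvex f.
rng = np.random.default_rng(0)
n, m, lam = 20, 40, 0.1
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
step = 1.0 / np.linalg.norm(A, 2) ** 2  # safe step below 1/L for f

x = np.zeros(n)  # shared iterate: written blockwise by all workers

def soft_threshold(v, t):
    # proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def worker(block, iters):
    for _ in range(iters):
        # "Inconsistent read": snapshot may interleave with other threads' writes.
        x_read = x.copy()
        grad = A.T @ (A @ x_read - b)
        z = x_read[block] - step * grad[block]
        x[block] = soft_threshold(z, step * lam)  # write only this worker's block

blocks = np.array_split(np.arange(n), 4)
threads = [threading.Thread(target=worker, args=(blk, 500)) for blk in blocks]
for t in threads:
    t.start()
for t in threads:
    t.join()

obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
print(float(obj))
```

Despite the lock-free reads, the objective drops well below its starting value 0.5 * ||b||^2, matching the intuition that mild staleness does not prevent convergence in this regime.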
2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2017, pp. 4706-4710
Asynchronous parallel nonconvex large-scale optimization (04b Conference paper in proceedings)
Cannelli L., Facchinei F., Kungurtsev V., Scutari G.
Research group: Continuous Optimization