12/07/2020

Random extrapolation for primal-dual coordinate descent

Ahmet Alacaoglu, Olivier Fercoq, Volkan Cevher

Keywords: Optimization - Convex

Abstract: We introduce a randomly extrapolated primal-dual coordinate descent method that automatically adapts to the sparsity of the data matrix as well as to favorable structures of the objective function. Our method can update only a subset of the primal and dual variables with sparse data, and it provably admits large step sizes with dense data, retaining the benefits of methods designed specifically for each case. Beyond this key adaptivity to sparsity, our method attains fast convergence guarantees in favorable cases without any modifications. In particular, we prove linear convergence under metric subregularity, which covers strongly convex-strongly concave problems, linear programs, and piecewise linear-quadratic functions. We also show almost sure convergence of the iterate sequence and optimal sublinear convergence rates for the primal-dual gap and objective values in the worst case. Numerical evidence demonstrates state-of-the-art empirical performance in both sparse and dense settings, matching and improving on existing methods across different applications with real data.
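The abstract does not state the update rules. For context, below is a minimal sketch of the classical full-dimensional baseline: a primal-dual method with extrapolation (Chambolle-Pock) applied to a small Lasso instance, min_x 0.5*||Ax - b||^2 + lam*||x||_1, via its saddle-point form. The paper's contribution, roughly, is to replace the full extrapolation step with a random, coordinate-wise one that exploits sparsity of A; the problem instance, step sizes, and variable names here are our illustrative choices, not the authors' exact algorithm.

import numpy as np

# Illustrative sketch only (not the paper's method): Chambolle-Pock
# primal-dual iterations with extrapolation for the Lasso,
#   min_x 0.5*||A x - b||^2 + lam*||x||_1,
# using the saddle-point form  min_x max_y <Ax, y> - h*(y) + lam*||x||_1,
# where h(z) = 0.5*||z - b||^2 and its conjugate is
# h*(y) = 0.5*||y||^2 + <b, y>.

rng = np.random.default_rng(0)
n, d, lam = 50, 20, 0.1
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

L = np.linalg.norm(A, 2)       # operator norm ||A||
tau = sigma = 0.9 / L          # step sizes satisfying tau*sigma*||A||^2 < 1

x = np.zeros(d)
y = np.zeros(n)
x_bar = x.copy()               # extrapolated primal point

for k in range(2000):
    # Dual step: prox of sigma*h*, which has the closed form
    # prox_{sigma h*}(v) = (v - sigma*b) / (1 + sigma).
    y = (y + sigma * (A @ x_bar) - sigma * b) / (1.0 + sigma)
    # Primal step: prox of tau*lam*||.||_1 is soft-thresholding.
    x_new = x - tau * (A.T @ y)
    x_new = np.sign(x_new) * np.maximum(np.abs(x_new) - tau * lam, 0.0)
    # Extrapolation with theta = 1; the paper randomizes this step
    # coordinate-wise, which is what lets it touch only a subset of
    # coordinates when A is sparse.
    x_bar = 2.0 * x_new - x
    x = x_new

obj = 0.5 * np.linalg.norm(A @ x - b) ** 2 + lam * np.abs(x).sum()
print(f"objective after 2000 iterations: {obj:.6f}")

In this deterministic baseline, every iteration computes full matrix-vector products with A; the randomly extrapolated coordinate variant described in the abstract is designed to avoid exactly that cost with sparse data while keeping comparable step sizes with dense data.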
