09/07/2020

From Nesterov's Estimate Sequence to Riemannian Acceleration

Kwangjun Ahn, Suvrit Sra

Keywords: Non-convex optimization, Approximation algorithms, Convex optimization

Abstract: We propose the first global accelerated gradient method for Riemannian manifolds. Toward establishing our results, we revisit Nesterov's estimate sequence technique and develop a conceptually simple alternative from first principles. We then extend our analysis to Riemannian acceleration, localizing the key difficulty into "metric distortion." We control this distortion via a novel geometric inequality, which enables us to formulate and analyze global Riemannian acceleration.
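
For background, the sketch below shows the classical Euclidean Nesterov accelerated gradient method that the paper generalizes to Riemannian manifolds. It is a minimal illustration with the standard momentum schedule for an L-smooth convex objective; the function name `nesterov_agd` and the toy quadratic are illustrative assumptions, and this is not the paper's Riemannian algorithm or its estimate-sequence analysis.

```python
import numpy as np

def nesterov_agd(grad, x0, L, num_iters=100):
    """Classical (Euclidean) Nesterov accelerated gradient descent
    for an L-smooth convex objective, with the standard momentum schedule."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(num_iters):
        x_next = y - grad(y) / L                 # gradient step at the extrapolated point
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum / extrapolation step
        x, t = x_next, t_next
    return x

if __name__ == "__main__":
    # Toy example: minimize f(x) = 0.5 * x^T A x with A positive definite.
    A = np.diag([1.0, 10.0, 100.0])
    grad = lambda x: A @ x
    L = 100.0                                    # smoothness constant = largest eigenvalue of A
    x_star = nesterov_agd(grad, x0=np.ones(3), L=L, num_iters=200)
    print(x_star)                                # should be close to the zero vector
```

The Riemannian setting replaces the straight-line extrapolation above with manifold operations (e.g., exponential maps), and the paper's contribution is controlling the resulting metric distortion so that global acceleration still holds.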

The talk and the paper were published at the COLT 2020 virtual conference.
