03/08/2020

Amortized Nesterov’s Momentum: A Robust Momentum and Its Application to Deep Learning

Kaiwen Zhou, Yanghua Jin, Qinghua Ding, James Cheng

Abstract: This work proposes a novel momentum technique, the Amortized Nesterov's Momentum, for stochastic convex optimization. The proposed method can be regarded as a smooth transition between Nesterov's method and mirror descent. By tuning only a single parameter, users can trade Nesterov's acceleration for robustness, that is, for control of the variance of the stochastic noise. Motivated by the recent success of using momentum in deep learning, we conducted extensive experiments to evaluate this new momentum in deep learning tasks. The results suggest that it can serve as a favorable alternative to Nesterov's momentum.
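To make the idea in the abstract concrete, below is a minimal sketch of an "amortized" momentum scheme: plain SGD steps are taken for m iterations, and a single momentum correction is applied once per block. Small m behaves like a standard momentum method, while large m approaches plain SGD (mirror descent in the Euclidean setting). The function name, the snapshot variable, the hyperparameter values, and the exact update rule here are illustrative assumptions based on the abstract's description, not the authors' exact algorithm; consult the paper for the precise method.

```python
import numpy as np

def amortized_momentum_sgd(grad, x0, lr=0.1, beta=0.9, m=5, n_blocks=200, rng=None):
    """Sketch of amortized momentum: m plain SGD steps per block,
    followed by one momentum jump per block.

    grad: callable grad(x, rng) returning a stochastic gradient at x.
    m:    amortization length; the single parameter trading
          acceleration (small m) for robustness (large m).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0.astype(float).copy()
    x_snap = x.copy()                # snapshot kept from the previous block
    for _ in range(n_blocks):
        avg = np.zeros_like(x)
        for _ in range(m):           # m plain SGD steps, no momentum inside
            x -= lr * grad(x, rng)
            avg += x / m             # running average of this block's iterates
        # one amortized momentum step per block, along the block's net movement
        x += beta * (avg - x_snap)
        x_snap = avg
    return x

# Usage on a noisy quadratic f(x) = 0.5 * ||x||^2 (hypothetical test problem):
if __name__ == "__main__":
    noisy_grad = lambda x, rng: x + 0.1 * rng.standard_normal(x.shape)
    x_out = amortized_momentum_sgd(noisy_grad, np.ones(10), m=5)
    print(np.linalg.norm(x_out))    # should be small: iterates near the optimum at 0
```

With m = 1 the block average is just the latest iterate, so every step receives a momentum correction, as in conventional momentum SGD; increasing m applies the correction less often, which dampens the variance contributed by the momentum term.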

This talk and the accompanying paper were presented at the UAI 2020 virtual conference.
