09/07/2020

Near-Optimal Methods for Minimizing Star-Convex Functions and Beyond

Oliver Hinder, Aaron Sidford, Nimit S. Sohoni

Keywords: Non-convex optimization

Abstract: In this paper, we provide near-optimal accelerated first-order methods for minimizing a broad class of smooth nonconvex functions that are unimodal on all lines through a minimizer. This function class, which we call the class of smooth quasar-convex functions, is parameterized by a constant $$\gamma \in (0,1]$$: $$\gamma = 1$$ encompasses the classes of smooth convex and star-convex functions, and smaller values of $$\gamma$$ indicate that the function can be "more nonconvex." We develop a variant of accelerated gradient descent that computes an $$\epsilon$$-approximate minimizer of a smooth $$\gamma$$-quasar-convex function with at most $$O(\gamma^{-1} \epsilon^{-1/2} \log(\gamma^{-1} \epsilon^{-1}))$$ total function and gradient evaluations. We also derive a lower bound of $$\Omega(\gamma^{-1} \epsilon^{-1/2})$$ on the worst-case number of gradient evaluations required by any deterministic first-order method, showing that, up to a logarithmic factor, no deterministic first-order method can improve upon ours.
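For context, the quasar-convexity condition can be sketched as follows (this is the standard formulation, stated here for illustration, with $$x^*$$ denoting a minimizer of $$f$$): a differentiable function $$f$$ is $$\gamma$$-quasar-convex with respect to $$x^*$$ if, for every point $$x$$,

$$f(x^*) \geq f(x) + \frac{1}{\gamma} \nabla f(x)^\top (x^* - x).$$

Taking $$\gamma = 1$$ recovers star-convexity (and is implied by convexity), while smaller values of $$\gamma$$ weaken the inequality and permit more nonconvex behavior along lines through the minimizer.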

The talk and the corresponding paper were published at the COLT 2020 virtual conference.

