Revisiting High-Resolution ODEs for Faster Convergence Rates

Date:


In this talk, we analyze first-order accelerated methods for the unconstrained minimization of smooth convex and strongly convex functions from a continuous-time perspective. We use high-resolution ordinary differential equations (ODEs) to understand the behavior of Nesterov's accelerated gradient algorithm. We propose a new general first-order accelerated method for strongly convex functions. For convex functions, we exploit the variational perspective on high-resolution ODEs. Our analysis improves on several known convergence rates.
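
As a hedged reference point (not necessarily the exact model analyzed in the talk), the high-resolution ODE that Shi, Du, Jordan, and Su associate with Nesterov's accelerated gradient method for a \mu-strongly convex function f, with step size s, reads

\[
\ddot X(t) + 2\sqrt{\mu}\,\dot X(t) + \sqrt{s}\,\nabla^2 f(X(t))\,\dot X(t) + \big(1 + \sqrt{\mu s}\big)\,\nabla f(X(t)) = 0 .
\]

The gradient-correction term \sqrt{s}\,\nabla^2 f(X)\,\dot X, of order \sqrt{s}, is what distinguishes the high-resolution ODE from its low-resolution (s \to 0) limit and is the ingredient credited with capturing acceleration at the level of the discrete algorithm.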