Abstract: |
This talk focuses on analyzing and developing accelerated optimization methods for the class of smooth and strongly convex functions. We derive a family of optimal gradient-based methods (named OERKD) for quadratic programming by building a mapping between the continuous dynamics and the discrete updates of optimization algorithms. Optimality of the convergence rates is proved by analyzing explicit Runge-Kutta methods applied to the gradient flow equation under linear stability conditions. Experiments demonstrate the effectiveness of the proposed algorithm even on classical nonlinear problems. A noteworthy byproduct is a proof of the asymptotic equivalence between OERKD and Polyak's heavy ball method, thereby bridging two primary integration schemes.
|