On The Convergence of Euler Discretization of Finite-Time Convergent Gradient Flows (2010.02990v5)
Abstract: In this study, we investigate the performance of two novel first-order optimization algorithms, namely the rescaled-gradient flow (RGF) and the signed-gradient flow (SGF). These algorithms are derived from the forward Euler discretization of finite-time convergent flows, i.e., non-Lipschitz dynamical systems that locally converge to the minima of gradient-dominated functions. We first characterize the closeness of the discretizations to the continuous flows, and then present (linear) convergence guarantees for the discrete algorithms, in both the deterministic and the stochastic case. Furthermore, when the problem parameters are unknown or non-uniform, we integrate a line-search strategy with RGF/SGF and provide a convergence analysis in this setting. We then apply the proposed algorithms to academic examples and to deep neural network training; our results show that the proposed schemes converge faster than standard optimization alternatives.
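To make the abstract's construction concrete, below is a minimal sketch of forward-Euler discretizations of the two flows. The abstract does not state the exact flow definitions, so the specific forms used here (a norm-rescaled gradient for RGF with exponent parameter `p`, and an elementwise sign for SGF), along with the step size `eta` and the regularizer `eps`, are illustrative assumptions rather than the paper's definitions.

```python
import numpy as np

def rgf_step(x, grad_f, eta=0.05, p=4, eps=1e-12):
    """One forward-Euler step of an assumed rescaled-gradient flow:
    x' = -grad f(x) / ||grad f(x)||^((p-2)/(p-1))."""
    g = grad_f(x)
    norm = np.linalg.norm(g)
    # eps guards against division by zero when the gradient vanishes.
    return x - eta * g / (norm ** ((p - 2) / (p - 1)) + eps)

def sgf_step(x, grad_f, eta=0.05):
    """One forward-Euler step of an assumed signed-gradient flow:
    x' = -sign(grad f(x)), applied elementwise."""
    return x - eta * np.sign(grad_f(x))

# Toy example: a gradient-dominated quadratic f(x) = 0.5 * ||x||^2.
grad_f = lambda x: x
x_rgf = np.array([2.0, -1.5])
x_sgf = np.array([2.0, -1.5])
for _ in range(200):
    x_rgf = rgf_step(x_rgf, grad_f)
    x_sgf = sgf_step(x_sgf, grad_f)
print("RGF iterate:", x_rgf, " SGF iterate:", x_sgf)
```

Note that the right-hand sides of both flows are non-Lipschitz at stationary points, so a fixed-step Euler scheme can chatter near the minimizer (visible with the sign-based update above); this is precisely the kind of discretization error the paper's closeness and convergence analysis addresses.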