A proximal-gradient inertial algorithm with Tikhonov regularization: strong convergence to the minimal norm solution (2407.10350v1)
Abstract: We investigate the strong convergence properties of a proximal-gradient inertial algorithm with two Tikhonov regularization terms in connection with the minimization problem of the sum of a convex lower semicontinuous function $f$ and a smooth convex function $g$. For an appropriate setting of the parameters we show strong convergence of the generated sequence $(x_k)$ to the minimum norm minimizer of the objective function $f+g$. Further, we obtain fast convergence to zero of the objective function values along the generated sequence, as well as of the discrete velocity and of a subgradient of the objective function. We also show that, for another setting of the parameters, the optimal rate of order $\mathcal{O}(k^{-2})$ for the potential energy $(f+g)(x_k)-\min(f+g)$ can be obtained.
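To make the type of iteration discussed in the abstract concrete, the following is a minimal sketch of a generic inertial (Nesterov-type) proximal-gradient step with a single vanishing Tikhonov term $\varepsilon_k \|x\|^2/2$ added to the smooth part. It is not the paper's exact two-term scheme; the function names, the parameter choices `alpha`, `eps0`, `q`, and the toy $\ell_1$-regularized least-squares problem are all illustrative assumptions.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding); stands in for prox_{t f}."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_prox_grad_tikhonov(grad_g, prox_f, x0, step,
                                alpha=3.0, eps0=1.0, q=1.0, iters=500):
    """Illustrative template (not the paper's two-term scheme):
        y_k     = x_k + (k / (k + alpha)) * (x_k - x_{k-1})        # inertia
        x_{k+1} = prox_{step f}(y_k - step*(grad_g(y_k) + eps_k*y_k))
    with eps_k = eps0 / k**q -> 0, which drives the iterates toward
    the minimal-norm minimizer of f + g.
    """
    x_prev = x0.copy()
    x = x0.copy()
    for k in range(1, iters + 1):
        eps_k = eps0 / k**q                        # vanishing Tikhonov parameter
        y = x + (k / (k + alpha)) * (x - x_prev)   # inertial extrapolation
        x_prev = x
        x = prox_f(y - step * (grad_g(y) + eps_k * y), step)
    return x

if __name__ == "__main__":
    # Toy instance: min_x 0.5*||A x - b||^2 + ||x||_1  (g smooth, f = l1-norm)
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 50))
    b = rng.standard_normal(20)
    L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of grad g
    grad_g = lambda x: A.T @ (A @ x - b)
    # step chosen for the regularized gradient, whose Lipschitz constant is at most L + eps0
    x_sol = inertial_prox_grad_tikhonov(grad_g, prox_l1, np.zeros(50), step=1.0 / (L + 1.0))
    print("objective:", 0.5 * np.linalg.norm(A @ x_sol - b) ** 2 + np.abs(x_sol).sum())
```

With a fixed $\varepsilon_k \equiv 0$ this reduces to a standard inertial proximal-gradient (FISTA-like) iteration; the vanishing Tikhonov term is what allows selection of the minimum norm minimizer among all solutions.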