Emergent Mind

Hessian Initialization Strategies for L-BFGS Solving Non-linear Inverse Problems

(arXiv:2103.10010)
Published Mar 18, 2021 in math.NA, cs.NA, and math.OC

Abstract

L-BFGS is the state-of-the-art optimization method for many large-scale inverse problems. It has a small memory footprint and achieves superlinear convergence. The method approximates the Hessian from an initial approximation and an update rule that models current local curvature information. The initial approximation greatly affects the scaling of the search direction and the overall convergence of the method. We propose a novel, simple, and effective way to initialize the Hessian. Typically, the objective function is a sum of a data-fidelity term and a regularizer. Often, the Hessian of the data-fidelity term is computationally challenging, but the regularizer's Hessian is easy to compute. We replace the Hessian of the data-fidelity term with a scalar and keep the Hessian of the regularizer to initialize the Hessian approximation at every iteration. The scalar satisfies the secant equation in the sense of ordinary and total least squares and geometric mean regression. Our new strategy not only leads to faster convergence, but the quality of the numerical solutions is generally superior to that of simple scaling-based strategies. Specifically, the proposed schemes based on the ordinary least squares formulation and geometric mean regression outperform the state-of-the-art schemes. Implementing our strategy requires only a small change to a standard L-BFGS code. Our experiments on convex quadratic problems and non-convex image registration problems confirm the effectiveness of the proposed approach.
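The idea in the abstract can be sketched concretely. If the initial Hessian approximation is taken as B0 = delta*I + H_R, where H_R is the regularizer's (easy) Hessian, then the secant equation B0 s ≈ y reduces to fitting a scalar delta so that delta*s ≈ y - H_R s. The sketch below shows three standard single-slope regression fits for such a scalar (ordinary least squares, an inverse regression of s on the residual, and their geometric mean); the function name `secant_scalars` and the exact form of the fits are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def secant_scalars(s, y, reg_hess):
    """Candidate scalars delta for the initialization B0 = delta*I + reg_hess.

    s        : step x_{k+1} - x_k
    y        : gradient difference g_{k+1} - g_k
    reg_hess : Hessian of the regularizer (assumed cheap to apply)

    Fits delta so that delta*s approximates y - reg_hess @ s
    (the curvature attributed to the data-fidelity term).
    """
    y_hat = y - reg_hess @ s                       # residual curvature for the data term
    ss, sy, yy = s @ s, s @ y_hat, y_hat @ y_hat
    delta_ols = sy / ss                            # ordinary least squares slope through the origin
    delta_inv = yy / sy                            # inverse regression: fit s on y_hat
    delta_gmr = np.sign(sy) * np.sqrt(yy / ss)     # geometric mean of the two slopes
    return delta_ols, delta_inv, delta_gmr
```

On an exactly quadratic objective with data-fidelity Hessian delta*I, the residual y - reg_hess @ s equals delta*s and all three estimates coincide; on general problems they differ and give the cheap/expensive trade-offs the abstract compares.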
