
Abstract

In this work, we propose two Hermite-type optimization methods, Hermite least squares and Hermite BOBYQA, specialized for the case in which some partial derivatives of the objective function are available and others are not. The main objective is to reduce the number of objective function calls while maintaining the convergence properties. Both methods are modifications of Powell's derivative-free BOBYQA algorithm. However, instead of (underdetermined) interpolation for building the quadratic subproblem in each iteration, the training data are enriched with first- and, where available, second-order derivatives, and (weighted) least squares regression is used. Proofs of global convergence are discussed and numerical results are presented. Further, the applicability is verified for a realistic test case in the context of yield optimization. Numerical tests show that the Hermite least squares approach outperforms classic BOBYQA if half or more of the partial derivatives are available. In addition, the Hermite-type approaches achieve greater robustness and thus better performance in the case of noisy objective functions.
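To make the model-building step concrete, below is a minimal sketch (not the authors' reference implementation) of the Hermite least squares idea: a quadratic model m(x) = c + g'x + 0.5 x'Hx is fitted, in the least squares sense, to sampled function values enriched with whatever partial derivatives happen to be available. The function name hermite_ls_model and the toy data are illustrative assumptions; the paper's actual method additionally embeds such a fit, with weighted regression, inside BOBYQA's trust-region loop.

    import numpy as np

    def hermite_ls_model(points, f_vals, grads):
        """Least-squares fit of a quadratic model m(x) = c + g.x + 0.5 x.H.x
        to function values and the available partial derivatives.
        grads[k][d] is df/dx_d at points[k], or None if that derivative
        is unavailable there. (Illustrative sketch, not the paper's code.)"""
        pts = np.asarray(points, dtype=float)
        m, n = pts.shape
        # Unknowns: [c, g_1..g_n, upper triangle of H, row by row].
        idx = [(i, j) for i in range(n) for j in range(i, n)]
        rows, rhs = [], []
        for k in range(m):
            x = pts[k]
            # One value equation per sample point: m(x_k) = f(x_k).
            quad = [(0.5 if i == j else 1.0) * x[i] * x[j] for i, j in idx]
            rows.append(np.concatenate(([1.0], x, quad)))
            rhs.append(f_vals[k])
            # One extra equation per available partial derivative.
            for d in range(n):
                if grads[k][d] is None:
                    continue
                drow = np.zeros(1 + n + len(idx))
                drow[1 + d] = 1.0  # gradient entry g_d
                for c_, (i, j) in enumerate(idx):
                    if i == d and j == d:
                        drow[1 + n + c_] = x[d]
                    elif i == d:
                        drow[1 + n + c_] = x[j]
                    elif j == d:
                        drow[1 + n + c_] = x[i]
                rows.append(drow)
                rhs.append(grads[k][d])
        # Plain least squares; the paper's weighted variant would scale each
        # row of A and entry of b by the square root of its weight.
        A, b = np.vstack(rows), np.asarray(rhs, dtype=float)
        p, *_ = np.linalg.lstsq(A, b, rcond=None)
        c, g = p[0], p[1:1 + n]
        H = np.zeros((n, n))
        for c_, (i, j) in enumerate(idx):
            H[i, j] = H[j, i] = p[1 + n + c_]
        return c, g, H

    # Toy usage on f(x) = x1**2 + x2, with df/dx1 known and df/dx2 missing:
    pts = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
    fvals = [0.0, 1.0, 1.0, 2.0]
    dfs = [[0.0, None], [2.0, None], [0.0, None], [2.0, None]]
    c, g, H = hermite_ls_model(pts, fvals, dfs)  # recovers c=0, g=(0,1), H=diag(2,0)

Each known partial derivative contributes one extra equation at no additional objective function call, so the system reaches the (n+1)(n+2)/2 unknowns of the quadratic model with fewer sample points than interpolation would need; this is the mechanism behind the reported savings in objective function calls.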
