Deep Neural Network Accelerated Implicit Filtering

(2105.08883)
Published May 19, 2021 in math.OC, cs.SY, and eess.SY

Abstract

In this paper, we illustrate a novel method for solving optimization problems when derivatives are not explicitly available. We show that combining implicit filtering (IF), an existing derivative-free optimization (DFO) method, with a deep neural network global approximator leads to an accelerated DFO method. Derivative-free optimization problems occur in a wide variety of applications, including simulation-based optimization and the optimization of stochastic processes, and naturally arise when the objective function can be viewed as a black box, such as a computer simulation. We highlight the practical value of our method, which we call deep neural network accelerated implicit filtering (DNNAIF), by demonstrating its ability to help solve the coverage-directed generation (CDG) problem. Solving the CDG problem is a key part of the design and verification process for new electronic circuits, including the chips that power modern servers and smartphones.
