
Abstract

We study stochastic zeroth-order gradient and Hessian estimators for real-valued functions in $\mathbb{R}^n$. We show that, by taking finite differences along random orthogonal directions, the variance of stochastic finite-difference estimators can be significantly reduced. In particular, we design estimators for smooth functions such that, if one uses $\Theta(k)$ random directions sampled from the Stiefel manifold $\text{St}(n,k)$ and finite-difference granularity $\delta$, the variance of the gradient estimator is bounded by $\mathcal{O}\left( \left( \frac{n}{k} - 1 \right) + \left( \frac{n^2}{k} - n \right) \delta^2 + \frac{n^2 \delta^4}{k} \right)$, and the variance of the Hessian estimator is bounded by $\mathcal{O}\left( \left( \frac{n^2}{k^2} - 1 \right) + \left( \frac{n^4}{k^2} - n^2 \right) \delta^2 + \frac{n^4 \delta^4}{k^2} \right)$. When $k = n$, the variances become negligibly small. In addition, we provide improved bias bounds for the estimators. The bias of both the gradient and Hessian estimators for a smooth function $f$ is of order $\mathcal{O}\left( \delta^2 \Gamma \right)$, where $\delta$ is the finite-difference granularity and $\Gamma$ depends on high-order derivatives of $f$. Our results are supported by empirical observations.
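
As a rough illustration of the construction described above (not the paper's exact estimator), the sketch below draws $k$ orthonormal directions uniformly from $\text{St}(n,k)$ and combines central finite differences along those directions into a gradient estimate. The helper names, the central-difference choice, and the $n/k$ rescaling are assumptions made for this example.

```python
import numpy as np

def sample_stiefel(n, k, rng):
    """Draw a uniformly random orthonormal k-frame from St(n, k) via QR of a Gaussian matrix."""
    A = rng.standard_normal((n, k))
    Q, R = np.linalg.qr(A)
    # Sign correction so the frame is Haar-distributed rather than biased by the QR convention.
    return Q * np.sign(np.diag(R))

def zeroth_order_gradient(f, x, k, delta, rng):
    """Finite-difference gradient estimate along k random orthogonal directions.

    Returns (n / k) * sum_i [(f(x + delta v_i) - f(x - delta v_i)) / (2 delta)] * v_i,
    where v_1, ..., v_k are the columns of a random Stiefel frame. The n/k factor
    compensates for projecting onto only k of the n directions (an assumed scaling,
    consistent with E[V V^T] = (k / n) * I for a uniform frame).
    """
    n = x.shape[0]
    V = sample_stiefel(n, k, rng)          # columns are orthonormal directions
    g = np.zeros(n)
    for i in range(k):
        v = V[:, i]
        diff = (f(x + delta * v) - f(x - delta * v)) / (2.0 * delta)
        g += diff * v
    return (n / k) * g

# Example: estimate the gradient of a simple quadratic; when k = n the estimate
# is close to the true gradient x, matching the small-variance regime.
rng = np.random.default_rng(0)
f = lambda x: 0.5 * np.dot(x, x)
x = rng.standard_normal(20)
print(zeroth_order_gradient(f, x, k=20, delta=1e-4, rng=rng))
```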
