A Cramér-Rao Type Bound for Bayesian Risk with Bregman Loss

(arXiv:2001.10982)

Published Jan 29, 2020 in cs.IT, math.IT, math.ST, and stat.TH

Abstract

We derive a general class of Bayesian lower bounds for the case where the underlying loss function is a Bregman divergence. This class can be viewed as an extension of the Weinstein–Weiss family of bounds for the mean squared error and relies on a variational characterization of the Bayesian risk. The approach yields a version of the Cramér–Rao bound tailored to a given Bregman divergence. This generalization reduces to the classical Cramér–Rao bound when the loss function is the squared Euclidean norm. The effectiveness of the new bound is evaluated in the Poisson and Binomial noise settings.
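As a quick illustration of the loss family the abstract refers to: a Bregman divergence is defined from a convex generator F as D_F(x, y) = F(x) − F(y) − ⟨∇F(y), x − y⟩. The sketch below (not from the paper; function names and test values are illustrative) checks two standard special cases: F(x) = ‖x‖² recovers the squared Euclidean loss, under which the new bound reduces to the classical Cramér–Rao bound, and the negative-entropy generator yields the generalized KL divergence, the natural loss in the Poisson setting mentioned in the abstract.

```python
import numpy as np

def bregman(F, gradF, x, y):
    """Bregman divergence D_F(x, y) = F(x) - F(y) - <gradF(y), x - y>."""
    return F(x) - F(y) - np.dot(gradF(y), x - y)

# Case 1: F(x) = ||x||^2 recovers the squared Euclidean distance (MSE loss).
sq = lambda v: np.dot(v, v)
grad_sq = lambda v: 2.0 * v

x = np.array([1.0, 2.0])
y = np.array([0.5, -1.0])
assert np.isclose(bregman(sq, grad_sq, x, y), np.sum((x - y) ** 2))

# Case 2: F(x) = sum_i x_i log x_i (negative entropy, x > 0) gives the
# generalized Kullback-Leibler divergence.
negent = lambda v: np.sum(v * np.log(v))
grad_negent = lambda v: np.log(v) + 1.0

p = np.array([1.0, 3.0])
q = np.array([2.0, 2.0])
gen_kl = np.sum(p * np.log(p / q) - p + q)
assert np.isclose(bregman(negent, grad_negent, p, q), gen_kl)
```

Both assertions follow directly from expanding the definition of D_F for the chosen generator; no result specific to this paper is used.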

