Emergent Mind

Asymptotically Tight Bayesian Cramér-Rao Bound

(arXiv:2311.13834)
Published Nov 23, 2023 in cs.IT, eess.SP, and math.IT

Abstract

Performance bounds for parameter estimation play a crucial role in statistical signal processing theory and applications. Two widely recognized bounds are the Cramér-Rao bound (CRB) in the non-Bayesian framework and the Bayesian CRB (BCRB) in the Bayesian framework. However, unlike the CRB, the BCRB is asymptotically unattainable in general, and its equality condition is restrictive. This paper introduces an extension of the Bobrovsky–Mayer-Wolf–Zakai class of bounds, also known as the weighted BCRB (WBCRB). The WBCRB is optimized by tuning the weighting function in the scalar case. Based on this result, we propose an asymptotically tight version of the bound, called the AT-BCRB. We prove that the AT-BCRB is asymptotically attained by the maximum a posteriori probability (MAP) estimator. Furthermore, we extend the WBCRB and the AT-BCRB to the case of vector parameters. The proposed bounds are evaluated in several fundamental signal processing examples, such as variance estimation of a white Gaussian process, direction-of-arrival estimation, and mean estimation of a Gaussian process with unknown variance and prior statistical information. It is shown that, unlike the BCRB, the proposed bounds are asymptotically attainable and coincide with the expected CRB (ECRB). The ECRB, which imposes uniform unbiasedness, cannot serve as a valid lower bound in the Bayesian framework, whereas the proposed bounds are valid for any estimator.
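As a point of contrast to the abstract's claim that the BCRB equality condition is restrictive, one case where it does hold is the linear Gaussian model: observations x_i ~ N(θ, σ²) with a Gaussian prior θ ~ N(0, σ_p²), where the MAP estimator (which coincides with the posterior mean) attains the BCRB exactly. The following Monte Carlo sketch illustrates this; it is not taken from the paper, and all parameter values are illustrative assumptions.

```python
import numpy as np

# Illustrative (assumed) parameters, not taken from the paper.
rng = np.random.default_rng(0)
n, sigma2, sigma_p2 = 10, 1.0, 1.0   # sample size, noise variance, prior variance
trials = 200_000

# BCRB for this linear Gaussian model: inverse of the sum of the
# expected Fisher information (n / sigma^2) and the prior information (1 / sigma_p^2).
bcrb = 1.0 / (n / sigma2 + 1.0 / sigma_p2)

# Monte Carlo estimate of the Bayesian MSE of the MAP estimator.
theta = rng.normal(0.0, np.sqrt(sigma_p2), trials)                 # draw from the prior
x = theta[:, None] + rng.normal(0.0, np.sqrt(sigma2), (trials, n)) # noisy observations
theta_map = x.sum(axis=1) / (n + sigma2 / sigma_p2)                # MAP = posterior mean
mse = np.mean((theta_map - theta) ** 2)

print(f"BCRB = {bcrb:.4f}, Monte Carlo MSE of MAP = {mse:.4f}")
```

In nonlinear problems such as variance estimation, this equality generally fails, which is the gap the WBCRB and AT-BCRB proposed in the paper are designed to close.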

