
Abstract

The minimum mean-square error (MMSE) achievable by optimal estimation of a random variable $Y\in\mathbb{R}$ given another random variable $X\in\mathbb{R}^{d}$ is of much interest in a variety of statistical settings. In the context of estimation-theoretic privacy, the MMSE has been proposed as an information leakage measure that captures an adversary's ability to estimate $Y$ upon observing $X$. In this paper, we establish provable lower bounds for the MMSE based on a two-layer neural network estimator of the MMSE and the Barron constant of an appropriate function of the conditional expectation of $Y$ given $X$. Furthermore, we derive a general upper bound for the Barron constant that, when $X\in\mathbb{R}$ is post-processed by the additive Gaussian mechanism and $Y$ is binary, produces order-optimal estimates in the large-noise regime. To obtain numerical lower bounds for the MMSE in concrete applications, we introduce an efficient optimization procedure that approximates the value of the proposed neural network estimator. Overall, we provide effective machinery for obtaining provable lower bounds for the MMSE.
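The logic behind such lower bounds can be sketched from the abstract alone. For any estimator $f$, the orthogonality of the conditional expectation gives $\mathbb{E}[(Y-f(X))^2] = \mathrm{mmse}(Y|X) + \mathbb{E}[(\mathbb{E}[Y|X]-f(X))^2]$, so minimizing the mean-square error over a class of two-layer networks with $k$ neurons overshoots the MMSE by exactly the class's approximation error of $\mathbb{E}[Y|X]$, which Barron-type theorems bound in terms of a Barron constant $C$ (on the order of $(2C)^2/k$). The snippet below is a minimal, hypothetical illustration of this idea, not the paper's actual procedure: the data model (binary $Y$ observed through the additive Gaussian mechanism), the network width `k`, the Adam-based training loop, and the placeholder value `C_barron` are all assumptions made for the sketch; bounding the Barron constant rigorously is precisely the paper's contribution.

```python
# Hedged sketch (not the paper's exact procedure): estimate the MMSE of Y
# given X with a two-layer ReLU network, then form a heuristic lower bound
# via a Barron-type approximation bound. Data model, width k, training
# schedule, and C_barron are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy instance of the privacy setting in the abstract: binary Y observed
# through the additive Gaussian mechanism, X = Y + sigma * N(0, 1).
n, sigma = 20_000, 2.0
Y = torch.randint(0, 2, (n, 1)).float()
X = Y + sigma * torch.randn(n, 1)

k = 64  # number of hidden neurons in the two-layer network
net = nn.Sequential(nn.Linear(1, k), nn.ReLU(), nn.Linear(k, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

for _ in range(2_000):
    opt.zero_grad()
    loss = ((net(X) - Y) ** 2).mean()  # empirical MSE of the network
    loss.backward()
    opt.step()

with torch.no_grad():
    # Approximates the minimal MSE over the k-neuron class.
    mse_nn = ((net(X) - Y) ** 2).mean().item()

# E[(Y - f(X))^2] = mmse(Y|X) + E[(E[Y|X] - f(X))^2], and a Barron-type
# bound controls the second term by (2C)^2 / k when the relevant function
# of E[Y|X] has Barron constant C. Ignoring statistical error (which the
# paper also controls), this gives mmse(Y|X) >= mse_nn - (2C)^2 / k.
C_barron = 1.0  # placeholder; bounding this constant is the paper's focus
lower_bound = mse_nn - (2 * C_barron) ** 2 / k
print(f"NN estimate of the minimal MSE: {mse_nn:.4f}")
print(f"Heuristic MMSE lower bound:     {lower_bound:.4f}")
```

In this toy setting, a larger noise level `sigma` drives the MMSE toward the prior variance of $Y$ (here $1/4$), which is the large-noise regime in which the paper's Barron-constant bound is order-optimal.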
