Simulation of Random Variables under Rényi Divergence Measures of All Orders

(1805.12451)
Published May 31, 2018 in cs.IT, cs.CR, math.IT, and math.PR

Abstract

The random variable simulation problem consists in using a $k$-dimensional i.i.d. random vector $X^{k}$ with distribution $P_{X}^{k}$ to simulate an $n$-dimensional i.i.d. random vector $Y^{n}$ so that its distribution is approximately $Q_{Y}^{n}$. In contrast to previous works, in this paper we consider the standard Rényi divergence and two variants of all orders to measure the level of approximation. These two variants are the max-Rényi divergence $D_{\alpha}^{\mathsf{max}}(P,Q)$ and the sum-Rényi divergence $D_{\alpha}^{+}(P,Q)$. When $\alpha=\infty$, these two measures are strong because for any $\epsilon>0$, $D_{\infty}^{\mathsf{max}}(P,Q)\leq\epsilon$ or $D_{\infty}^{+}(P,Q)\leq\epsilon$ implies $e^{-\epsilon}\leq\frac{P(x)}{Q(x)}\leq e^{\epsilon}$ for all $x$. Under these Rényi divergence measures, we characterize the asymptotics of normalized divergences as well as the Rényi conversion rates. The latter is defined as the supremum of $\frac{n}{k}$ such that the Rényi divergences vanish asymptotically. In addition, when the Rényi parameter is in the interval $(0,1)$, the Rényi conversion rates equal the ratio of the Shannon entropies $\frac{H\left(P_{X}\right)}{H\left(Q_{Y}\right)}$, which is consistent with traditional results in which the total variation measure was adopted. When the Rényi parameter is in the interval $(1,\infty]$, the Rényi conversion rates are, in general, smaller than $\frac{H\left(P_{X}\right)}{H\left(Q_{Y}\right)}$. When specialized to the case in which either $P_{X}$ or $Q_{Y}$ is uniform, the simulation problem reduces to the source resolvability and intrinsic randomness problems, respectively. The preceding results are used to characterize the asymptotics of Rényi divergences and the Rényi conversion rates for these two cases.
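
To make the divergence measures concrete, below is a minimal NumPy sketch of the standard Rényi divergence of order $\alpha$ for discrete distributions with full support (natural logarithms, so values are in nats). The max- and sum-variants are written here as the maximum and the sum of the two one-sided divergences; the abstract does not spell out these definitions, so treat them as assumptions that are merely consistent with the quoted $\alpha=\infty$ property.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Standard Renyi divergence D_alpha(P || Q) for discrete distributions
    with full support, in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    if alpha == 1.0:                 # limit alpha -> 1: Kullback-Leibler divergence
        return float(np.sum(p * np.log(p / q)))
    if alpha == np.inf:              # limit alpha -> inf: log of the worst-case ratio
        return float(np.log(np.max(p / q)))
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

def renyi_max(p, q, alpha):
    """Assumed max-Renyi divergence: larger of the two one-sided divergences."""
    return max(renyi_divergence(p, q, alpha), renyi_divergence(q, p, alpha))

def renyi_sum(p, q, alpha):
    """Assumed sum-Renyi divergence: sum of the two one-sided divergences."""
    return renyi_divergence(p, q, alpha) + renyi_divergence(q, p, alpha)

# Illustration of the alpha = infinity case: if D_inf^max(P, Q) <= eps,
# then e^{-eps} <= P(x)/Q(x) <= e^{eps} for every x (up to floating-point rounding).
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.45, 0.35, 0.2])
eps = renyi_max(p, q, np.inf)
ratios = p / q
assert np.all(ratios >= np.exp(-eps) - 1e-12) and np.all(ratios <= np.exp(eps) + 1e-12)
```

As a usage note, the hypothetical example distributions above are close in ratio, so $D_{\infty}^{\mathsf{max}}(P,Q)$ is small and the pointwise ratio bound is correspondingly tight.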
