Output Constrained Lossy Source Coding with Limited Common Randomness

(arXiv:1411.5767)
Published Nov 21, 2014 in cs.IT and math.IT

Abstract

This paper studies a Shannon-theoretic version of the generalized distribution-preserving quantization problem where a stationary and memoryless source is encoded subject to a distortion constraint and the additional requirement that the reproduction also be stationary and memoryless with a given distribution. The encoder and decoder are stochastic and assumed to have access to independent common randomness. Recent work has characterized the minimum achievable coding rate at a given distortion level when unlimited common randomness is available. Here we consider the general case where the available common randomness may be rate limited. Our main result completely characterizes the set of achievable coding and common randomness rate pairs at any distortion level, thereby providing the optimal tradeoff between these two rate quantities. We also consider two variations of this problem where we investigate the effect of relaxing the strict output distribution constraint and the role of 'private randomness' used by the decoder on the rate region. Our results have strong connections with Cuff's recent work on distributed channel synthesis. In particular, our achievability proof combines a coupling argument with the approach developed by Cuff, where instead of explicitly constructing the encoder-decoder pair, a joint distribution is constructed from which a desired encoder-decoder pair is established. We show, however, that for our problem, the separated solution of first finding an optimal channel and then synthesizing this channel results in a suboptimal rate region.
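
For readers who want the setting in symbols, below is a minimal formalization of the problem described in the abstract, together with Cuff's channel synthesis region for comparison (the "separated" approach mentioned above amounts to first fixing a channel and then synthesizing it). The notation here (P_X, Q_Y, R, R_c, D, and the auxiliary variable U) is introduced for illustration and need not match the paper's; consult the paper itself for the exact statement of its rate region.

% Sketch of the setup (notation ours, not necessarily the paper's).
% Source: X_1, X_2, ... i.i.d. with law P_X.
% Common randomness: an index K uniform on {1, ..., 2^{n R_c}}, independent
%   of X^n, shared by encoder and decoder; both may also use private randomness.
% Encoder: stochastically maps (X^n, K) to a message M in {1, ..., 2^{n R}}.
% Decoder: stochastically maps (M, K) to a reproduction Y^n.
% Requirements at distortion level D:
\[
  \frac{1}{n}\sum_{i=1}^{n}\mathbb{E}\bigl[d(X_i,Y_i)\bigr] \le D,
  \qquad
  Y^n \ \text{i.i.d. with } Y_i \sim Q_Y
\]
% (the output-distribution constraint is exact here; one of the paper's
% variations relaxes it). The object of interest is the set of achievable
% pairs (R, R_c) at distortion D.
%
% For comparison, Cuff's distributed channel synthesis region for a fixed
% target channel P_{Y|X}, with common randomness rate R_0, is
\[
  \Bigl\{(R,R_0):\ \exists\, P_{U|X},\,P_{Y|U}\ \text{such that}\ X - U - Y,\
  \text{the induced channel from } X \text{ to } Y \text{ equals } P_{Y|X},\
  R \ge I(X;U),\ R + R_0 \ge I(X,Y;U)\Bigr\}.
\]
% Per the abstract, first choosing an optimal channel P_{Y|X} (subject to
% Y ~ Q_Y and E[d(X,Y)] <= D) and then synthesizing it via this region is,
% in general, suboptimal for the output-constrained problem.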
