A Connection between Good Rate-distortion Codes and Backward DMCs (1307.7770v1)
Abstract: Let $X^n\in\mathcal{X}^n$ be a sequence drawn from a discrete memoryless source, and let $Y^n\in\mathcal{Y}^n$ be the corresponding reconstruction sequence output by a good rate-distortion code. This paper establishes a property of the joint distribution of $(X^n,Y^n)$. It is shown that for $D>0$, the input-output statistics of an $R(D)$-achieving rate-distortion code converge (in normalized relative entropy) to the output-input statistics of a discrete memoryless channel (DMC). The DMC is "backward" in that it is a channel from the reconstruction space $\mathcal{Y}^n$ to the source space $\mathcal{X}^n$. It is also shown that the property does not necessarily hold when normalized relative entropy is replaced by variational distance.
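For concreteness, here is a sketch of how the convergence statement can be written; the notation ($Q_{X|Y}$ for the backward test channel, $P_{X^n,Y^n}$ for the code's induced joint statistics) is assumed for illustration rather than taken from the paper. Let $Q_{X,Y}$ be an $R(D)$-achieving joint distribution, i.e., one attaining the minimum in $R(D) = \min_{\mathbb{E}[d(X,Y)] \le D} I(X;Y)$ with $X$-marginal equal to the source distribution, and let $Q_{X|Y}$ denote the corresponding backward channel from $\mathcal{Y}$ to $\mathcal{X}$. If $P_{X^n,Y^n}$ denotes the joint distribution of source and reconstruction induced by a sequence of codes operating at rate $R(D)$ and distortion $D$, the property described above takes the form
$$\frac{1}{n}\, D\!\left(P_{X^n,Y^n} \,\middle\|\, P_{Y^n} \prod_{i=1}^{n} Q_{X|Y}\right) \longrightarrow 0 \quad \text{as } n \to \infty,$$
that is, conditioned on the reconstruction $Y^n$, the source $X^n$ is asymptotically distributed (in normalized relative entropy) as the output of the memoryless backward channel $Q_{X|Y}$ applied to $Y^n$.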