Achievable Refined Asymptotics for Successive Refinement Using Gaussian Codebooks

(arXiv:2208.03926)
Published Aug 8, 2022 in cs.IT and math.IT

Abstract

We study the mismatched successive refinement problem, in which Gaussian codebooks are used to compress an arbitrary memoryless source with successive minimum Euclidean distance encoding under the quadratic distortion measure. Specifically, we derive achievable refined asymptotics under both the joint excess-distortion probability (JEP) and the separate excess-distortion probabilities (SEP) criteria. For second-order and moderate deviations asymptotics, we consider two types of codebooks: the spherical codebook, where each codeword is drawn independently and uniformly from the surface of a sphere, and the i.i.d. Gaussian codebook, where each component of each codeword is drawn independently from a Gaussian distribution. We establish the achievable second-order rate region under JEP, and we show that under SEP any memoryless source satisfying mild moment conditions is strongly successively refinable. When specialized to a Gaussian memoryless source (GMS), our results provide an alternative achievability proof with a specific code design. We further show that the same moderate deviations constant is achievable under both JEP and SEP. For large deviations asymptotics, we consider only the i.i.d. Gaussian codebook, since it outperforms the spherical codebook in this regime for the one-layer mismatched rate-distortion problem (Zhou, Tan, Motani, TIT, 2019). We derive achievable exponents for both JEP and SEP and specialize the results to a GMS, which appears to be a novel result of independent interest.
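
To make the abstract's setup concrete, the following minimal Python/NumPy sketch (not taken from the paper) illustrates the two codebook ensembles and a two-layer successive minimum Euclidean distance encoder, where layer 2 refines the layer-1 quantization error. The blocklength, rates, target distortions, codebook radii, and the residual-refinement structure are illustrative assumptions rather than the paper's exact code design; the last line is the excess-distortion event whose probability the JEP criterion controls, while SEP bounds the two layers' excess-distortion probabilities separately.

import numpy as np

rng = np.random.default_rng(0)

def spherical_codebook(m, n, radius):
    # Each codeword is drawn independently and uniformly from the surface
    # of an n-dimensional sphere of the given radius.
    g = rng.standard_normal((m, n))
    return radius * g / np.linalg.norm(g, axis=1, keepdims=True)

def iid_gaussian_codebook(m, n, var):
    # Each component of each codeword is drawn i.i.d. from N(0, var).
    return np.sqrt(var) * rng.standard_normal((m, n))

def nearest_codeword(x, codebook):
    # Minimum Euclidean distance encoding under quadratic distortion.
    idx = int(np.argmin(np.sum((codebook - x) ** 2, axis=1)))
    return idx, codebook[idx]

# Illustrative (assumed) parameters: blocklength, per-layer rates in nats,
# source variance, and target distortions D1 > D2.
n, R1, R2 = 16, 0.25, 0.5
M1, M2 = int(np.exp(n * R1)), int(np.exp(n * (R2 - R1)))
sigma2, D1, D2 = 1.0, 0.5, 0.25

x = np.sqrt(sigma2) * rng.standard_normal(n)   # one source realization

# Layer 1 gives a coarse description; layer 2 refines the layer-1 error.
# The radii below are a natural choice for Gaussian codebooks, not the paper's.
C1 = spherical_codebook(M1, n, radius=np.sqrt(n * (sigma2 - D1)))
C2 = spherical_codebook(M2, n, radius=np.sqrt(n * (D1 - D2)))
i1, y1 = nearest_codeword(x, C1)
i2, y2 = nearest_codeword(x - y1, C2)

d1 = np.mean((x - y1) ** 2)              # layer-1 distortion
d2 = np.mean((x - y1 - y2) ** 2)         # layer-2 distortion
excess_joint = (d1 > D1) or (d2 > D2)    # event measured by the JEP criterion

Swapping spherical_codebook for iid_gaussian_codebook (with variance in place of radius) yields the second ensemble discussed in the abstract.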
