Lossy Compression with Universal Distortion

arXiv:2110.07022
Published Oct 13, 2021 in cs.IT and math.IT

Abstract

We consider a novel variant of $d$-semifaithful lossy coding in which the distortion measure is revealed only to the encoder and only at run-time, as well as an extension in which the distortion constraint $d$ is also revealed at run-time. Two forms of rate redundancy are used to analyze the performance, and three coding schemes are given whose achievability results are of both a pointwise and minimax nature. The first scheme uses ideas from VC dimension and growth functions, the second uses an appropriate quantization of the space of distortion measures, and the third relies on a random coding argument.
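To make the setting concrete, here is a minimal toy sketch of a $d$-semifaithful code in which the distortion measure is supplied only at run-time. It is not any of the paper's actual schemes: it assumes a binary alphabet, uses a fixed exhaustive enumeration of reproduction sequences as the codebook, and the hypothetical functions `encode`/`decode` and the measure `rho` are illustrative names only. The key point it illustrates is that the decoder never needs to know `rho`.

```python
import itertools

def encode(x, d, rho, n):
    """Return the index of the first reproduction sequence (in a fixed
    enumeration of all binary n-tuples) within average distortion d of x
    under the run-time distortion measure rho."""
    for idx, y in enumerate(itertools.product((0, 1), repeat=n)):
        if sum(rho(a, b) for a, b in zip(x, y)) / n <= d:
            return idx
    return None  # unreachable when rho(a, a) == 0, since x itself is enumerated

def decode(idx, n):
    """Recover the reproduction sequence from its index alone;
    no knowledge of the distortion measure is required."""
    return list(itertools.product((0, 1), repeat=n))[idx]

# Distortion measure revealed only at run-time (here: Hamming distortion).
rho = lambda a, b: 0 if a == b else 1
n, d = 8, 0.25
x = [1, 0, 1, 1, 0, 0, 1, 0]

idx = encode(x, d, rho, n)
y = decode(idx, n)
assert sum(rho(a, b) for a, b in zip(x, y)) / n <= d  # d-semifaithful
```

The enumeration order serves as a crude stand-in for an ordered codebook; the paper's schemes instead control the rate redundancy of the index description, which this exhaustive sketch makes no attempt to do.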

