Compression in the Space of Permutations

(arXiv:1406.7435)
Published Jun 28, 2014 in cs.IT and math.IT

Abstract

We investigate lossy compression (source coding) of data in the form of permutations. This problem has direct applications in the storage of ordinal data or rankings, and in the analysis of sorting algorithms. We analyze the rate-distortion characteristic of the permutation space under the uniform distribution, i.e., the minimum achievable rate of compression that allows a bounded distortion after recovery. Our analysis covers several practical and useful distortion measures, including the Kendall tau distance, Spearman's footrule, the Chebyshev distance, and the inversion-$\ell_1$ distance. We establish the equivalence of source code designs under certain distortions and exhibit simple explicit code designs that incur low encoding/decoding complexity and are asymptotically optimal. Finally, we show that for the Mallows model, a popular nonuniform ranking model on the permutation space, both the entropy and the maximum distortion at zero rate are much lower than their uniform counterparts, which motivates the future design of efficient compression schemes for this model.
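To make the four distortion measures named above concrete, here is a minimal Python sketch (not the paper's code) that computes each distance between two permutations given as 0-indexed sequences. The inversion vector uses a Lehmer-code-style convention, which is an assumption on my part; the paper's exact definition may differ.

```python
from itertools import combinations

def kendall_tau(sigma, pi):
    """Kendall tau distance: number of item pairs that the two
    permutations order differently (inputs are 0-indexed sequences)."""
    return sum(
        1
        for i, j in combinations(range(len(sigma)), 2)
        if (sigma[i] < sigma[j]) != (pi[i] < pi[j])
    )

def spearman_footrule(sigma, pi):
    """Spearman's footrule: l1 distance between the permutation vectors."""
    return sum(abs(s - p) for s, p in zip(sigma, pi))

def chebyshev(sigma, pi):
    """Chebyshev distance: l-infinity distance between the permutation vectors."""
    return max(abs(s - p) for s, p in zip(sigma, pi))

def inversion_vector(pi):
    """Lehmer-code-style inversion vector: entry i counts later entries
    smaller than pi[i]; its sum is the number of inversions of pi.
    (An assumed convention; the paper's exact definition may differ.)"""
    return [sum(p < pi[i] for p in pi[i + 1:]) for i in range(len(pi))]

def inversion_l1(sigma, pi):
    """Inversion-l1 distance: l1 distance between the inversion vectors."""
    return sum(abs(a - b)
               for a, b in zip(inversion_vector(sigma), inversion_vector(pi)))

if __name__ == "__main__":
    # Hypothetical example permutations of {0, 1, 2, 3}.
    sigma, pi = [2, 0, 3, 1], [0, 2, 1, 3]
    print(kendall_tau(sigma, pi))        # 4
    print(spearman_footrule(sigma, pi))  # 8
    print(chebyshev(sigma, pi))          # 2
    print(inversion_l1(sigma, pi))       # 4
```

The example output is consistent with the classical Diaconis-Graham inequality $d_K \le d_F \le 2 d_K$ relating the Kendall tau distance and Spearman's footrule, which is one reason results for these two distortion measures are closely linked.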
