Compression in the Space of Permutations (1406.7435v3)

Published 28 Jun 2014 in cs.IT and math.IT

Abstract: We investigate lossy compression (source coding) of data in the form of permutations. This problem has direct applications in the storage of ordinal data or rankings, and in the analysis of sorting algorithms. We analyze the rate-distortion characteristic for the permutation space under the uniform distribution, and the minimum achievable rate of compression that allows a bounded distortion after recovery. Our analysis is with respect to different practical and useful distortion measures, including Kendall-tau distance, Spearman's footrule, Chebyshev distance and inversion-$\ell_1$ distance. We establish equivalence of source code designs under certain distortions and show simple explicit code designs that incur low encoding/decoding complexities and are asymptotically optimal. Finally, we show that for the Mallows model, a popular nonuniform ranking model on the permutation space, both the entropy and the maximum distortion at zero rate are much lower than the uniform counterparts, which motivates the future design of efficient compression schemes for this model.
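The following is a minimal sketch, not taken from the paper, illustrating three of the distortion measures named in the abstract for permutations written in 0-indexed one-line notation. The function names and conventions are illustrative assumptions, and the inversion-$\ell_1$ distance is omitted.

```python
# Sketch of distortion measures on permutations (0-indexed one-line notation).
# These follow common textbook definitions and are assumptions, not the paper's code.

def kendall_tau(sigma, pi):
    """Kendall-tau distance: number of pairs ordered differently in sigma and pi
    (equivalently, adjacent transpositions needed to turn one into the other)."""
    n = len(sigma)
    pos = {v: i for i, v in enumerate(pi)}  # position of each value in pi
    # O(n^2) pair count; adequate for a sketch.
    return sum(
        1
        for i in range(n)
        for j in range(i + 1, n)
        if pos[sigma[i]] > pos[sigma[j]]
    )

def spearman_footrule(sigma, pi):
    """Spearman's footrule: ell_1 distance between one-line notations,
    sum_i |sigma(i) - pi(i)|."""
    return sum(abs(s - p) for s, p in zip(sigma, pi))

def chebyshev(sigma, pi):
    """Chebyshev distance: ell_infinity distance between one-line notations,
    max_i |sigma(i) - pi(i)|."""
    return max(abs(s - p) for s, p in zip(sigma, pi))

if __name__ == "__main__":
    sigma = (0, 1, 2, 3)   # identity permutation
    pi = (1, 0, 3, 2)      # two adjacent swaps
    print(kendall_tau(sigma, pi))        # 2
    print(spearman_footrule(sigma, pi))  # 4
    print(chebyshev(sigma, pi))          # 1
```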

Citations (21)
