
Symmetry of Information: A Closer Look (1206.5184v1)

Published 22 Jun 2012 in cs.IT and math.IT

Abstract: Symmetry of information establishes a relation between the information that x has about y (denoted I(x : y)) and the information that y has about x (denoted I(y : x)). In classical information theory, the two are exactly equal, but in algorithmic information theory there is a small excess quantity of information that differentiates the two terms, caused by the necessity of packaging information in a way that makes it accessible to algorithms. It was shown in [Zim11] that in the case of strings with simple complexity (that is, the Kolmogorov complexity of their Kolmogorov complexity is small), the relevant information can be packed in a very economical way, which leads to a tighter relation between I(x : y) and I(y : x) than the one provided in the classical symmetry-of-information theorem of Kolmogorov and Levin. We give here a simpler proof of this result, using a suggestion of Alexander Shen. This result implies a van Lambalgen-type theorem for finite strings and plain complexity: if x is c-random and y is c-random relative to x, then xy is O(c)-random. We show that a similar result holds for prefix-free complexity and weak-K-randomness.
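For reference, the quantities in the abstract are usually written in terms of plain Kolmogorov complexity C as below; these are the standard definitions and the classical Kolmogorov-Levin bound, not notation fixed by this paper.

\[
  I(x:y) = C(y) - C(y \mid x), \qquad I(y:x) = C(x) - C(x \mid y)
\]
% Kolmogorov-Levin symmetry of information (the classical bound the abstract compares against):
\[
  C(x,y) = C(x) + C(y \mid x) + O(\log C(x,y)),
  \quad\text{hence}\quad
  \lvert I(x:y) - I(y:x) \rvert = O(\log C(x,y)).
\]

The paper's point is that for strings of simple complexity (C(C(x)) and C(C(y)) small), the additive O(log C(x,y)) excess above can be replaced by a tighter term.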

Citations (1)
