Information theory with finite vector spaces

(1807.05152)
Published Jul 13, 2018 in math-ph, cs.IT, math.IT, math.MP, and math.PR

Abstract

Whereas Shannon entropy is related to the growth rate of multinomial coefficients, we show that the quadratic entropy (Tsallis 2-entropy) is connected to their $q$-deformation; when $q$ is a prime power, these $q$-multinomial coefficients count flags of finite vector spaces with prescribed length and dimensions. In particular, the $q$-binomial coefficients count vector subspaces of given dimension. In this way we obtain a combinatorial explanation for the nonadditivity of the quadratic entropy, which arises from a recursive counting of flags. We show that statistical systems whose configurations are described by flags provide a frequentist justification for the maximum entropy principle with Tsallis statistics. We then introduce a discrete-time stochastic process associated with the $q$-binomial probability distribution, which generates at time $n$ a vector subspace of $\mathbb{F}_q^n$ (here $\mathbb{F}_q$ is the finite field of order $q$). The concentration of measure on certain "typical subspaces" allows us to extend the asymptotic equipartition property to this setting. The size of the typical set is quantified by the quadratic entropy. We discuss applications to Shannon theory, particularly to source coding, when messages correspond to vector spaces.
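For context, here is a brief sketch of the standard definitions the abstract refers to, in common notation (the paper's own conventions and derivations may differ; we write $\alpha$ for the Tsallis entropic index to avoid a clash with the field order $q$). For a prime power $q$, the $q$-binomial (Gaussian) coefficient

$$\binom{n}{k}_q \;=\; \frac{(q^{n}-1)(q^{n-1}-1)\cdots(q^{n-k+1}-1)}{(q^{k}-1)(q^{k-1}-1)\cdots(q-1)}$$

counts the $k$-dimensional subspaces of $\mathbb{F}_q^n$ and satisfies the $q$-Pascal recursion $\binom{n}{k}_q = q^{k}\binom{n-1}{k}_q + \binom{n-1}{k-1}_q$, the kind of recursive subspace/flag count the abstract alludes to. The quadratic (Tsallis 2-) entropy of a distribution $p=(p_1,\dots,p_m)$ is

$$S_2(p) \;=\; 1 - \sum_{i=1}^{m} p_i^2,$$

the $\alpha = 2$ case of the Tsallis entropy $S_\alpha(p) = \tfrac{1}{\alpha-1}\bigl(1 - \sum_i p_i^{\alpha}\bigr)$. Unlike Shannon entropy it is nonadditive: for independent distributions, $S_2(p \otimes r) = S_2(p) + S_2(r) - S_2(p)\,S_2(r)$.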
