Information Bottleneck on General Alphabets (1801.01050v2)
Abstract: We prove rigorously a source coding theorem that can probably be considered folklore, a generalization to arbitrary alphabets of a problem motivated by the Information Bottleneck method. For general random variables $(Y, X)$, we show essentially that for some $n \in \mathbb{N}$, a function $f$ with rate limit $\log|f| \le nR$ and $I(Y^n; f(X^n)) \ge nS$ exists if and only if there is a random variable $U$ such that the Markov chain $Y - X - U$ holds, $I(U; X) \le R$ and $I(U; Y) \ge S$. The proof relies on the well-established discrete case and showcases a technique for lifting discrete coding theorems to arbitrary alphabets.
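A compact restatement of the equivalence claimed in the abstract may be helpful; this is a sketch only, and the alphabet symbols $\mathcal{X}$, $\mathcal{Z}$, the reading of $(X^n, Y^n)$ as $n$ i.i.d. copies of $(X, Y)$, and the interpretation of $|f|$ as the cardinality of the range of $f$ are assumptions not spelled out above.

% Sketch of the two conditions the abstract asserts to be equivalent.
% Assumptions: (X^n, Y^n) are n i.i.d. copies of (X, Y); |f| denotes the
% cardinality of the range of f.
\[
  \exists\, n \in \mathbb{N},\; f \colon \mathcal{X}^n \to \mathcal{Z}
  \ \text{with}\ \log|f| \le nR
  \ \text{and}\ I\bigl(Y^n; f(X^n)\bigr) \ge nS
\]
\[
  \Longleftrightarrow\quad
  \exists\, U\ \text{with}\ Y - X - U,\quad I(U; X) \le R,\quad I(U; Y) \ge S.
\]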