A Class of Nonbinary Symmetric Information Bottleneck Problems (2110.00985v1)
Abstract: We study two dual settings of information processing. Let $ \mathsf{Y} \rightarrow \mathsf{X} \rightarrow \mathsf{W} $ be a Markov chain with a fixed joint probability mass function $ \mathsf{P}_{\mathsf{X}\mathsf{Y}} $ and a mutual information constraint on the pair $ (\mathsf{W},\mathsf{X}) $. In the first problem, known as the Information Bottleneck, we aim to maximize the mutual information between the random variables $ \mathsf{Y} $ and $ \mathsf{W} $, while in the second problem, termed the Privacy Funnel, our goal is to minimize it. In particular, we analyze the scenario in which $ \mathsf{X} $ is the input and $ \mathsf{Y} $ is the output of a modulo-additive noise channel. We provide an analytical characterization of the optimal information rates and of the distributions that achieve them.
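The abstract fixes the setup but not the numerics, so the following is a minimal sketch, not the paper's analytical solution: it builds a modulo-additive noise channel $ \mathsf{Y} = \mathsf{X} + \mathsf{N} \bmod q $, picks an arbitrary test channel $ \mathsf{P}_{\mathsf{W}|\mathsf{X}} $, and evaluates the two quantities the Information Bottleneck and Privacy Funnel objectives trade off, $ I(\mathsf{X};\mathsf{W}) $ and $ I(\mathsf{Y};\mathsf{W}) $. The alphabet size, input law, noise pmf, and test channel below are all illustrative assumptions.

```python
import numpy as np

def mutual_information(p_joint):
    """I(A;B) in nats for a joint pmf given as a 2-D array."""
    p_a = p_joint.sum(axis=1, keepdims=True)
    p_b = p_joint.sum(axis=0, keepdims=True)
    mask = p_joint > 0
    return float(np.sum(p_joint[mask] * np.log(p_joint[mask] / (p_a @ p_b)[mask])))

q = 4                                   # alphabet size Z_q (assumed)
p_x = np.full(q, 1.0 / q)               # uniform input distribution (assumed)
p_n = np.array([0.7, 0.1, 0.1, 0.1])    # noise pmf (assumed)

# Channel P(y|x) for Y = X + N (mod q): row x is the noise pmf cyclically shifted by x.
p_y_given_x = np.array([np.roll(p_n, x) for x in range(q)])

# An arbitrary test channel P(w|x) (assumed); any row-stochastic matrix works here.
rng = np.random.default_rng(0)
p_w_given_x = rng.dirichlet(np.ones(q), size=q)

# Joint pmfs induced by the Markov chain Y - X - W.
p_xw = p_x[:, None] * p_w_given_x       # P(x, w)
p_yw = p_y_given_x.T @ p_xw             # P(y, w) = sum_x P(y|x) P(x, w)

print("I(X;W) =", mutual_information(p_xw))   # the constrained quantity
print("I(Y;W) =", mutual_information(p_yw))   # maximized (IB) or minimized (PF)
```

By the data-processing inequality the printed values satisfy $ I(\mathsf{Y};\mathsf{W}) \le I(\mathsf{X};\mathsf{W}) $; sweeping over test channels under a cap on $ I(\mathsf{X};\mathsf{W}) $ recovers the two optimization problems numerically.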