Sharp variance-entropy comparison for nonnegative Gaussian quadratic forms (2005.11705v4)
Abstract: In this article we study weighted sums of $n$ i.i.d. Gamma($\alpha$) random variables with nonnegative weights. We show that for $n \geq 1/\alpha$ the sum with equal coefficients maximizes differential entropy when the variance is fixed. As a consequence, we prove that among nonnegative quadratic forms in $n$ independent standard Gaussian random variables, a diagonal form with equal coefficients maximizes differential entropy under a fixed variance. This provides a sharp lower bound for the relative entropy between a nonnegative quadratic form and a Gaussian random variable. Bounds on capacities of transmission channels subject to $n$ independent additive gamma noises are also derived.
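One way to state the main claim in formulas (a sketch based only on the abstract, assuming the parametrization in which each $X_i \sim$ Gamma($\alpha$) has unit rate, so $\mathrm{Var}(X_i)=\alpha$): for i.i.d. $X_1,\dots,X_n$ and nonnegative weights $a_1,\dots,a_n$, matching the variance of the equal-coefficient sum gives
$$
h\!\left(\sum_{i=1}^n a_i X_i\right) \;\le\; h\!\left(\sqrt{\tfrac{1}{n}\sum_{i=1}^n a_i^2}\,\sum_{i=1}^n X_i\right), \qquad n \ge \frac{1}{\alpha},
$$
since both sides then have variance $\alpha\sum_i a_i^2$. The quadratic-form corollary follows from the case $\alpha = 1/2$: writing a nonnegative form in standard Gaussians as $Q=\sum_i \lambda_i Z_i^2$ with $\lambda_i \ge 0$, each $Z_i^2$ is (up to scale) a Gamma($1/2$) variable, so for $n \ge 2$ equal eigenvalues maximize $h(Q)$ at fixed variance.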