
Relative $α$-Entropy Minimizers Subject to Linear Statistical Constraints (1410.4931v1)

Published 18 Oct 2014 in cs.IT, math.IT, math.ST, and stat.TH

Abstract: We study minimization of a parametric family of relative entropies, termed relative $\alpha$-entropies (denoted $\mathscr{I}_{\alpha}(P,Q)$). These arise as redundancies under mismatched compression when cumulants of compressed lengths are considered instead of expected compressed lengths. These parametric relative entropies are a generalization of the usual relative entropy (Kullback-Leibler divergence). Just like relative entropy, these relative $\alpha$-entropies behave like squared Euclidean distance and satisfy the Pythagorean property. Minimization of $\mathscr{I}_{\alpha}(P,Q)$ over the first argument on a set of probability distributions that constitutes a linear family is studied. Such a minimization generalizes the maximum R\'{e}nyi or Tsallis entropy principle. The minimizing probability distribution (termed $\mathscr{I}_{\alpha}$-projection) for a linear family is shown to have a power-law form.
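
As an illustrative sketch (not taken from the paper; normalizations, sign conventions, and the exact role of $Q$ are as specified there): a linear family is a set of probability distributions pinned down by finitely many moment-type constraints,

$$\mathbb{L} = \Big\{ P : \sum_x P(x)\, f_i(x) = c_i, \quad i = 1, \dots, k \Big\}.$$

The abstract's claim is that the minimizer of $\mathscr{I}_{\alpha}(P,Q)$ over $P \in \mathbb{L}$ (the $\mathscr{I}_{\alpha}$-projection of $Q$ onto $\mathbb{L}$) takes a power-law form, schematically

$$P^*(x) \propto \Big( \lambda_0\, Q(x)^{\alpha-1} + \sum_{i=1}^{k} \lambda_i\, f_i(x) \Big)^{\frac{1}{\alpha-1}},$$

where the $\lambda_i$ stand in for Lagrange multipliers chosen so that $P^*$ meets the constraints. As $\alpha \to 1$, $\mathscr{I}_{\alpha}$ reduces to the Kullback-Leibler divergence, and this power-law form degenerates into the familiar exponential-family form $P^*(x) \propto Q(x)\exp\big(\sum_i \lambda_i f_i(x)\big)$ of the ordinary I-projection onto a linear family.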

Citations (4)
