Projection Theorems of Divergences and Likelihood Maximization Methods (1705.09898v2)

Published 28 May 2017 in cs.IT, math.IT, math.PR, math.ST, and stat.TH

Abstract: Projection theorems of divergences enable us to find the reverse projection of a divergence on a specific statistical model as a forward projection of the divergence on a different but "simpler" statistical model, which, in turn, reduces to solving a system of linear equations. Reverse projections of divergences are closely related to various estimation methods, such as maximum likelihood estimation or its variants in robust statistics. We consider projection theorems of three parametric families of divergences that are widely used in robust statistics, namely the Rényi divergences (or the Cressie-Read power divergences), the density power divergences, and the relative $\alpha$-entropy (or the logarithmic density power divergences). We explore these projection theorems from the usual likelihood maximization approach and from the principle of sufficiency. In particular, we show the equivalence of solving the estimation problems via the projection theorems of the respective divergences and via directly solving the corresponding estimating equations. We also derive the projection theorem for the density power divergences.
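To make the connection between divergence minimization and robust estimation concrete, here is a minimal sketch (not from the paper) of the minimum density power divergence estimator of Basu et al., one of the divergence families the abstract names. The Gaussian location model, the tuning value $\alpha = 0.5$, and the grid search are illustrative assumptions; the paper itself works with projection theorems rather than numerical optimization.

```python
import numpy as np

rng = np.random.default_rng(0)
# Clean N(0, 1) sample contaminated by a cluster of outliers at 10.
x = np.concatenate([rng.normal(0.0, 1.0, 100), np.full(10, 10.0)])

def dpd_objective(theta, x, alpha):
    """Empirical density power divergence objective for the N(theta, 1)
    model (terms not depending on theta are dropped).  alpha > 0 tunes
    the robustness/efficiency trade-off; alpha -> 0 recovers the MLE."""
    f = np.exp(-0.5 * (x - theta) ** 2) / np.sqrt(2 * np.pi)
    # Closed form of the integral of f_theta^(1+alpha) for a unit-variance
    # Gaussian: (1+alpha)^(-1/2) * (2*pi)^(-alpha/2).
    integral = 1.0 / (np.sqrt(1.0 + alpha) * (2.0 * np.pi) ** (alpha / 2))
    return integral - (1.0 + alpha) / alpha * np.mean(f ** alpha)

# Grid search over candidate location parameters (illustrative only).
grid = np.linspace(-2.0, 2.0, 801)
vals = [dpd_objective(t, x, alpha=0.5) for t in grid]
theta_dpd = grid[int(np.argmin(vals))]
theta_mle = x.mean()  # the MLE (sample mean) is not robust to outliers
print(theta_dpd, theta_mle)
```

The contaminated sample mean is pulled well away from the true location 0, while the minimum-DPD estimate stays close to it, since observations far from the bulk contribute only through the small factor $f_\theta(x_i)^\alpha$.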

Authors (2)
  1. Atin Gayen (4 papers)
  2. M. Ashok Kumar (15 papers)
