
Uniform Convergence Rates for Maximum Likelihood Estimation under Two-Component Gaussian Mixture Models (2006.00704v1)

Published 1 Jun 2020 in math.ST, stat.ML, and stat.TH

Abstract: We derive uniform convergence rates for the maximum likelihood estimator and minimax lower bounds for parameter estimation in two-component location-scale Gaussian mixture models with unequal variances. We assume the mixing proportions of the mixture are known and fixed, but make no separation assumption on the underlying mixture components. A phase transition is shown to exist in the optimal parameter estimation rate, depending on whether or not the mixture is balanced. Key to our analysis is a careful study of the dependence between the parameters of location-scale Gaussian mixture models, as captured through systems of polynomial equalities and inequalities whose solution set drives the rates we obtain. A simulation study illustrates the theoretical findings of this work.
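The setting described in the abstract is a two-component location-scale Gaussian mixture whose mixing proportion is known and held fixed while the component means and variances are estimated by maximum likelihood. As a rough illustration of that setup (not the authors' code), the sketch below simulates data from such a mixture and computes the MLE numerically; the mixing proportion, true parameter values, sample size, and choice of optimizer are illustrative assumptions.

```python
# Minimal sketch: MLE for a two-component location-scale Gaussian mixture
# with a known, fixed mixing proportion (the paper's setting). All numeric
# values below are illustrative, not taken from the paper.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

PI = 0.5  # known mixing proportion; PI = 0.5 is the "balanced" case
n = 2000  # sample size (illustrative)

# Simulate from PI * N(mu1, s1^2) + (1 - PI) * N(mu2, s2^2); the components
# are deliberately close together, since no separation is assumed.
true = dict(mu1=0.0, s1=1.0, mu2=0.5, s2=1.5)
labels = rng.random(n) < PI
x = np.where(labels,
             rng.normal(true["mu1"], true["s1"], n),
             rng.normal(true["mu2"], true["s2"], n))

def neg_log_lik(theta):
    """Negative log-likelihood in (mu1, log s1, mu2, log s2)."""
    mu1, log_s1, mu2, log_s2 = theta
    dens = (PI * norm.pdf(x, mu1, np.exp(log_s1))
            + (1.0 - PI) * norm.pdf(x, mu2, np.exp(log_s2)))
    return -np.sum(np.log(dens))

# Start near the overall sample mean and scale, then maximize the likelihood.
theta0 = np.array([x.mean(), np.log(x.std()),
                   x.mean() + 0.1, np.log(x.std())])
fit = minimize(neg_log_lik, theta0, method="Nelder-Mead")

mu1_hat, s1_hat = fit.x[0], np.exp(fit.x[1])
mu2_hat, s2_hat = fit.x[2], np.exp(fit.x[3])
print(f"MLE: mu1={mu1_hat:.3f}, s1={s1_hat:.3f}, "
      f"mu2={mu2_hat:.3f}, s2={s2_hat:.3f}")
```

Repeating this fit across many replications and sample sizes, for both balanced (PI = 0.5) and unbalanced mixtures, is one way to observe the phase transition in estimation rates that the paper's simulation study examines.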

Citations (6)


Authors (2)
