Theoretical Guarantees for the Subspace-Constrained Tyler's Estimator (2403.18658v2)

Published 27 Mar 2024 in math.ST, stat.ML, and stat.TH

Abstract: This work analyzes the subspace-constrained Tyler's estimator (STE), designed for recovering a low-dimensional subspace within a dataset that may be highly corrupted with outliers. It assumes a weak inlier-outlier model and allows the fraction of inliers to be smaller than the fraction below which the robust subspace recovery problem becomes computationally hard. It shows that in this setting, if the initialization of STE, which is an iterative algorithm, satisfies a certain condition, then STE can effectively recover the underlying subspace. It further shows that, under the generalized haystack model, STE initialized by Tyler's M-estimator (TME) can recover the subspace when the fraction of inliers is too small for TME to handle.
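The abstract describes STE as an iterative, Tyler-type fixed-point scheme whose output depends on a suitable initialization (e.g., by TME). The snippet below is a minimal illustrative sketch, not the authors' exact algorithm: it pairs the classical TME fixed-point update (Tyler, 1987) with a hypothetical spectral-shrinkage step that biases the scatter estimate toward its top-d eigenspace. The function names, the `gamma` shrinkage parameter, and the shrinkage rule are assumptions made for illustration only; the precise STE update, its initialization condition, and the inlier-fraction guarantees are given in the paper.

```python
import numpy as np

def tyler_m_estimator(X, n_iter=100, tol=1e-8):
    """Classical Tyler's M-estimator (TME) of scatter.
    X: (n, D) data matrix with observations as rows (assumed centered)."""
    n, D = X.shape
    Sigma = np.eye(D)
    for _ in range(n_iter):
        inv_Sigma = np.linalg.inv(Sigma)
        # Weights 1 / (x_i^T Sigma^{-1} x_i)
        w = 1.0 / np.einsum('ij,jk,ik->i', X, inv_Sigma, X)
        Sigma_new = (D / n) * (X * w[:, None]).T @ X
        Sigma_new /= np.trace(Sigma_new)   # fix the scale (TME is scale-free)
        if np.linalg.norm(Sigma_new - Sigma) < tol:
            return Sigma_new
        Sigma = Sigma_new
    return Sigma

def ste_sketch(X, d, gamma=0.5, n_iter=100, eps=1e-12):
    """Hypothetical sketch of a subspace-constrained Tyler-type iteration:
    each Tyler reweighting step is followed by shrinking the trailing
    eigenvalues of the scatter estimate, pulling it toward a d-dimensional
    leading eigenspace. The exact STE update rule is specified in the paper."""
    n, D = X.shape
    Sigma = tyler_m_estimator(X)            # TME initialization, as in the abstract
    for _ in range(n_iter):
        inv_Sigma = np.linalg.pinv(Sigma)
        w = 1.0 / np.maximum(np.einsum('ij,jk,ik->i', X, inv_Sigma, X), eps)
        S = (X * w[:, None]).T @ X
        # Spectral shrinkage toward the leading d-dimensional eigenspace
        vals, vecs = np.linalg.eigh(S)      # eigenvalues in ascending order
        vals[:-d] *= gamma                  # damp the D-d trailing eigenvalues
        Sigma = (vecs * vals) @ vecs.T
        Sigma /= np.trace(Sigma)
    # Recovered subspace: span of the top-d eigenvectors of the final scatter
    return np.linalg.eigh(Sigma)[1][:, -d:]
```

Under a haystack-type model (inliers near a d-dimensional subspace plus ambient-dimensional outliers), the basis returned by such a sketch can be compared to the true subspace via principal angles; the paper's contribution is to prove when this kind of iteration, properly initialized, provably converges to the underlying subspace.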
