
Supervised Linear Dimension-Reduction Methods: Review, Extensions, and Comparisons (2109.04244v1)

Published 9 Sep 2021 in stat.ML and cs.LG

Abstract: Principal component analysis (PCA) is a well-known linear dimension-reduction method that has been widely used in data analysis and modeling. It is an unsupervised learning technique that identifies a suitable linear subspace for the input variable that contains maximal variation and preserves as much information as possible. PCA has also been used in prediction models where the original, high-dimensional space of predictors is reduced to a smaller, more manageable set before conducting regression analysis. However, this approach does not incorporate information in the response during the dimension-reduction stage and hence can have poor predictive performance. To address this concern, several supervised linear dimension-reduction techniques have been proposed in the literature. This paper reviews selected techniques, extends some of them, and compares their performance through simulations. Two of these techniques, partial least squares (PLS) and least-squares PCA (LSPCA), consistently outperform the others in this study.
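The abstract's central point, that unsupervised PCA can discard the directions most relevant for prediction, is easy to demonstrate. Below is a minimal sketch (not taken from the paper) contrasting principal component regression with supervised PLS on synthetic data where the predictive direction has low variance; the data dimensions, noise level, and use of scikit-learn are illustrative assumptions.

```python
# Sketch: PCA regression (PCR) vs. partial least squares (PLS) when the
# direction predictive of y has LOW variance in X. Assumptions: synthetic
# data, scikit-learn estimators; not the paper's simulation design.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n, p = 500, 20

# Columns of X have decaying variances; y depends only on the
# lowest-variance column, the case where unsupervised PCA fails.
X = rng.normal(size=(n, p)) * np.linspace(5.0, 0.5, p)
beta = np.zeros(p)
beta[-1] = 4.0
y = X @ beta + rng.normal(scale=0.5, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
k = 3  # components retained by both methods

# PCR: reduce X without looking at y, then regress on the components.
pcr = make_pipeline(PCA(n_components=k), LinearRegression()).fit(X_tr, y_tr)

# PLS: components chosen to maximize covariance with the response.
pls = PLSRegression(n_components=k).fit(X_tr, y_tr)

print("PCR test R^2:", r2_score(y_te, pcr.predict(X_te)))
print("PLS test R^2:", r2_score(y_te, pls.predict(X_te).ravel()))
# PLS typically recovers the predictive direction here, while the first
# k principal components miss it and PCR's test R^2 stays near zero.
```

LSPCA, the other strong performer in the paper's comparisons, similarly augments the PCA objective with a least-squares fit to the response rather than choosing components by variance alone.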

Authors (5)
  1. Shaojie Xu (6 papers)
  2. Joel Vaughan (15 papers)
  3. Jie Chen (602 papers)
  4. Agus Sudjianto (34 papers)
  5. Vijayan Nair (2 papers)
Citations (2)
