
Faster feature selection with a Dropping Forward-Backward algorithm (1910.08007v3)

Published 17 Oct 2019 in stat.ML and cs.LG

Abstract: In this era of big data, feature selection techniques, which have long been proven to simplify models, make them more comprehensible, and speed up learning, have become increasingly important. Among the many methods developed, forward and stepwise feature selection regression remain widely used due to their simplicity and efficiency. However, they all involve rescanning all of the unselected features again and again. Moreover, the backward steps in stepwise selection often prove unnecessary, as we illustrate in our example. These observations motivate us to introduce a novel algorithm that can boost speed by up to 65.77% compared to the stepwise procedure while maintaining good performance in terms of the number of selected features and error rates. Our experiments also illustrate that feature selection procedures may be a better choice for high-dimensional problems where the number of features greatly exceeds the number of samples.
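
To make the cost the abstract refers to concrete, below is a minimal sketch of classical forward stepwise selection for linear regression. It is only an illustration of the repeated rescanning of unselected features; it is not the paper's Dropping Forward-Backward algorithm, and the BIC stopping rule, function name, and synthetic data are illustrative assumptions.

```python
# Illustrative forward stepwise selection (NOT the paper's algorithm).
import numpy as np
from sklearn.linear_model import LinearRegression


def forward_stepwise(X, y, max_features=10):
    n, p = X.shape
    selected, remaining = [], list(range(p))
    best_score = np.inf
    while remaining and len(selected) < max_features:
        scores = []
        # Each pass rescans every unselected feature -- the repeated work
        # the abstract says stepwise procedures incur.
        for j in remaining:
            cols = selected + [j]
            model = LinearRegression().fit(X[:, cols], y)
            rss = np.sum((y - model.predict(X[:, cols])) ** 2)
            # BIC-style criterion (an assumed, illustrative stopping rule).
            bic = n * np.log(rss / n) + (len(cols) + 1) * np.log(n)
            scores.append((bic, j))
        score, j_best = min(scores)
        if score >= best_score:
            break  # no candidate improves the criterion; stop
        best_score = score
        selected.append(j_best)
        remaining.remove(j_best)
    return selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 50))  # n = 100 samples, p = 50 features
    y = X[:, :3] @ np.array([2.0, -1.5, 1.0]) + rng.normal(scale=0.5, size=100)
    print(forward_stepwise(X, y))  # should recover (roughly) features 0, 1, 2
```

The inner loop refits a model for every remaining feature at every step; the paper's proposed dropping strategy is aimed at avoiding this full rescan while keeping comparable selection quality.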

Citations (3)
