Faster feature selection with a Dropping Forward-Backward algorithm

(1910.08007)
Published Oct 17, 2019 in stat.ML and cs.LG

Abstract

In this era of big data, feature selection techniques, which have long been proven to simplify models, make them more comprehensible, and speed up learning, have become increasingly important. Among the many methods developed, forward and stepwise feature selection regression remain widely used for their simplicity and efficiency. However, both involve rescanning all the unselected features again and again. Moreover, the backward steps in stepwise selection are often unnecessary, as we illustrate in our example. These observations motivate us to introduce a novel algorithm that can boost speed by up to 65.77% compared to the stepwise procedure while maintaining good performance in terms of the number of selected features and error rates. Our experiments also illustrate that feature selection procedures may be a better choice for high-dimensional problems, where the number of features greatly exceeds the number of samples.
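To make the rescanning cost concrete, here is a minimal sketch of *classic* greedy forward selection for least-squares regression — not the paper's Dropping Forward-Backward algorithm, whose details are not given in the abstract. The function name and the toy data are illustrative assumptions; the point is the inner loop, which refits and scores every still-unselected feature at every step.

```python
import numpy as np

def forward_selection(X, y, k):
    """Greedy forward feature selection for least-squares regression.

    At each of the k steps, EVERY unselected column is refit and scored.
    This repeated rescan of the candidate pool is the cost the abstract
    identifies in forward and stepwise procedures.
    """
    n, p = X.shape
    selected = []
    remaining = list(range(p))
    for _ in range(k):
        best_j, best_rss = None, np.inf
        for j in remaining:  # rescan all unselected features
            cols = selected + [j]
            beta, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
            rss = np.sum((y - X[:, cols] @ beta) ** 2)
            if rss < best_rss:
                best_j, best_rss = j, rss
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Toy example (hypothetical data): y depends only on columns 0 and 2,
# so forward selection should recover exactly those two features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + 0.01 * rng.normal(size=100)
print(forward_selection(X, y, 2))
```

Selecting k features this way costs on the order of k·p model fits; a stepwise variant adds backward passes on top. The paper's contribution, per the abstract, is avoiding much of this rescanning while keeping comparable feature counts and error rates.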

