Saoni Deb

Recursive Feature Elimination

Recursive Feature Elimination (RFE) is primarily used for feature ranking.

  • Given an external estimator that assigns weights to features (e.g., the coefficients of a linear model), the goal of recursive feature elimination (RFE) is to select features by recursively considering smaller and smaller sets of features.

  • First, the estimator is trained on the initial set of features, and the importance of each feature is obtained either through a specific attribute (such as coef_ or feature_importances_) or through a callable.

  • Then, the least important features are pruned from the current set of features.

  • That procedure is recursively repeated on the pruned set until the desired number of features to select is eventually reached (see the sketch after this list).
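
A minimal sketch of this loop using scikit-learn's RFE class is below. The synthetic dataset, the LogisticRegression estimator, and the choice of keeping 4 features are illustrative assumptions, not part of the original post.

```python
# Minimal RFE sketch: rank features with a linear estimator and keep a fixed number.
# The dataset, estimator, and n_features_to_select are illustrative choices.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=10,
                           n_informative=4, random_state=0)

# LogisticRegression exposes coef_, which RFE uses as the per-feature importance.
rfe = RFE(estimator=LogisticRegression(max_iter=1000), n_features_to_select=4)
rfe.fit(X, y)

print(rfe.support_)   # boolean mask of the selected features
print(rfe.ranking_)   # selected features get rank 1; larger ranks were pruned earlier
```

Calling `rfe.transform(X)` then returns only the selected columns.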

RFECV performs RFE in a cross-validation loop to find the optimal number of features, as sketched below.
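
Here is a corresponding RFECV sketch, assuming the same kind of synthetic data; the 5-fold StratifiedKFold split and accuracy scoring are illustrative defaults rather than recommendations.

```python
# Minimal RFECV sketch: let cross-validation decide how many features to keep.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold

X, y = make_classification(n_samples=500, n_features=10,
                           n_informative=4, random_state=0)

rfecv = RFECV(estimator=LogisticRegression(max_iter=1000),
              step=1,                  # eliminate one feature per iteration
              cv=StratifiedKFold(5),
              scoring="accuracy")
rfecv.fit(X, y)

print(rfecv.n_features_)  # number of features chosen by cross-validation
print(rfecv.support_)     # mask of the selected features
```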

First, let's look at the types of feature selection provided in scikit-learn.
The classes in the sklearn.feature_selection module can be used for feature selection/dimensionality reduction on sample sets, either to improve estimators’ accuracy scores or to boost their performance on very high-dimensional datasets.

  1. Removing features with low variance (not applicable for dynamic data): VarianceThreshold is a simple baseline approach to feature selection. It removes all features whose variance doesn't meet some threshold.
  2. Univariate feature selection (evaluates each feature independently, so it does not capture interactions between features): Univariate feature selection works by selecting the best features based on univariate statistical tests. It can be seen as a preprocessing step to an estimator. SelectKBest removes all but the k highest scoring features; SelectPercentile removes all but a user-specified highest scoring percentage of features; SelectFpr, SelectFdr, and SelectFwe select features using common univariate statistical tests (false positive rate, false discovery rate, and family-wise error, respectively); GenericUnivariateSelect performs univariate feature selection with a configurable strategy, which allows the best strategy to be chosen with a hyper-parameter search estimator.
  3. Recursive feature elimination: recursive feature elimination with cross-validation (RFECV) is also available.
  4. Feature selection using SelectFromModel
  5. Sequential Feature Selection
  6. Feature selection as part of a pipeline (a combined sketch of several of these selectors follows this list).
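
For context, here is a hedged sketch that chains a few of the selectors listed above inside a Pipeline; the thresholds, the k value, and the RandomForestClassifier are arbitrary illustrative choices, not tuned recommendations.

```python
# Sketch: VarianceThreshold, SelectKBest, and SelectFromModel chained in a Pipeline.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import (SelectFromModel, SelectKBest,
                                        VarianceThreshold, f_classif)
from sklearn.pipeline import Pipeline

X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=5, random_state=0)

pipe = Pipeline([
    ("variance", VarianceThreshold(threshold=0.0)),           # drop constant features
    ("univariate", SelectKBest(score_func=f_classif, k=10)),  # keep 10 best by ANOVA F-test
    ("model_based", SelectFromModel(                          # keep features whose importance
        RandomForestClassifier(random_state=0))),             # exceeds the mean importance
    ("clf", RandomForestClassifier(random_state=0)),
])
pipe.fit(X, y)
print(pipe.score(X, y))  # accuracy on the training data, just to confirm the pipeline runs
```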
