
Dimensionality Reduction: Generalized Discriminant Analysis

Hey people,

Welcome to yet another exciting series of blogs. In this series, we are going to talk about dimensionality reduction techniques. Our thinking throughout should be oriented towards optimization, and every technique discussed here keeps that goal in mind. This series does involve a bit of math, but to keep it inclusive we will keep things simple and straightforward, discussing only the concepts and how to perceive them.

In the last article we talked about the Linear Discriminant Analysis model and saw how it projects data from a high-dimensional space down to a low-dimensional one. We saw it in action and also saw how LDA can be used as a classifier. In this article, we are going to talk about Generalized Discriminant Analysis (GDA). Let's get going then...

What is Generalized Discriminant Analysis?

GDA extends discriminant analysis to the nonlinear case using a kernel function. The underlying theory is close to that of support vector machines (SVMs) insofar as GDA maps the input vectors into a high-dimensional feature space. Just like LDA, the objective of GDA is to find a projection of the features into a lower-dimensional space that maximizes the ratio of between-class scatter to within-class scatter. The main idea is to map the input space into a convenient feature space in which the variables are nonlinearly related to the inputs; performing LDA in that feature space then corresponds to a nonlinear discriminant in the original space.
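To make this concrete, here is a minimal NumPy sketch of the idea: build an RBF kernel matrix, form the between-class and within-class scatter matrices in kernel space, and solve the resulting eigenproblem for the discriminant directions. The function names (`rbf_kernel`, `gda_fit`), the toy data, and the hyperparameters (`gamma`, the regularization term `reg`) are all my own illustrative choices, not part of any library API — treat this as a sketch of the technique, not a production implementation.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # RBF (Gaussian) kernel: exp(-gamma * squared distance) for every pair
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def gda_fit(X, y, gamma=1.0, n_components=1, reg=1e-3):
    """Sketch of generalized (kernel) discriminant analysis.

    Returns alpha (n x n_components): coefficients of the discriminant
    directions expressed in terms of the training samples.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    m_star = K.mean(axis=1, keepdims=True)      # overall mean in kernel space
    M = np.zeros((n, n))                        # between-class scatter
    N = np.zeros((n, n))                        # within-class scatter
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        n_c = len(idx)
        K_c = K[:, idx]                         # columns for class c
        m_c = K_c.mean(axis=1, keepdims=True)   # class mean in kernel space
        M += n_c * (m_c - m_star) @ (m_c - m_star).T
        H = np.eye(n_c) - np.full((n_c, n_c), 1.0 / n_c)  # centering matrix
        N += K_c @ H @ K_c.T
    # Maximize between/within scatter ratio: eigenvectors of N^{-1} M,
    # with a small ridge term since N is rank-deficient.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(N + reg * np.eye(n), M))
    order = np.argsort(-eigvals.real)
    return eigvecs[:, order[:n_components]].real

# Usage on toy two-class data: project onto the top discriminant direction
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(3, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
alpha = gda_fit(X, y, gamma=0.5)
Z = rbf_kernel(X, X, 0.5) @ alpha               # 1-D projected coordinates
```

Note the key design point: because the mapping to feature space is only ever accessed through the kernel matrix `K`, the discriminant direction is represented by a coefficient vector over the training samples (the kernel trick), so the feature space itself is never computed explicitly.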

I hope this was helpful and put things down in a simple way. Please feel free to reach out to me on Twitter @AashishLChaubey in case you need more clarity or have any suggestions.

Until next time...
