Harsimranjit Singh

Polynomial Regression: Exploring Non-Linear Relationships

In our previous discussions, we explored the fundamentals of linear regression and gradient descent optimization. Today, we turn to a new topic: polynomial regression. This technique lets us capture non-linear relationships between the independent and dependent variables, offering a more flexible approach when a straight line does not fit the data.

Beyond Straight Lines

Linear regression assumes a linear relationship between the independent variables and the dependent variable. However, real-world data often exhibits different patterns.
Polynomial regression addresses this by introducing polynomial terms of the independent variables. If we have one variable X, we can transform it into X^2, X^3, and so on. These terms allow the model to capture curves, bends, and other non-linear trends in the data.

Y = b0 + b1*X + b2*X^2 + ... + bd*X^d
Here:

  • Y: the dependent variable
  • b0: the intercept term (bias)
  • b_i: the coefficient associated with each term (i = 1 to d)
  • X: the independent variable
  • X^i: the polynomial terms of X (i = 1 to d)

Let's work through a small example.

Suppose we have a dataset representing the relationship between hours studied (x) and exam scores (y):

Hours studied: 1, 2, 3, 4, 5
Exam scores:   50, 65, 75, 80, 82

Let's first visualize the dataset:

import numpy as np
import matplotlib.pyplot as plt

# Toy dataset: hours studied and the corresponding exam scores
hours_studied = np.array([1, 2, 3, 4, 5])
exam_scores = np.array([50, 65, 75, 80, 82])

plt.scatter(hours_studied, exam_scores, color='blue', label='Data points')
plt.xlabel('Hours studied')
plt.ylabel('Exam score')
plt.legend()
plt.grid(True)
plt.show()

The plot shows an upward trend between hours studied and exam scores, but the gains flatten out at higher values; this kind of relationship cannot be captured well by a straight line.

(Scatter plot of the data points: hours studied vs. exam score)

To implement polynomial regression, we first transform our independent variable using polynomial features. This expands a single column into several polynomial columns: for example, if we specify degree 2, the original column x is turned into the columns x and x^2 (plus a constant bias column).
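As a quick illustration (not part of the original example), here is what that transform produces for a small toy column of values; the extra column of ones comes from scikit-learn's default include_bias=True:

from sklearn.preprocessing import PolynomialFeatures
import numpy as np

x = np.array([[1], [2], [3]])         # a single feature column
poly2 = PolynomialFeatures(degree=2)  # adds a column of ones (bias) by default
print(poly2.fit_transform(x))
# [[1. 1. 1.]
#  [1. 2. 4.]
#  [1. 3. 9.]]
# columns: 1 (bias), x, x^2

With that in mind, the full fit on our dataset looks like this: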

from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

X = hours_studied.reshape(-1, 1)
y = exam_scores.reshape(-1, 1)

# Transform the single feature into polynomial columns up to degree 3
poly = PolynomialFeatures(degree=3)
X_poly = poly.fit_transform(X)


model = LinearRegression() # use the normal linear regression model
model.fit(X_poly, y)

y_pred = model.predict(X_poly)

plt.scatter(hours_studied, exam_scores, color='blue', label='Data points')
plt.plot(hours_studied, y_pred, color='red', label='Polynomial Regression')
plt.xlabel('Hours Studied')
plt.ylabel('Exam Score')
plt.title('Polynomial Regression: Hours Studied vs. Exam Score')
plt.legend()
plt.grid(True)
plt.show()


(Plot of the data points with the fitted degree-3 polynomial curve in red)

The red curve represents the polynomial regression model fitted to the data. In this example we used a degree-3 polynomial, which allows the model to follow the non-linear shape of the data.

Choosing the Right Degree:

The degree of the polynomial dictates the model's complexity, and we encounter a trade-off here:

  • Higher Degrees: Capture more complex relationships but can lead to overfitting, where the model performs well on the training data but poorly on unseen data.

  • Lower Degrees: Less prone to overfitting but might miss important non-linear patterns. (A sketch of choosing the degree with cross-validation follows below.)
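One common way to navigate this trade-off, sketched below (not part of the original example), is to compare a few candidate degrees with cross-validation and keep the one with the lowest validation error. This reuses hours_studied and exam_scores from the earlier snippet:

from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score, LeaveOneOut
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

X = hours_studied.reshape(-1, 1)
y = exam_scores

# Compare candidate degrees with leave-one-out cross-validation
# (the toy dataset has only 5 points, so each fold holds out a single point)
for degree in [1, 2, 3, 4]:
    candidate = make_pipeline(PolynomialFeatures(degree=degree), LinearRegression())
    scores = cross_val_score(candidate, X, y,
                             scoring='neg_mean_squared_error',
                             cv=LeaveOneOut())
    print(f"degree={degree}: mean CV MSE = {-scores.mean():.2f}")

The degree with the lowest cross-validated error is usually a reasonable compromise between underfitting and overfitting.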

Estimation of Coefficients:

The coefficients in polynomial regression can be estimated with ordinary least squares (OLS), the same method used for linear regression, because the model is still linear in its coefficients.
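As a rough illustration of what that means in practice, the degree-3 coefficients can be recovered directly with a least-squares solve on the design matrix. This is a sketch of what LinearRegression does internally, reusing hours_studied and exam_scores from above:

import numpy as np

# Build the degree-3 design matrix (columns: 1, x, x^2, x^3)
# and solve the corresponding least-squares problem.
X_design = np.vander(hours_studied, N=4, increasing=True)
coefs, *_ = np.linalg.lstsq(X_design, exam_scores, rcond=None)
print(coefs)  # [b0, b1, b2, b3]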

Conclusion:

Polynomial regression is a powerful statistical technique for modelling complex relationships between variables. It can capture non-linear patterns that linear regression would miss.

Some important points to remember:

  • Polynomial regression assumes that the relationship between x and y is polynomial.

  • There are several variants of polynomial regression: simple, multiple, and orthogonal polynomial regression.

  • The interpretation of the coefficients in polynomial regression is similar to linear regression, with the addition of the higher-degree terms.

  • Polynomial regression also assumes that the error terms are independent and randomly distributed with a mean of zero. (See the short check after this list.)
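To connect the last two points to the earlier code, a small check like the following (reusing model, X_poly, and y from the fitting example above) prints the estimated coefficients and confirms that the residuals average out near zero:

# Inspect the fitted degree-3 model from the earlier example
print("intercept:", model.intercept_)
print("coefficients:", model.coef_)  # one weight per column of X_poly (the first is the bias column)

# Residuals of an OLS fit with an intercept should have a mean close to zero
residuals = y - model.predict(X_poly)
print("mean residual:", residuals.mean())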
