Rijul Rajesh

How Models Measure Error: The Sum of Squared Residuals

In this article, we'll build an understanding of the sum of squared residuals (SSR).

This idea comes up constantly when learning backpropagation, because it defines how we measure, and then reduce, a model's error.


What is a residual?

Suppose you have some real data points and a line that tries to fit them.

  • Actual value → what really happened
  • Predicted value → what your model says

The residual is:

  • residual = actual - predicted

So, it tells you how wrong your prediction is for one data point.
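As a quick sketch, with made-up numbers for a single data point:

```python
# Residual for a single data point (values are made up for illustration)
actual = 5.0       # what really happened
predicted = 4.5    # what the model predicted
residual = actual - predicted
print(residual)    # 0.5 -> the model underestimated this point by 0.5
```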


Why square the residuals?

Squaring does two things: it makes every error positive, so negative and positive residuals can't cancel each other out, and it penalizes large errors more heavily than small ones.
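A tiny sketch of both effects, with made-up residual values:

```python
# Squaring removes the sign and grows faster than the error itself
for residual in [-2.0, -0.5, 0.5, 2.0]:
    print(residual, residual ** 2)
# -2.0 4.0
# -0.5 0.25
#  0.5 0.25
#  2.0 4.0   <- 4x the error of 0.5, but 16x the penalty
```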


Sum of squared residuals (SSR)

Now, square the residual for every data point and add them all together, and you get this equation:

SSR = ∑(actual − predicted)²

This is called the sum of squared residuals, which measures the total error of the model.
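Here's a minimal sketch of computing SSR in Python; the actual and predicted values are made up:

```python
# Sum of squared residuals over every data point
actuals = [3.0, 5.0, 7.0]
predicteds = [2.5, 5.5, 6.0]

ssr = sum((a - p) ** 2 for a, p in zip(actuals, predicteds))
print(ssr)  # 0.25 + 0.25 + 1.0 = 1.5
```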


The rule

The rule is simple:

The best-fitting model is the one that minimizes the sum of squared residuals.

This is known as the Least Squares Principle.
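As an illustration, here's a sketch of the least squares principle in action. For a straight line y = m·x + b, the slope and intercept that minimize SSR have a well-known closed form (found by setting the derivatives of SSR with respect to m and b to zero). The data points below are made up:

```python
# Fit the line y = m*x + b that minimizes SSR (simple linear regression)
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form least squares solution for slope and intercept
m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
     / sum((x - mean_x) ** 2 for x in xs))
b = mean_y - m * mean_x

# SSR of the fitted line: no other (m, b) pair gives a smaller value
ssr = sum((y - (m * x + b)) ** 2 for x, y in zip(xs, ys))
print(m, b, ssr)  # roughly m ≈ 1.94, b ≈ 0.15, with a small SSR
```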


Wrapping up

This is another byte-sized piece to know before going into backpropagation.


If you’ve ever struggled with repetitive tasks, obscure commands, or debugging headaches, this platform is here to make your life easier. It’s free, open-source, and built with developers in mind.

👉 Explore the tools: FreeDevTools
👉 Star the repo: freedevtools
