
Bharath Prasad

Understanding Cost Function in Machine Learning

When building machine learning models, one of the most important questions is: how do we know if the model is learning correctly? The answer lies in the cost function.

A cost function is a score that tells us how far the model’s predictions are from the actual values. A low cost means the model is accurate; a high cost means it still has a lot to improve.

Think of it like playing darts. The bullseye is the correct answer, and every dart you throw is a prediction. The distance between the dart and the bullseye is the cost. The smaller the distance, the better you are.

One of the most widely used cost functions for regression tasks is the Mean Squared Error (MSE). It works like this:

Take the difference between predicted and actual values.

Square those differences.

Average them across all data points.

The result is a single number that represents how accurate the model is overall.
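The three steps above can be sketched in a few lines of NumPy. The function name and sample values here are just illustrative:

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean Squared Error: average of the squared differences
    between predicted and actual values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean((y_pred - y_true) ** 2)

# Example: three predictions vs. actual values
actual = [3.0, 5.0, 4.0]
predicted = [2.5, 5.0, 5.0]
print(mse(actual, predicted))  # (0.25 + 0.0 + 1.0) / 3 ≈ 0.4167
```

Because the differences are squared, large misses are penalised far more than small ones, which is one reason MSE is so popular for regression.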

Beyond measurement, cost functions also play a key role in optimisation. Algorithms such as gradient descent rely on the cost function to adjust model parameters step by step until the error is minimised.
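Here is a minimal sketch of that loop: gradient descent fitting a single slope parameter to toy data by repeatedly stepping against the gradient of the MSE. The data, learning rate, and iteration count are illustrative choices, not fixed rules:

```python
import numpy as np

# Toy data generated from y = 2x, so the slope we hope to recover is 2.0
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x

w = 0.0    # initial guess for the slope
lr = 0.05  # learning rate (step size)

for _ in range(200):
    y_pred = w * x
    # Gradient of MSE w.r.t. w: d/dw mean((w*x - y)^2) = 2 * mean((w*x - y) * x)
    grad = 2.0 * np.mean((y_pred - y) * x)
    # Step in the direction that lowers the cost
    w -= lr * grad

print(round(w, 4))  # converges to 2.0
```

Each iteration nudges `w` downhill on the cost surface; when the gradient reaches zero, the cost is at a minimum and the slope matches the data.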

In classification tasks like logistic regression, the cost function takes a different form but serves the same purpose — guiding the model toward better predictions.
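For logistic regression that different form is usually binary cross-entropy (log loss). A small sketch, with illustrative labels and predicted probabilities, shows how it rewards confident correct predictions and punishes confident wrong ones:

```python
import numpy as np

def log_loss(y_true, p_pred, eps=1e-12):
    """Binary cross-entropy: the cost function used in logistic regression.
    y_true holds 0/1 labels; p_pred holds predicted probabilities of class 1."""
    y = np.asarray(y_true, dtype=float)
    # Clip probabilities away from 0 and 1 so log() stays finite
    p = np.clip(np.asarray(p_pred, dtype=float), eps, 1 - eps)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# Confident and correct: low cost
print(log_loss([1, 0], [0.9, 0.1]))
# Confident but wrong: much higher cost
print(log_loss([1, 0], [0.1, 0.9]))
```

Despite the different formula, the role is identical to MSE: a single number that gradient descent can minimise to improve the model.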

For beginners stepping into machine learning or data science, understanding cost functions is a must. It’s the foundation for building models that actually work in practice.
