This Article was originally published on CodePerfectPlus
Have you been in a situation where you expected your machine learning model to perform really well but it sputtered out a poor accuracy? You’ve done all the hard work – so where did the classification model go wrong? How can you correct this?
Here comes the confusion matrix to the rescue.

A confusion matrix, also known as an error matrix, is mostly used to evaluate the performance of a classification model. As the name suggests, the confusion matrix can be confusing at first.
After preprocessing, cleaning, fitting, and predicting, we check model performance before building the final production model.
What is a Confusion Matrix?
The confusion matrix is mainly used for binary classification. It is a comparison table of predicted and actual values. In binary classification there are two predicted classes, and the confusion matrix consists of four outcomes:
True Positive (TP): You predicted positive and it turns out to be true. For example, you predicted that France would win the world cup, and it won.
True Negative (TN): You predicted negative, and it is true. You predicted that England would not win, and it lost.
False Positive (FP): Your prediction is positive, and it is false. You predicted that England would win, but it lost.
False Negative (FN): Your prediction is negative, and it is false. You predicted that France would not win, but it won.
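The four outcomes above can be counted directly with scikit-learn. Here is a minimal sketch using made-up "World Cup" predictions (the label values `1` = wins, `0` = does not win are assumptions for illustration); for binary labels, `ravel()` unpacks the 2x2 matrix in the order TN, FP, FN, TP:

```python
from sklearn.metrics import confusion_matrix

# Hypothetical match outcomes: 1 = "wins", 0 = "does not win"
y_actual  = [1, 0, 0, 1, 0, 1, 0, 1]
y_predict = [1, 0, 1, 1, 0, 0, 0, 1]

# ravel() flattens the 2x2 binary confusion matrix row by row:
# [[TN, FP], [FN, TP]] -> tn, fp, fn, tp
tn, fp, fn, tp = confusion_matrix(y_actual, y_predict).ravel()
print(f"TP={tp} TN={tn} FP={fp} FN={fn}")  # TP=3 TN=3 FP=1 FN=1
```

Unpacking the matrix this way is handy when you want to compute metrics like precision (TP / (TP + FP)) or recall (TP / (TP + FN)) by hand.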
```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

# Load the iris dataset and split it into train and test sets
iris = load_iris()
X = iris.data
y = iris.target
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=24)

# Fit a logistic regression classifier
# (max_iter raised so the lbfgs solver converges on iris)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
y_predict = clf.predict(X_test)

# Compare predicted labels against the true test labels
cMatrix = confusion_matrix(y_test, y_predict)
print(cMatrix)
```
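In the printed matrix, correct predictions sit on the diagonal, so overall accuracy is the diagonal sum divided by the total number of samples. A small sketch with toy multi-class labels (these values are an assumption for illustration, not the iris output):

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Toy multi-class labels, chosen for illustration
y_test    = [0, 1, 2, 2, 1, 0, 2, 1]
y_predict = [0, 1, 2, 1, 1, 0, 2, 2]

cm = confusion_matrix(y_test, y_predict)
# Diagonal entries are the correctly classified samples,
# so accuracy = trace(cm) / total samples.
accuracy = np.trace(cm) / cm.sum()
print(accuracy)  # 0.75
```

This matches what `sklearn.metrics.accuracy_score(y_test, y_predict)` would return, but reading it off the matrix makes clear where the errors come from.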
More Articles by Author
- Build Your First Python Chatbot In 5 Minutes
- What is Simple Linear Regression?
- Logistic Regression for Machine Learning Problem
- 5 Tips for Computer Programming Beginners
- What Is Git and GitHub?