
Confusion Matrix

What Is a Confusion Matrix?

A confusion matrix is a popular tool for evaluating classification models in machine learning. Typically, after collecting, cleaning, wrangling, and manipulating data, we feed it to a model and make predictions from it. However, it is critical to assess how effective that model actually is, and for that goal, we use the confusion matrix.

A confusion matrix is a key evaluation tool for machine learning classification problems with two or more classes as output. It is simply a summary of the prediction outcomes of a classification task: it displays the number of accurate and inaccurate predictions, with the counts broken down by class. It is a table that represents the performance of a classifier on a set of test data for which both the true and predicted values are known. The confusion matrix assists us by offering insight into the types of errors our classifier makes.

Classification Accuracy

Before we discuss this topic further, let me give you some background on classification accuracy. This will help you understand how a confusion matrix outperforms accuracy alone.

One metric for evaluating classification models is accuracy. We can calculate accuracy by dividing the number of correct predictions by the total number of predictions.

We can also compute it by using positives and negatives for binary classification.

The following is the formula for it:

Accuracy = (TP + TN) / (TP + TN + FP + FN)

Where:

  • True Positives (TP): These are instances where we predicted positive and the actual value was also positive. For example, a person with COVID-19 symptoms predicted that he/she had caught the virus, chose to undergo the coronavirus test, and the result proved to be positive.
  • True Negatives (TN): These are instances where we predicted negative and the actual value was also negative. Consider the same circumstance as previously, only this time the person predicted that he/she had not caught the virus, chose to undergo the coronavirus test, and the result was negative.
  • False Positives (FP): These are instances where we predicted positive but the actual value turned out to be negative. Another name for an FP is a Type I error. Consider the previous scenario, but this time the person felt confident that he/she had the coronavirus, chose to undergo the test, and the result turned out to be negative.
  • False Negatives (FN): These are instances where we predicted negative but the actual value turned out to be positive. Another name for an FN is a Type II error. For the last instance, consider a similar scenario, but this time the person was confident that he/she had not contracted the virus, took the coronavirus test anyway, and it turned out to be positive.

We can easily turn the accuracy into an error rate by subtracting it from 1, such as:

Error Rate = (1 - (number of correct predictions / total number of predictions)) * 100
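
As a quick illustration of the two formulas above, here is a minimal Python sketch (the function names are my own, for illustration, not from any library) that computes accuracy and the error rate from the four raw counts:

# Minimal sketch: accuracy and error rate from the four raw counts
def accuracy(tp, tn, fp, fn):
    # Fraction of all predictions that were correct
    return (tp + tn) / (tp + tn + fp + fn)

def error_rate(tp, tn, fp, fn):
    # Complement of accuracy: fraction of predictions that were wrong
    return 1 - accuracy(tp, tn, fp, fn)

# Hypothetical counts: 40 TP, 45 TN, 5 FP, 10 FN out of 100 predictions
print(accuracy(40, 45, 5, 10))    # 0.85
print(error_rate(40, 45, 5, 10))  # ~0.15, i.e. a 15% error rate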

Problems People Face When Employing This Method

Normally, we perform classification predictive analysis on small datasets with almost or entirely equal class distributions. As a result, most practitioners come to believe that high accuracy scores are favorable and that values greater than 90% are excellent. This kind of thinking can lead a practitioner to believe that a model performs exceptionally well when, in reality, it might not be performing well at all: on a dataset where 90% of the examples belong to one class, a model that always predicts that class scores 90% accuracy without having learned anything.

The major problem with classification accuracy is that it hides the detail we need to understand our model's predictions. Assume our data includes three or more classes and we obtain a classification accuracy of 80%. We don't know whether that score means the model predicts all classes reasonably well or whether it overlooks one or two classes entirely.

We wouldn’t be left guessing if we had used a confusion matrix, since it would have told us both how many errors our model makes and what types of errors they are.
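
To make this concrete, here is a small sketch (the data is made up purely for illustration) in which a three-class model scores a respectable-looking accuracy while never predicting one class correctly; the confusion matrix exposes the problem immediately:

from sklearn.metrics import accuracy_score, confusion_matrix

# Made-up 3-class example: the model never predicts class 2 correctly
expected = [0, 0, 0, 0, 1, 1, 1, 1, 2, 2]
predicted = [0, 0, 0, 0, 1, 1, 1, 1, 0, 1]

print(accuracy_score(expected, predicted))  # 0.8 -- looks decent
print(confusion_matrix(expected, predicted))
# [[4 0 0]
#  [0 4 0]
#  [1 1 0]]  <- the bottom row shows class 2 is never predicted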

How to Calculate a Confusion Matrix?

                     Actual Positive   Actual Negative
Predicted Positive   TP                FP
Predicted Negative   FN                TN

We use this 2 x 2 matrix (as shown above) for binary classification problems. In this case, the target variable has two possible values: positive or negative. The columns indicate the target variable’s actual values, while the rows show the predicted values. If you have followed along so far, you should be familiar with the terms TP, FP, FN, and TN by now.

The steps for calculating a confusion matrix are as follows:

  • Firstly, you need to have a test dataset with expected (actual) result values ready.
  • Secondly, make a prediction for each row in your test dataset.
  • Lastly, tally the number of correct and incorrect predictions for each class by comparing the expected results with the predictions.
  • The matrix then consists of the total number of correct and incorrect predictions for each class, with each count placed into the predicted row and actual column for its class value (a minimal hand-rolled version of this counting appears in the sketch after this list).
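
As a rough sketch of that counting process (written from scratch rather than with a library, using illustrative names of my own), the tallying can be done with a nested dictionary keyed by predicted and actual class:

from collections import defaultdict

def build_confusion_matrix(expected, predicted):
    # counts[predicted_class][actual_class] tallied from two parallel lists
    counts = defaultdict(lambda: defaultdict(int))
    for actual, pred in zip(expected, predicted):
        counts[pred][actual] += 1
    return counts

expected = ["dog", "dog", "cat", "cat"]
predicted = ["dog", "cat", "cat", "dog"]
matrix = build_confusion_matrix(expected, predicted)
print(matrix["dog"]["dog"], matrix["cat"]["dog"])  # 1 1 -> one dog correct, one dog missed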

Example

Assume we have a two-class classification problem in which we need to predict whether an image depicts a dog or a cat.
The first thing we need is a test dataset, which in this case contains 10 records with expected outcomes, along with the set of predictions our model made for them.

Test Dataset (expected outcome vs. model prediction):

Expected   Predicted
dog        dog
dog        dog
dog        dog
dog        cat
cat        cat
cat        cat
cat        cat
cat        dog
cat        dog
cat        dog

When we calculate the classification accuracy for this, we get an accuracy of 60%:

accuracy = total number of correct predictions / total number of predictions * 100
accuracy = 6 / 10 * 100 = 60%

The following are the correct and incorrect predictions that we have made for each class:

  • Dog classified as dog: 3
  • Cat classified as cat: 3
  • Dog classified as cat: 1
  • Cat classified as dog: 3

We can now easily arrange these values into a 2 x 2 confusion matrix:

Confusion Matrix:

                  Actual: dog   Actual: cat
Predicted: dog    3             3
Predicted: cat    1             3

We can deduce from this matrix that the total number of dogs in the dataset is the sum of the values in the dog column (3 + 1 = 4), and likewise for cats (3 + 3 = 6). The correct predictions sit on the diagonal of the matrix (3 + 3 = 6), running from top left to bottom right.
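
As a quick check of that reading, here is a short numpy sketch (the array layout follows the matrix above, with predicted values as rows and actual values as columns):

import numpy as np

# Rows: predicted (dog, cat); columns: actual (dog, cat)
matrix = np.array([[3, 3],
                   [1, 3]])

print(matrix.sum(axis=0))  # [4 6] -> total actual dogs and cats
print(np.trace(matrix))    # 6 -> total correct predictions (the diagonal)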

Benefits

We already know that a confusion matrix provides information about a classifier’s errors. It highlights where a classification model gets confused when generating predictions, and it helps overcome the shortcomings of relying on classification accuracy alone. Not only that, but from its counts we can also derive precision, recall, accuracy, specificity, and the AUC-ROC curve.
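
As a sketch of how some of those metrics fall out of the four counts (these are the standard formulas; the function itself is my own illustration, not a library call), using the dog/cat example with dog treated as the positive class:

# Standard metrics derived from the four confusion-matrix counts
def derived_metrics(tp, tn, fp, fn):
    return {
        "precision": tp / (tp + fp),    # how many predicted positives were right
        "recall": tp / (tp + fn),       # how many actual positives were found
        "specificity": tn / (tn + fp),  # how many actual negatives were found
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
    }

# Counts from the dog/cat example, with dog as the positive class
print(derived_metrics(tp=3, tn=3, fp=3, fn=1))
# {'precision': 0.5, 'recall': 0.75, 'specificity': 0.5, 'accuracy': 0.6}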

Plotting a Confusion Matrix in Python

# Plotting a confusion matrix in Python
from sklearn.metrics import confusion_matrix

# Actual (expected) labels and the model's predictions
expected = [0, 1, 1, 1, 0, 0, 1, 0, 0, 1]
predicted = [0, 0, 1, 0, 0, 0, 1, 0, 1, 0]

# Rows correspond to the actual classes, columns to the predicted ones
results = confusion_matrix(expected, predicted)
print(results)

When you run the code, you will receive the following result:

[[4 1]
 [3 2]]
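
Note that scikit-learn places the actual classes in the rows and the predicted classes in the columns, the transpose of the layout used earlier in this article. If you want an actual figure rather than printed text, one option is ConfusionMatrixDisplay (a sketch assuming scikit-learn 0.22 or newer and matplotlib are installed):

import matplotlib.pyplot as plt
from sklearn.metrics import ConfusionMatrixDisplay, confusion_matrix

expected = [0, 1, 1, 1, 0, 0, 1, 0, 0, 1]
predicted = [0, 0, 1, 0, 0, 0, 1, 0, 1, 0]

# Build the matrix and render it as a color-coded grid
results = confusion_matrix(expected, predicted)
disp = ConfusionMatrixDisplay(confusion_matrix=results, display_labels=[0, 1])
disp.plot()
plt.show()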
