
Calculate Confusion Matrix Manually

This article was originally published on Medium.

  
 
 
 

To understand these terms properly, I will take a simple binary classification problem. Let's say our dataset contains the product reviews of an e-commerce website. Each review has a label, either positive (1) or negative (0). Our task is to classify whether a review is positive or negative. Let's assume that, using different NLP techniques, we have built a model, good or bad, that can predict these labels. For example, the CSV snapshot below shows a sample of the actual labels alongside the labels our model predicted.

 
fig 1: our sample product review predictions against actual true labels                     
 
 

True Positive (TP)

A review that is actually positive (1) and that the model also predicts as positive.

True Negative (TN)

A review that is actually negative (0) and that the model also predicts as negative.

False Positive (FP)

A review that is actually negative (0) but that the model wrongly predicts as positive.

False Negative (FN)

A review that is actually positive (1) but that the model wrongly predicts as negative.

Calculate the Confusion Matrix

fig 2: TP, TN, FP, FN values of our model prediction
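Fig 2 itself is not reproduced here, but counts consistent with the metrics reported below (accuracy 0.65, precision 0.636, recall 0.70, FPR 0.4) are TP = 7, TN = 6, FP = 4, FN = 3, i.e. 20 reviews in total; the calculations below assume these values. As a minimal sketch, the snippet constructs a hypothetical stand-in for the labels in fig 1 with exactly those counts and tallies the four quantities in plain Python:

```python
# Hypothetical stand-in for the 20 labels shown in fig 1 (the original CSV is
# not reproduced here); the lists are built so the counts come out to TP=7,
# TN=6, FP=4, FN=3, consistent with the metrics reported in this article.
y_true = [1] * 10 + [0] * 10                    # actual labels: 10 positive, 10 negative
y_pred = [1] * 7 + [0] * 3 + [1] * 4 + [0] * 6  # labels predicted by the model

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # true negatives
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives

print(tp, tn, fp, fn)  # -> 7 6 4 3
```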
 

Calculate Accuracy

The formula for calculating accuracy is:

Accuracy = (TP + TN) / (TP + TN + FP + FN)

If you place the values of these terms into the above formula and do the simple math, you will get the accuracy. In our case it is 0.65, which means the accuracy is 65%.
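As a quick check in Python, using the counts assumed above:

```python
tp, tn, fp, fn = 7, 6, 4, 3                 # counts assumed above (from fig 2)
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(accuracy)                             # -> 0.65
```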

Calculate Precision

 
The formula for calculating precision is:

Precision = TP / (TP + FP)
 

Replace the values of these terms and do the simple math; the result is 0.636, i.e. a precision of about 63.6%.
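The same check in Python, again with the assumed counts:

```python
tp, fp = 7, 4                    # counts assumed above (from fig 2)
precision = tp / (tp + fp)
print(round(precision, 3))       # -> 0.636
```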

Calculate Recall | Sensitivity | True Positive Rate — TPR

 
The formula for calculating recall is:

Recall = TP / (TP + FN)

Replace the values of these terms and do the simple math; the result is 0.70, i.e. a recall of 70%.
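In Python, with the assumed counts:

```python
tp, fn = 7, 3               # counts assumed above (from fig 2)
recall = tp / (tp + fn)
print(recall)               # -> 0.7
```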

Calculate the F1 Score

 
The formula for calculating the F1 score is:

F1 = 2 × (Precision × Recall) / (Precision + Recall)

Replace the values of these terms and do the simple math; the result is 0.667, i.e. an F1 score of about 66.7%.
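In Python, reusing the precision and recall from the two previous steps:

```python
precision, recall = 7 / 11, 7 / 10                  # values from the previous steps
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 3))                                 # -> 0.667
```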

 

Calculate False Positive Rate — FPR

 
The formula for calculating FPR is:

FPR = FP / (FP + TN)

Replace the values of these terms and do the simple math; the result is 0.4, i.e. an FPR of 40%.
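In Python, with the assumed counts:

```python
fp, tn = 4, 6             # counts assumed above (from fig 2)
fpr = fp / (fp + tn)
print(fpr)                # -> 0.4
```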

ROC Curve

It is a graph generated by plotting the False Positive Rate (FPR) on the x-axis and the True Positive Rate (TPR) on the y-axis, at different classification thresholds.
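Note that a ROC curve is drawn from predicted scores or probabilities rather than hard 0/1 labels, because TPR and FPR are recomputed at many different decision thresholds. As a minimal sketch, assuming scikit-learn and matplotlib are available and that y_score holds hypothetical predicted probabilities for the positive class:

```python
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve

# Hypothetical true labels and predicted positive-class probabilities
y_true = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0]
y_score = [0.9, 0.8, 0.7, 0.4, 0.3, 0.6, 0.5, 0.2, 0.1, 0.05]

# roc_curve sweeps the decision threshold and returns matching FPR/TPR pairs
fpr, tpr, thresholds = roc_curve(y_true, y_score)

plt.plot(fpr, tpr, marker="o")
plt.plot([0, 1], [0, 1], linestyle="--")  # chance line for reference
plt.xlabel("False Positive Rate (FPR)")
plt.ylabel("True Positive Rate (TPR)")
plt.title("ROC curve")
plt.show()
```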
