Confusion Matrix

A model evaluation technique used to check the performance of a classification model.

TRUE POSITIVE: It is actually a cat and the machine predicted a cat.
FALSE POSITIVE: It is actually a dog but the machine predicted a cat.
TRUE NEGATIVE: It is actually a dog and the machine correctly predicted it is not a cat.
FALSE NEGATIVE: It is actually a cat but the machine incorrectly predicted it is not a cat.
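
To make the four counts concrete, here is a minimal Python sketch. The labels and the confusion_counts helper are hypothetical, assuming 1 means "cat" (the positive class) and 0 means "not a cat":

def confusion_counts(y_true, y_pred):
    # Tally each of the four confusion-matrix cells for binary labels.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)  # true positives
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)  # false positives
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)  # true negatives
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)  # false negatives
    return tp, fp, tn, fn

y_true = [1, 0, 1, 1, 0, 0, 1, 0]  # hypothetical ground-truth labels
y_pred = [1, 0, 0, 1, 1, 0, 1, 0]  # hypothetical model predictions
print(confusion_counts(y_true, y_pred))  # (3, 1, 3, 1)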

Type I vs Type II Error

A Type I error is a false positive (e.g., predicting "cat" for a dog); a Type II error is a false negative (missing an actual cat).


RECALL

The ratio of the number of events you correctly recall to the number of all correct events (the events that actually happened).

Let's say there were 15 actual events and you recall 20, of which 15 are correct and 5 are wrong. You recalled every actual event, but you were not very precise.

Eg: If there are 10 events and you recall all 10 correctly, your recall ratio is 1.0 (100%). If you recall only 7 of them, your recall ratio is 0.7 (70%).
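
As a quick sketch (the recall helper below is hypothetical), recall can be computed from true positives and false negatives:

def recall(tp, fn):
    # Recall = correctly recalled events / all correct events.
    return tp / (tp + fn)

print(recall(7, 3))   # 0.7 -> recalled 7 of 10 events
print(recall(15, 0))  # 1.0 -> recalled all 15 events from the example above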

PRECISION

The ratio of the number of events you correctly recall to the total number of events you recall (correct and wrong).

From the previous example: you recall 20 events, 15 correct and 5 wrong. Here recall is 100% (every actual event was recalled) but precision is only 75% (15/20).

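A matching sketch for precision (again a hypothetical helper):

def precision(tp, fp):
    # Precision = correctly recalled events / all events you recalled.
    return tp / (tp + fp)

print(precision(15, 5))  # 0.75 -> 15 correct out of 20 recalled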

PRECISION GOAL

Optimize for precision when false positives are costly. Eg: bad product recommendations -> less conversion -> decreased sales.

RECALL GOAL

Optimize for recall when false negatives are costly. Eg: bad defect detector -> bad quality check -> customer dissatisfaction.

NOTE: Both metrics ignore true negatives, so they are most useful when actual negatives are less relevant to the problem.

ACCURACY

Accuracy tells you how many correct predictions were made across the entire dataset: (TP + TN) / (TP + TN + FP + FN).
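
Continuing the hypothetical helpers above:

def accuracy(tp, tn, fp, fn):
    # Accuracy = all correct predictions / all predictions.
    return (tp + tn) / (tp + tn + fp + fn)

print(accuracy(3, 3, 1, 1))  # 0.75, using the counts from the first sketch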

In the measurement sense, precision is about how close repeated attempts are to each other, while accuracy is about how close they are to the target.
