In pattern recognition, information retrieval, object detection and classification (machine learning), precision and recall are performance metrics that apply to data retrieved from a collection, corpus or sample space.
Predicted Positive Values - Values predicted to be positive
Real Positive Values - Values that are actually positive
Consider a computer program for recognising dogs (the relevant element) in a digital photograph. Upon processing a picture which contains ten cats and twelve dogs, the program identifies eight dogs. Of the eight elements identified as dogs, only five actually are dogs (true positives), while the other three are cats (false positives). Seven dogs were missed (false negatives), and seven cats were correctly excluded (true negatives). The program's precision is then 5/8 (true positives / Predicted Positive values) while its recall is 5/12 (true positives / Real Positive values).
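To make the arithmetic concrete, here is a minimal Python sketch of the dog example above. The counts come straight from the example; the variable names are my own.

```python
# Counts taken from the dog-recognition example above.
true_positives = 5    # dogs correctly identified as dogs
false_positives = 3   # cats wrongly identified as dogs
false_negatives = 7   # dogs the program missed
true_negatives = 7    # cats correctly excluded (not needed for these two metrics)

# Precision = true positives / predicted positive values
precision = true_positives / (true_positives + false_positives)

# Recall = true positives / real positive values
recall = true_positives / (true_positives + false_negatives)

print(f"Precision: {precision:.3f}")  # 5/8  = 0.625
print(f"Recall:    {recall:.3f}")     # 5/12 ≈ 0.417
```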
Precision and recall are also known as positive predictive value (PPV) and sensitivity (true positive rate, TPR) respectively. Each of them is explained below.
Precision is the number of true positives divided by the total number of values predicted as positive. It tells you how many of the values predicted to be positive actually are positive.
TP/(TP + FP)
Precision explains how well your model performs at correctly predicting positive outcomes out of everything it flags as positive. The model will sometimes predict values to be positive when they are clearly not.
That explains why precision is called the POSITIVE PREDICTIVE VALUE.
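As a quick sketch (the labels below are made up purely for illustration), precision can be computed directly from prediction/label pairs by counting true and false positives:

```python
# Hypothetical ground-truth labels and model predictions (1 = positive, 0 = negative).
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 1, 1, 0, 0, 1, 1, 0, 0, 0]

# True positives: predicted positive and actually positive.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
# False positives: predicted positive but actually negative.
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)

precision = tp / (tp + fp)
print(f"Precision: {precision:.2f}")  # 3 true positives out of 5 positive predictions = 0.60
```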
Recall, on the other hand, explains how many of the real positive values were predicted as positive: the true positives divided by the total number of real positive values.
TP/(TP + FN)
Recall also reveals how many real positive values were wrongly predicted as negative (false negatives), and it essentially measures how well a model identifies true positives out of the full range of real positive values.
Recall is therefore referred to as SENSITIVITY (true positive rate).
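Using the same toy labels as in the precision sketch, recall counts true positives against the positives the model missed. The scikit-learn calls at the end are only a cross-check, assuming scikit-learn is installed:

```python
from sklearn.metrics import precision_score, recall_score

# Same hypothetical labels as in the precision sketch above.
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 1, 1, 0, 0, 1, 1, 0, 0, 0]

# True positives and false negatives (real positives the model missed).
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

recall = tp / (tp + fn)
print(f"Recall (by hand):         {recall:.2f}")                           # 3/5 = 0.60
print(f"Recall (scikit-learn):    {recall_score(y_true, y_pred):.2f}")     # 0.60
print(f"Precision (scikit-learn): {precision_score(y_true, y_pred):.2f}")  # 0.60
```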