
PrecisionAtRecall

Description

Computes the best precision for which recall is greater than or equal to the specified value. Type: polymorphic.

Input parameters

y_pred: array, predicted values.
y_true: array, true values.
recall: float, the target recall, a scalar value in the range [0, 1].
num_thresholds: integer, the number of thresholds to use for matching the given recall.

Output parameters

precision_at_recall: float, the best precision at which recall reaches or exceeds the specified value.
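
For comparison, the same metric exists in Python under TensorFlow/Keras as tf.keras.metrics.PrecisionAtRecall, which takes the same recall and num_thresholds parameters. The minimal sketch below shows Keras usage with illustrative sample data; it is a point of reference only, not the HAIBAL VI itself.

    import tensorflow as tf

    # Same parameters as above: target recall and number of thresholds.
    metric = tf.keras.metrics.PrecisionAtRecall(recall=0.8, num_thresholds=200)

    y_true = [0, 0, 1, 1]            # true values
    y_pred = [0.1, 0.4, 0.35, 0.8]   # predicted scores in [0, 1]

    metric.update_state(y_true, y_pred)
    print(float(metric.result()))    # best precision where recall >= 0.8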

Use cases

The Precision at Recall metric is generally used in binary and multiclass classification tasks in machine learning, particularly in information retrieval and recommendation systems, as well as in object detection.

Precision at Recall is a way of evaluating a model in terms of its precision (how many of the positive examples predicted by the model are actually positive) at a certain level of recall (how many of the actual positive examples the model is able to capture).

Here are some specific areas where Precision at Recall can be used:

    • Information retrieval and recommendation systems: in these systems, the aim is generally to return results that are mostly relevant (high precision) while covering a large proportion of all available relevant items (high recall). For example, in a search engine, you want the first results to be highly relevant (high precision), but you also want most relevant documents to appear somewhere in the results list (high recall).
    • Object detection in images: in this field, object detectors are often evaluated using a precision-recall curve, which shows the detector’s precision at different levels of recall.
    • Text classification: in tasks such as spam detection or content moderation, Precision at Recall can help balance the need to filter out as much unwanted content as possible (high recall) while avoiding marking legitimate content as unwanted (high precision).

Calculation

The PrecisionAtRecall metric is used to evaluate the performance of classification models. It computes precision, i.e. the ratio of true positives to all positive predictions, at a specified recall level. Recall is the ratio of true positives to the sum of true positives and false negatives.
To calculate this metric, a number of thresholds (num_thresholds) are used. For each threshold, calculated as i / (num_thresholds − 1) where i ranges from 0 to num_thresholds − 1, precision and recall are computed.
Then, among all the thresholds at which recall reaches or exceeds the specified value (recall), the highest precision obtained is retained.
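
As an illustration only, a minimal Python/NumPy sketch of this procedure (a hypothetical helper, not the HAIBAL implementation) could look like the following:

    import numpy as np

    def precision_at_recall(y_true, y_pred, recall_target, num_thresholds=200):
        # Thresholds t_i = i / (num_thresholds - 1) for i = 0 .. num_thresholds - 1.
        thresholds = np.linspace(0.0, 1.0, num_thresholds)
        y_true = np.asarray(y_true, dtype=bool)
        y_pred = np.asarray(y_pred, dtype=float)

        best_precision = 0.0
        for t in thresholds:
            predicted_pos = y_pred >= t
            tp = np.sum(predicted_pos & y_true)
            fp = np.sum(predicted_pos & ~y_true)
            fn = np.sum(~predicted_pos & y_true)
            if tp + fp == 0:
                continue  # no positive predictions at this threshold
            precision = tp / (tp + fp)
            recall = tp / (tp + fn) if (tp + fn) > 0 else 0.0
            # Retain the highest precision among thresholds where the
            # recall target is reached or exceeded.
            if recall >= recall_target:
                best_precision = max(best_precision, precision)
        return best_precision

    print(precision_at_recall([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8], 0.8))  # ~0.667

Note that library implementations may differ in details (for example, threshold epsilons or interpolation between thresholds), so small numerical differences from this sketch are possible.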

This metric thus summarizes the trade-off between precision and recall at a chosen operating point.

Example

All these examples are PNG snippets: you can drop a snippet onto the block diagram and the depicted code will be added to your VI (do not forget to install the HAIBAL library to run it).

