
RecallAtPrecision

Description

Computes the best recall where precision is greater than or equal to a specified value. Type : polymorphic.

Input parameters

y_pred : array, predicted values.
y_true : array, true values.
precision : float, a scalar value in the range [0, 1].
num_thresholds : integer, the number of thresholds to use for matching the given precision.

Output parameters

recall_at_precision : float, the best recall achieved at the specified precision.

Use cases

The “Recall at Precision” metric is commonly used in binary and multiclass classification tasks, particularly in the fields of information retrieval, recommender systems and object detection.

“Recall at Precision” is a way of evaluating a model in terms of its recall (the proportion of actual positives that are correctly identified) at a certain level of precision (the proportion of positive predictions that are correct).

Here are some specific areas where Recall at Precision can be used:

    • Information retrieval and recommendation systems: in these systems, you generally want the items you retrieve to be relevant (high precision) while covering a large proportion of all available relevant items (high recall). For example, in a search engine, you want the first results to be highly relevant (high precision), but you also want most relevant documents to appear somewhere in the results list (high recall).
    • Object detection in images: in this field, object detectors are often evaluated using a precision-recall curve, which shows the detector’s recall at different precision levels.
    • Text classification: in tasks such as spam detection or content moderation, Recall at Precision can help balance the need to catch as much unwanted content as possible (high recall) while avoiding flagging legitimate content as unwanted (high precision).


Calculation

The RecallAtPrecision metric is used to evaluate the performance of classification models. It calculates recall, which is the ratio of true positives to the sum of true positives and false negatives, at a specified precision level. Precision is the ratio of true positives to all positive predictions.
To calculate this metric, a set of thresholds (num_thresholds of them) is used. Each threshold is calculated as t_i = i / (num_thresholds − 1), where i varies from 0 to num_thresholds − 1, and precision and recall are computed at each one.
The metric then returns the highest recall obtained among all thresholds whose precision reaches or exceeds the specified value (precision).
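Written out, the rule above can be restated using the true positive, false positive and false negative counts TP, FP and FN at each threshold (this notation is introduced here for clarity and does not come from the VI itself):

\[
t_i = \frac{i}{N - 1}, \quad i = 0, \dots, N - 1
\]

\[
\mathrm{precision}(t_i) = \frac{TP_i}{TP_i + FP_i}, \qquad \mathrm{recall}(t_i) = \frac{TP_i}{TP_i + FN_i}
\]

\[
\mathrm{recall\_at\_precision}(p) = \max \left\{ \mathrm{recall}(t_i) : \mathrm{precision}(t_i) \ge p \right\}
\]

where N is num_thresholds and p is the requested precision.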

The metric thus captures the trade-off between precision and recall: it reports how much recall the model can achieve once a minimum precision is enforced.
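To make the procedure concrete, here is a minimal sketch of the calculation in Python with NumPy. It illustrates the rule described above and is not the HAIBAL implementation; the function name and the default of 200 thresholds are assumptions for this example, which expects binary 0/1 labels in y_true and scores in [0, 1] in y_pred.

import numpy as np

def recall_at_precision(y_true, y_pred, precision, num_thresholds=200):
    # Hypothetical helper illustrating the rule above, not the HAIBAL VI.
    y_true = np.asarray(y_true).astype(bool)
    y_pred = np.asarray(y_pred, dtype=float)
    best_recall = 0.0
    # Thresholds t_i = i / (num_thresholds - 1), for i = 0 .. num_thresholds - 1.
    for t in np.linspace(0.0, 1.0, num_thresholds):
        predicted_pos = y_pred >= t
        tp = np.sum(predicted_pos & y_true)
        fp = np.sum(predicted_pos & ~y_true)
        fn = np.sum(~predicted_pos & y_true)
        if tp + fp == 0:
            continue  # no positive predictions: precision is undefined here
        prec = tp / (tp + fp)
        rec = tp / (tp + fn) if (tp + fn) > 0 else 0.0
        # Keep the best recall among thresholds meeting the precision constraint.
        if prec >= precision:
            best_recall = max(best_recall, rec)
    return best_recall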


Example

All these examples are PNG snippets: you can drop a snippet onto the block diagram and the depicted code is added to your VI (do not forget to install the HAIBAL library to run it).
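For readers who want to check a value outside LabVIEW, the Python sketch from the Calculation section can be exercised on a small hand-made example (the numbers are illustrative only):

y_true = [0, 0, 1, 1]
y_pred = [0.1, 0.4, 0.35, 0.8]
print(recall_at_precision(y_true, y_pred, precision=0.5))
# At threshold 0 every sample is predicted positive: precision = 2/4 = 0.5
# and recall = 2/2 = 1.0, so the constraint is met and 1.0 is printed.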
