


Computes the crossentropy metric between the labels and predictions. Type : polymorphic.



Input parameters


y_pred : array, predicted values (if from_logits = true, y_pred contains logits; otherwise probabilities between 0 and 1).
y_true : array, true values (binary numerical labels [ 0 ], [ 1 ]).
from_logits : boolean, whether y_pred is expected to be a logits tensor.


Output parameters


binary_crossentropy : float, result.

Use cases

The binary crossentropy metric is a loss function used in machine learning, particularly in binary classification problems. Cross entropy measures the performance of a classification model whose outputs are probabilities between 0 and 1.
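To make the metric concrete, here is a minimal Python sketch of the underlying formula (illustrative only, not the HAIBAL implementation): the metric averages the per-sample term -(y·log(p) + (1-y)·log(1-p)) over all predictions.

```python
import math

def binary_crossentropy(y_true, y_pred, eps=1e-7):
    """Mean binary crossentropy over probability predictions in (0, 1)."""
    total = 0.0
    for y, p in zip(y_true, y_pred):
        p = min(max(p, eps), 1.0 - eps)  # clip to avoid log(0)
        total += -(y * math.log(p) + (1.0 - y) * math.log(1.0 - p))
    return total / len(y_true)

print(binary_crossentropy([0, 1, 0, 0], [0.6, 0.3, 0.2, 0.8]))  # ≈ 0.9882
```

The clipping constant eps is a common numerical safeguard (probabilities of exactly 0 or 1 would make the logarithm diverge); the exact value used internally by the library may differ.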

It is commonly used in many fields, including:

    • Recommendation systems: to predict whether a user will like a product or not.
    • Fraud detection: to predict whether a transaction is fraudulent or not.
    • Medicine: to predict whether a patient has a disease or not.



This is the crossentropy metric class to be used when there are only two label classes (0 and 1).
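When from_logits = true, the metric is expected to squash y_pred through a sigmoid before applying the crossentropy formula, so the two input modes are equivalent once the logits are converted. A hedged Python sketch of that equivalence (function names here are illustrative, not the HAIBAL API):

```python
import math

def sigmoid(x):
    """Map a raw logit to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def bce_from_probs(y_true, probs, eps=1e-7):
    """Binary crossentropy on probabilities in (0, 1)."""
    total = 0.0
    for y, p in zip(y_true, probs):
        p = min(max(p, eps), 1.0 - eps)
        total += -(y * math.log(p) + (1.0 - y) * math.log(1.0 - p))
    return total / len(y_true)

def bce_from_logits(y_true, logits):
    """Same metric, but inputs are raw logits: apply the sigmoid first."""
    return bce_from_probs(y_true, [sigmoid(z) for z in logits])

y_true = [0, 1, 1]
logits = [-1.2, 0.5, 2.0]
probs = [sigmoid(z) for z in logits]
# Both call paths give the same value once the logits are converted.
print(abs(bce_from_logits(y_true, logits) - bce_from_probs(y_true, probs)) < 1e-12)  # True
```

This is why the from_logits input matters: passing logits with from_logits = false (or probabilities with from_logits = true) silently computes the wrong value rather than raising an error.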


All the examples below are PNG snippets; you can drop them onto your block diagram to add the depicted code to your VI (do not forget to install the HAIBAL library to run them).

Easy to use from logits

Easy to use from probabilities
