


Computes the crossentropy metric between the labels and predictions. Type : polymorphic.



Input parameters


y_pred : array, predicted values (if from_logits = true, logits, for example [0.1, 0.8, 0.9]; otherwise probabilities, for example [0.1, 0.3, 0.6] for a 3-class problem).
y_true : array, true values (one-hot encoded, for example [0, 0, 1] for a 3-class problem).
from_logits : boolean, whether y_pred is expected to be a logits tensor.
axis : integer, the dimension along which the crossentropy is computed.


Output parameters


categorical_crossentropy : float, result.
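To make the parameters above concrete, here is a minimal Python sketch of the computation (the HAIBAL VI itself is graphical; this is an illustrative reimplementation, not the library's code). It shows how from_logits changes the interpretation of y_pred: logits are first passed through a softmax, while probabilities are used directly.

```python
import math

def categorical_crossentropy(y_true, y_pred, from_logits=False):
    """Crossentropy between a one-hot y_true and a prediction y_pred."""
    if from_logits:
        # Convert logits to probabilities with a numerically stable softmax.
        m = max(y_pred)
        exps = [math.exp(v - m) for v in y_pred]
        total = sum(exps)
        y_pred = [e / total for e in exps]
    eps = 1e-7  # clip to avoid log(0)
    return -sum(t * math.log(max(p, eps)) for t, p in zip(y_true, y_pred))

# From probabilities: loss is -log(0.6) for the true class.
loss_prob = categorical_crossentropy([0, 0, 1], [0.1, 0.3, 0.6])

# From logits: the same call, but with from_logits=True.
loss_logit = categorical_crossentropy([0, 0, 1], [0.1, 0.8, 0.9], from_logits=True)
```

With probabilities, only the predicted probability of the true class contributes, since all other one-hot entries are zero.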

Use cases

The categorical crossentropy metric is a loss function used for multiclass classification problems in machine learning, where the target variable can take on more than two values. For example, it can be used to train a model to classify animal images into categories (cat, dog, horse, etc.) or text documents into genres (fiction, non-fiction, poetry, etc.).


This crossentropy metric is to be used when there are multiple label classes (two or more). It measures the performance of a classification model whose output is a probability distribution over the classes, with each probability between 0 and 1.
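As a quick illustration of this behavior (again a hedged Python sketch, not the VI itself): a confident, correct prediction yields a loss near 0, while a confident, wrong prediction yields a large loss.

```python
import math

def cce(y_true, y_pred, eps=1e-7):
    # Categorical crossentropy for one-hot y_true and probability y_pred.
    return -sum(t * math.log(max(p, eps)) for t, p in zip(y_true, y_pred))

good = cce([0, 0, 1], [0.01, 0.01, 0.98])  # near-certain and correct -> small loss
bad = cce([0, 0, 1], [0.98, 0.01, 0.01])   # near-certain but wrong -> large loss
```

The asymmetry is what makes the metric useful for training: confidently wrong predictions are penalized far more heavily than hesitant ones.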


All these examples are PNG snippets: you can drop a snippet onto the block diagram and the depicted code will be added to your VI (do not forget to install the HAIBAL library to run it).

Easy to use from logits

Easy to use from probabilities
