CategoricalHinge

Description

Computes the categorical hinge loss between y_true and y_pred. Type : polymorphic.

Input parameters

Parameters : cluster

reduction : enum, type of reduction to apply to the loss. In almost all cases this should be “Sum over Batch”.
sample weights : boolean, if enabled, adds an input for weighting each sample individually.

Output parameters

Loss : cluster, this cluster defines the loss function used for model training.

enum : enum, an enumeration indicating the loss type (e.g., MSE, CrossEntropy, etc.). If enum is set to CustomLoss, the custom class on the right will be used as the loss function. Otherwise, the selected loss will be applied with its default configuration.
 Class : object, a custom loss class instance.

Required data

 y_pred : array, predicted values (for example class probabilities such as [0.1, 0.3, 0.6] for a 3-class problem).
 y_true : array, true values (for example one-hot encoded, such as [0, 0, 1] for a 3-class problem).
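
As a quick sanity check (assuming the standard definition of the categorical hinge, max(0, 1 + best incorrect-class score - true-class score)), the sample values above give a true-class score of 0.6 and a best incorrect-class score of 0.3, so the loss is max(0, 1 + 0.3 - 0.6) = 0.7.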

Use cases

Categorical hinge loss is primarily used in multiclass classification problems. It is the multiclass adaptation of the hinge loss used for binary classification tasks.

This function compares the score assigned to the correct class with the highest score among the incorrect classes, penalizing predictions whose margin between the two is smaller than 1. It is often used in models that require a decision margin, such as Support Vector Machine (SVM) style classifiers.
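
Since the node itself is graphical, here is a minimal NumPy sketch of the same computation (assuming the standard Keras-style definition of the categorical hinge); it is only an illustration, not the library's actual implementation:

import numpy as np

def categorical_hinge(y_true, y_pred):
    # Categorical hinge: max(0, 1 + best incorrect-class score - true-class score).
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    pos = np.sum(y_true * y_pred, axis=-1)           # score of the true class
    neg = np.max((1.0 - y_true) * y_pred, axis=-1)   # best score among the incorrect classes
    return np.maximum(0.0, neg - pos + 1.0)

# Sample values from "Required data": pos = 0.6, neg = 0.3, prints approximately 0.7
print(categorical_hinge([0, 0, 1], [0.1, 0.3, 0.6]))

The loss is zero as soon as the true class outscores every other class by a margin of at least 1, and it grows linearly as that margin shrinks or reverses.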

Example

All these examples are provided as PNG snippets; you can drop a snippet onto your block diagram to get the depicted code added to your VI (do not forget to install the Deep Learning library to run it).
