
LogCosh

Description

Computes the logarithm of the hyperbolic cosine of the prediction error. Type : polymorphic.

Input parameters

Parameters : cluster, this cluster configures the loss function.

reduction : enum, type of reduction to apply to the loss. In almost all cases this should be “Sum over Batch”.
sample weights : boolean, if enabled, adds an input for weighting each sample individually.

Output parameters

Loss : cluster, this cluster defines the loss function used for model training.

enum : enum, an enumeration selecting the loss type (e.g., MSE, CrossEntropy). If it is set to CustomLoss, the custom class supplied in Class is used as the loss function; otherwise, the selected loss is applied with its default configuration.
Class : object, a custom loss class instance, used only when enum is set to CustomLoss.

Required data

 y_pred : array, predicted values.
 y_true : array, true values.

Use cases

LogCosh loss, or logarithmic hyperbolic cosine loss, is a loss function used in regression tasks. It is designed as an alternative to mean squared error (MSE) and mean absolute error (MAE), offering an approach that combines reduced sensitivity to outliers, like MAE, with the smoothness and differentiability of MSE.
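For reference, the standard log-cosh formulation (the usual textbook definition, consistent with the description above; it is not reproduced from this page) over a batch of N samples with a “Sum over Batch” reduction is:

\[ \mathcal{L}(y_{\text{true}}, y_{\text{pred}}) = \sum_{i=1}^{N} \log\!\left(\cosh\left(y_{\text{pred},i} - y_{\text{true},i}\right)\right) \]

Because \(\log(\cosh(x)) \approx x^{2}/2\) for small errors and \(\approx |x| - \log 2\) for large errors, the loss behaves like MSE near zero and like MAE far from zero, which is exactly the trade-off described above.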

This loss function is particularly effective when the data is noisy but free of extreme outliers, making it useful in scenarios where robustness is needed without sacrificing smooth, well-behaved gradients.
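As an illustration only, here is a minimal NumPy sketch of the same computation (this is not the LabVIEW node itself; the function name, example arrays, and the optional sample_weights argument are hypothetical):

import numpy as np

def log_cosh_loss(y_true, y_pred, sample_weights=None):
    # log(cosh(x)) computed as logaddexp(x, -x) - log(2) for numerical stability
    error = np.asarray(y_pred, dtype=float) - np.asarray(y_true, dtype=float)
    per_sample = np.logaddexp(error, -error) - np.log(2.0)
    if sample_weights is not None:
        # optional per-sample weighting, mirroring the "sample weights" input
        per_sample = per_sample * np.asarray(sample_weights, dtype=float)
    return per_sample.sum()  # "Sum over Batch" style reduction

# Hypothetical values
y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.1, 1.8, 3.5])
print(log_cosh_loss(y_true, y_pred))  # small errors, so close to 0.5 * sum(error**2)

For large errors the per-sample term grows roughly linearly, which is why a few bad predictions do not dominate the total loss the way they would with MSE.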

Example

All these examples are PNG snippets: you can drag and drop them onto your block diagram and the depicted code will be added to your VI (do not forget to install the Deep Learning library to run them).
