KLDivergence

Description

Computes the Kullback-Leibler divergence loss between y_true and y_pred. Type: polymorphic.

Input parameters

Parameters : cluster

reduction : enum, type of reduction to apply to the loss. In almost all cases this should be "Sum over Batch".
sample weights : boolean, if enabled, adds an input for weighting each sample individually.

Output parameters

Loss : cluster, this cluster defines the loss function used for model training.

enum : enum, an enumeration indicating the loss type (e.g., MSE, CrossEntropy, etc.). If enum is set to CustomLoss, the custom class on the right will be used as the loss function. Otherwise, the selected loss will be applied with its default configuration.
 Class : object, a custom loss class instance.

Required data

 y_pred : array, predicted values.
 y_true : array, true values.

Use cases

KLDivergence loss, or Kullback-Leibler divergence, is a loss function that measures how one probability distribution differs from a reference distribution. It is often used with probabilistic models such as variational autoencoders and language models, where the goal is to align a predicted distribution with a target distribution.

This function is particularly useful for problems like data compression, improving generative models, or optimizing dimensionality reduction techniques.
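As an illustration of the underlying formula only (not this library's internal implementation, which runs inside your VI), the divergence D(y_true ∥ y_pred) = Σ y_true · log(y_true / y_pred) can be sketched in Python as follows; the clipping epsilon is an assumption added here to avoid log(0):

```python
import numpy as np

def kl_divergence(y_true, y_pred, eps=1e-7):
    """Kullback-Leibler divergence D(y_true || y_pred).

    Both inputs are treated as probability distributions; values are
    clipped to [eps, 1] before taking the logarithm (an assumption made
    for numerical safety, mirroring common framework behavior).
    """
    y_true = np.clip(np.asarray(y_true, dtype=float), eps, 1.0)
    y_pred = np.clip(np.asarray(y_pred, dtype=float), eps, 1.0)
    return float(np.sum(y_true * np.log(y_true / y_pred)))

# Identical distributions yield zero divergence
print(kl_divergence([0.5, 0.5], [0.5, 0.5]))

# A mismatched prediction yields a positive divergence
print(kl_divergence([1.0, 0.0], [0.5, 0.5]))
```

Note that the divergence is asymmetric: D(p ∥ q) generally differs from D(q ∥ p), which is why the order of y_true and y_pred matters.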

Example

All these examples are provided as PNG snippets: you can drop a snippet onto the block diagram and the depicted code will be added to your VI (do not forget to install the Deep Learning library to run it).
