
KLDivergence

Description

Computes the Kullback-Leibler divergence loss between y_true and y_pred. Type : polymorphic.
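
For reference, the per-sample value typically computed by a Kullback-Leibler divergence loss, assuming y_true and y_pred hold discrete probability distributions, is

$$\mathrm{loss} = \sum_{i} y_{\mathrm{true},i}\,\log\!\left(\frac{y_{\mathrm{true},i}}{y_{\mathrm{pred},i}}\right)$$

This is the standard definition; details such as how zero probabilities are handled are implementation-specific.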

Input parameters

KLDivergence in : class
reduction : enum, type of reduction to apply to the loss. In almost all cases this should be “Sum over Batch”.

Output parameters

KLDivergence out : class

Required data

 y_pred : array, predicted values.
 y_true : array, true values.

Use cases

KLDivergence loss, or Kullback-Leibler divergence, is a loss function used to measure how one probability distribution differs from a reference distribution. It is often used in tasks involving probabilistic models such as variational autoencoders and language models, where the goal is to align a predicted distribution with a target distribution.

This function is particularly useful for problems like data compression, improving generative models, or optimizing dimensionality reduction techniques.

Example

All these examples are PNG snippets: you can drop a snippet onto the block diagram and the depicted code will be added to your VI (do not forget to install the HAIBAL library to run it).
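
As a complement to the LabVIEW snippets, the following is a minimal NumPy sketch of the same computation, assuming a Keras-style definition with a "Sum over Batch" reduction (the per-sample losses are averaged over the batch). The function name kl_divergence_loss and the eps clipping are illustrative assumptions, not part of the HAIBAL API.

import numpy as np

def kl_divergence_loss(y_true, y_pred, eps=1e-7):
    # Clip to avoid log(0); both inputs are expected to be probability distributions.
    y_true = np.clip(np.asarray(y_true, dtype=np.float64), eps, 1.0)
    y_pred = np.clip(np.asarray(y_pred, dtype=np.float64), eps, 1.0)
    # Per-sample divergence: sum_i y_true[i] * log(y_true[i] / y_pred[i])
    per_sample = np.sum(y_true * np.log(y_true / y_pred), axis=-1)
    # "Sum over Batch" reduction: average the per-sample losses over the batch.
    return per_sample.mean()

y_true = np.array([[0.0, 1.0], [1.0, 0.0]])
y_pred = np.array([[0.4, 0.6], [0.8, 0.2]])
print(kl_divergence_loss(y_true, y_pred))  # approximately 0.367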
