
LayerNormalization

Description

Defines the weights (gamma and beta) of the LayerNormalization layer selected by its name. Type : polymorphic.

 

Input parameters

 

 Model in : model architecture.
 name : string, name of the layer.
 gamma : array, 1D, scale values. gamma = [input_dim1].
 beta : array, 1D, offset values. beta = [input_dim1].

 

Output parameters

 

 Model out : model architecture.
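
The HAIBAL VI itself is a LabVIEW block, but the following Python sketch (using TensorFlow/Keras purely as an illustration, not part of HAIBAL) shows the equivalent operation: take a model, select a LayerNormalization layer by its name, and define its gamma and beta weights. The model, layer name "layer_norm_1", normalization axis and shapes are hypothetical examples.

import numpy as np
import tensorflow as tf

# Hypothetical model: the LayerNormalization layer is named "layer_norm_1"
# and normalizes along input_dim1 (axis 1); names and shapes are examples only.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(5, 4, 2)),   # [input_dim1, input_dim2, input_dim3]
    tf.keras.layers.LayerNormalization(axis=1, name="layer_norm_1"),
])
_ = model(np.zeros((1, 5, 4, 2), dtype=np.float32))   # run once so the weights exist

input_dim1 = 5
gamma = np.ones(input_dim1, dtype=np.float32)    # scale weights, size [input_dim1]
beta = np.zeros(input_dim1, dtype=np.float32)    # offset weights, size [input_dim1]

# "Model in" -> select the layer by its name -> define gamma and beta -> "Model out"
layer = model.get_layer("layer_norm_1")
layer.gamma.assign(gamma)
layer.beta.assign(beta)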

Dimension

  • gamma = [input_dim1]

The gamma size depends on the size of the input to the LayerNormalization layer.
For example, if the layer input has a size of [batch_size = 10, input_dim1 = 5, input_dim2 = 4, input_dim3 = 2], then gamma will have a size of [input_dim1 = 5].
As another example, if the layer input has a size of [batch_size = 12, input_dim1 = 8, input_dim2 = 5, input_dim3 = 3], then gamma will have a size of [input_dim1 = 8].

 

  • beta = [input_dim1]

The beta size follows the same rule as the gamma size (see the shape sketch below).
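
As a shape illustration only (not HAIBAL code), here is a minimal NumPy sketch of a layer normalization computation for the first example above, showing how gamma and beta of size [input_dim1] are broadcast over the remaining dimensions. The normalization axis and the epsilon value are assumptions made for this sketch.

import numpy as np

# First example from the text: layer input of size
# [batch_size = 10, input_dim1 = 5, input_dim2 = 4, input_dim3 = 2]
x = np.random.randn(10, 5, 4, 2).astype(np.float32)

gamma = np.ones(5, dtype=np.float32)     # scale, shape [input_dim1] = [5]
beta = np.zeros(5, dtype=np.float32)     # offset, shape [input_dim1] = [5]

# Normalize along input_dim1 (assumed axis), then scale and shift.
mean = x.mean(axis=1, keepdims=True)
var = x.var(axis=1, keepdims=True)
x_hat = (x - mean) / np.sqrt(var + 1e-3)          # the 1e-3 epsilon is an assumption

# gamma and beta are reshaped so they broadcast over input_dim2 and input_dim3.
y = gamma.reshape(1, 5, 1, 1) * x_hat + beta.reshape(1, 5, 1, 1)
print(y.shape)   # (10, 5, 4, 2), same size as the input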

Example

All these examples are PNG snippets; you can drop a snippet onto the block diagram to add the depicted code to your VI (do not forget to install the HAIBAL library to run it).
