
AdditiveAttention

Description

Returns the AdditiveAttention layer weights. Type : polymorphic.

 

Input parameters

 

 weights : cluster
   index : integer, index of layer.
   name : string, name of layer.
   weight : variant, weight of layer.

Output parameters

 

 weights_info : cluster
   index : integer, index of layer.
   name : string, name of layer.
   weights : cluster
     scale : array, 1D values. scale = query[2] = value[2] = key[2].

Dimension

  • scale = query[2] = value[2] = key[2]

The size of scale depends on the sizes of the query, value and key inputs of the AdditiveAttention layer.
For example, if query has a size of [batch_size = 5, Tq = 3, dim = 1], value has a size of [batch_size = 10, Tv = 4, dim = 1] and key has a size of [batch_size = 8, Tv = 6, dim = 1], then the size of scale is [dim = 1].
As another example, if query has a size of [batch_size = 10, Tq = 9, dim = 5], value has a size of [batch_size = 15, Tv = 10, dim = 5] and key has a size of [batch_size = 9, Tv = 7, dim = 5], then the size of scale is [dim = 5].
query, value and key always share the same value at index 2 of their size (dim), and this value is the size of scale.
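The same rule can be checked outside LabVIEW with a short Python sketch. This is for illustration only and assumes the scale weight follows the same convention as the Keras AdditiveAttention layer, where the single trainable weight is a 1D vector of length dim; it does not show the HAIBAL API itself.

# Minimal sketch (assumption: the scale weight follows the Keras AdditiveAttention
# convention, i.e. one 1D trainable vector whose length equals dim).
import numpy as np
import tensorflow as tf

batch_size, Tq, Tv, dim = 10, 9, 10, 5

query = np.random.rand(batch_size, Tq, dim).astype("float32")
value = np.random.rand(batch_size, Tv, dim).astype("float32")
key   = np.random.rand(batch_size, Tv, dim).astype("float32")

layer = tf.keras.layers.AdditiveAttention(use_scale=True)
_ = layer([query, value, key])   # calling the layer builds it and creates the scale weight

scale = layer.get_weights()[0]
print(scale.shape)               # (5,) -> the size of scale is [dim = 5]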

Example

All these examples are PNG snippets: you can drop a snippet onto the block diagram and the depicted code is added to your VI (do not forget to install the HAIBAL library to run it).