
AlphaDropout

Description

Defines the alpha dropout layer according to its parameters. Intended to be used inside the TimeDistributed layer. Type : polymorphic.

 

Input parameters

 

 parameters : layer parameters.

rate : float, drop probability (as with Dropout). The multiplicative noise has standard deviation sqrt(rate / (1 - rate)). A short sketch of the dropout mechanics follows this parameter list.
Default value “0”.
training? : boolean, whether the layer is in training mode (stores the data needed for the backward pass).
Default value “True”.
lda_coeff : float, coefficient by which the loss derivative is multiplied before being passed to the previous layer (the backward pass traverses the graph in reverse order).
Default value “1”.
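
To make the rate and training? parameters more concrete, here is a minimal Python/NumPy sketch of alpha dropout as described in the original SELU paper (Klambauer et al., 2017): dropped units are set to the negative saturation value of SELU, and an affine correction keeps the mean and variance of the input unchanged. This is only an illustration, not the HAIBAL VI itself.

```python
import numpy as np

def alpha_dropout(x, rate=0.0, training=True, rng=None):
    """Illustrative alpha dropout (not the HAIBAL implementation).

    Dropped units are set to the SELU negative saturation value; the affine
    correction a * x + b keeps the mean and variance of the input unchanged."""
    if not training or rate == 0.0:
        return x                                          # inference mode: identity
    rng = np.random.default_rng() if rng is None else rng
    alpha_p = -1.6732632423543772 * 1.0507009873554805    # SELU negative saturation value
    keep = rng.random(x.shape) >= rate                     # True where the unit is kept
    a = ((1.0 - rate) * (1.0 + rate * alpha_p ** 2)) ** -0.5
    b = -a * alpha_p * rate
    return a * np.where(keep, x, alpha_p) + b

x = np.random.default_rng(0).standard_normal(10_000).astype(np.float32)
y = alpha_dropout(x, rate=0.2, rng=np.random.default_rng(1))
print(x.mean(), x.std())   # approximately 0 and 1
print(y.mean(), y.std())   # still approximately 0 and 1: statistics are preserved
```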

 

Output parameters

 

AlphaDropout out : alpha dropout layer architecture.

Example

All these examples are PNG snippets; you can drop them onto the block diagram and the depicted code will be added to your VI (do not forget to install the HAIBAL library to run them).

AlphaDropout layer inside TimeDistributed layer

1 – Generate a set of data

We generate an array of single-precision data with shape [batch_size = 10, time = 6, input_dim = 5].
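
The snippet itself is LabVIEW (HAIBAL) code; as a purely illustrative Python/NumPy equivalent of this step (not the library's API), the data could be generated as follows.

```python
import numpy as np

# Step 1, illustrative only: random single-precision data (LabVIEW "single" == float32)
# with shape [batch_size = 10, time = 6, input_dim = 5].
rng = np.random.default_rng(0)
data = rng.standard_normal((10, 6, 5)).astype(np.float32)
print(data.shape)   # (10, 6, 5)
```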

2 – Define graph

First, we define the first layer of the graph, an Input layer (explicit input layer method). This layer is set up for an input array shaped [time = 6, input_dim = 5].
Then, we add the TimeDistributed layer to the graph and set it up with an AlphaDropout layer using the define method.
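
Again, the real graph is built from HAIBAL VIs in LabVIEW. As a rough Python illustration, reusing the alpha_dropout sketch from the Input parameters section above, this step amounts to applying alpha dropout independently to each of the 6 time steps (the time_distributed helper below is hypothetical, not a HAIBAL function).

```python
import numpy as np

# Step 2, illustrative only: Input [time = 6, input_dim = 5] -> TimeDistributed(AlphaDropout).
# `alpha_dropout` is the sketch defined after the Input parameters section above.
def time_distributed(layer_fn):
    """Apply `layer_fn` independently to every time step of a [batch, time, features] array."""
    def wrapped(x, **kwargs):
        return np.stack([layer_fn(x[:, t, :], **kwargs) for t in range(x.shape[1])], axis=1)
    return wrapped

graph = time_distributed(alpha_dropout)   # the wrapped layer runs once per time step
```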

3 – Run graph

We call the forward method and retrieve the result with the “Prediction 3D” method.
This method returns two variables: the first is the layer information (a cluster composed of the layer name, the graph index and the shape of the output layer), and the second is the prediction, with a shape of [batch_size, time, input_dim].
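
Continuing the illustrative Python sketches above (reusing data from step 1 and graph from step 2), the forward call and the two outputs of “Prediction 3D” can be mimicked as shown below; the layer-information values printed are placeholders, not what the HAIBAL VI actually returns.

```python
# Step 3, illustrative only: forward pass in training mode.
prediction = graph(data, rate=0.2, training=True, rng=np.random.default_rng(42))

# Rough analogue of the two "Prediction 3D" outputs: layer information and the prediction.
layer_info = {"name": "AlphaDropout", "graph index": 1, "shape": prediction.shape}  # placeholder values
print(layer_info)
print(prediction.shape)   # (10, 6, 5) == [batch_size, time, input_dim]
```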

 
