
TimeDistributed

Description

Sets up and adds a TimeDistributed layer to the model during the graph definition step. This wrapper applies a given layer to every temporal slice of its input. Type: polymorphic.

 

Input parameters

 

Graph in: model architecture.
layer: layer instance.

 in/out param:

 input_shape: integer array, shape of the input (not including the batch axis). NB: to be used only if this is the first layer of the model.
 output_behavior: enum, specifies whether the layer is an output layer. Default: “Not Output”.

name (optional): string, name of the layer.

 

Output parameters

 

Graph out : model architecture.

Dimension

Input shape

Input tensor of shape (batch, time, …).

 

Output shape

Tensor of shape (batch, time, …). The batch and time axes are preserved; the remaining dimensions are given by the output shape of the wrapped layer (for example, (batch, time, units) when wrapping a Dense layer).
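Because the examples below are LabVIEW snippets, the shape behavior can only be illustrated here with a conceptual Python/NumPy sketch. It is not HAIBAL code: the random weights and the value units = 3 are assumptions chosen for illustration. The point is that the same wrapped layer (here a Dense transform) is applied, with shared weights, to every temporal slice.

import numpy as np

# Conceptual sketch only (not HAIBAL code): TimeDistributed applies the same
# wrapped layer to every temporal slice of the input.
batch, time, input_dim, units = 10, 6, 5, 3        # assumed sizes
x = np.random.rand(batch, time, input_dim).astype(np.float32)

# One set of Dense weights, shared across all time steps.
W = np.random.rand(input_dim, units).astype(np.float32)
b = np.zeros(units, dtype=np.float32)

# Apply the wrapped Dense transform independently to each time step.
y = np.stack([x[:, t, :] @ W + b for t in range(time)], axis=1)
print(y.shape)  # (10, 6, 3): batch and time axes preserved, last axis set by the wrapped layer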

Example

All these examples are PNG snippets: you can drop a snippet onto your block diagram to add the depicted code to your VI (do not forget to install the HAIBAL library to run it).

TimeDistributed layer with explicit input layer

1 – Generate a set of data

We generate an array of data of type single with shape [batch_size = 10, time = 6, input_dim = 5].

2 – Define graph

First, we define the first layer of the graph, which is an Input layer (explicit input layer method). This layer is set up with an input shape of [time = 6, input_dim = 5].
Then, we add the TimeDistributed layer to the graph and set it up with a Dense layer using the define method.

3 – Run graph

We call the forward method and retrieve the result with the “Prediction 3D” method.
This method returns two outputs: the first is the layer information (a cluster composed of the layer name, the graph index and the output shape of the layer), and the second is the prediction, with shape [batch_size, time, units].
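Since the snippet itself is a LabVIEW VI, here is a rough Python/Keras analogue of the same graph, given as a sketch only; the value units = 3 is an assumption and none of the HAIBAL VIs are involved.

import numpy as np
import tensorflow as tf

# 1 - Generate a set of data: shape [batch_size = 10, time = 6, input_dim = 5]
x = np.random.rand(10, 6, 5).astype(np.float32)

# 2 - Define graph: explicit Input layer, then TimeDistributed wrapping a Dense layer
inputs = tf.keras.Input(shape=(6, 5))                      # explicit input layer
outputs = tf.keras.layers.TimeDistributed(
    tf.keras.layers.Dense(3))(inputs)                      # units = 3 is assumed
model = tf.keras.Model(inputs, outputs)

# 3 - Run graph: the prediction has shape [batch_size, time, units]
pred = model.predict(x)
print(pred.shape)  # (10, 6, 3)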

 

TimeDistributed layer with implicit input layer

1 – Generate a set of data

We generate an array of data of type single with shape [batch_size = 10, time = 6, input_dim = 5].

2 – Define graph

First, we define the TimeDistributed layer as the input layer of the graph (implicit input layer method). To do this, we pass an array of shape [time = 6, input_dim = 5] to the “input_shape” variable of the “in/out param” cluster.
An input layer is implicitly created; its name is the name of its parent layer prefixed with “input_”.
Then, we add the TimeDistributed layer to the graph and set it up with a Dense layer using the define method.

3 – Run graph

We call the forward method and retrieve the result with the “Prediction 3D” method.
This method returns two outputs: the first is the layer information (a cluster composed of the layer name, the graph index and the output shape of the layer), and the second is the prediction, with shape [batch_size, time, units].
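As a hedged Python/Keras analogue of the implicit input layer method, the input shape is passed directly to the first layer instead of declaring a separate Input layer (units = 3 is again an assumption, not HAIBAL code).

import numpy as np
import tensorflow as tf

# 1 - Generate a set of data: shape [batch_size = 10, time = 6, input_dim = 5]
x = np.random.rand(10, 6, 5).astype(np.float32)

# 2 - Define graph: no explicit Input layer; input_shape on the first layer
#     implicitly creates one (analogous to the "input_shape" variable of the
#     "in/out param" cluster).
model = tf.keras.Sequential([
    tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(3),
                                    input_shape=(6, 5)),
])

# 3 - Run graph
pred = model.predict(x)
print(pred.shape)  # (10, 6, 3)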

 

TimeDistributed layer, batch and dimension

1 – Generate a set of data

We generate an array of data of type single with shape [number of batches = 9, batch_size = 10, time = 6, input_dim = 5].

2 – Define graph

First, we define the first layer of the graph, which is an Input layer (explicit input layer method). This layer is set up with an input shape of [time = 6, input_dim = 5].
Then, we add the TimeDistributed layer to the graph and set it up with a Dense layer using the define method.

3 – Run graph

We call the forward method and retrieve the result with the “Prediction 3D” method.
This method returns two outputs: the first is the layer information (a cluster composed of the layer name, the graph index and the output shape of the layer), and the second is the prediction, with shape [batch_size, time, units].
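As before, here is a hedged Python/Keras sketch rather than the actual LabVIEW snippet; the extra leading dimension (number of batches = 9) is handled by iterating over the batches and running the forward pass on each one (units = 3 assumed).

import numpy as np
import tensorflow as tf

# 1 - Generate data: shape [number of batches = 9, batch_size = 10, time = 6, input_dim = 5]
data = np.random.rand(9, 10, 6, 5).astype(np.float32)

# 2 - Define graph: explicit Input layer + TimeDistributed(Dense)
inputs = tf.keras.Input(shape=(6, 5))
outputs = tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(3))(inputs)
model = tf.keras.Model(inputs, outputs)

# 3 - Run graph once per batch; each prediction has shape [batch_size, time, units]
for batch in data:                      # batch has shape (10, 6, 5)
    pred = model.predict(batch, verbose=0)
    print(pred.shape)                   # (10, 6, 3)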

 
