RemovePadding

Description

Compresses the transformer input by removing padding. Padding is assumed to be on the right side of each sequence. The input tensor has shape (batch_size, sequence_length, hidden_size). The node produces two main outputs: output, with shape (total_tokens, hidden_size), and token_offset, with shape (batch_size, sequence_length). token_offset lists the offsets of all non-padding tokens first, followed by the offsets of all padding tokens; it is a flat list of batch_size * sequence_length elements, reshaped to 2D for convenience of shape inference.
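As a rough illustration of the semantics described above, the compression can be sketched in NumPy. The function name and layout below are illustrative only; the actual node is a LabVIEW block, not a Python API:

```python
import numpy as np

def remove_padding(x, sequence_token_count):
    """Sketch of the RemovePadding semantics (hypothetical helper).

    x: (batch_size, sequence_length, hidden_size), right-padded.
    sequence_token_count: (batch_size,) non-padding tokens per sequence.
    Returns (output, token_offset) as described in the node documentation.
    """
    batch_size, seq_len, hidden = x.shape
    # token_offset: offsets of all non-padding tokens first, then padding tokens.
    non_pad, pad = [], []
    for b in range(batch_size):
        n = int(sequence_token_count[b])
        non_pad.extend(b * seq_len + i for i in range(n))
        pad.extend(b * seq_len + i for i in range(n, seq_len))
    token_offset = np.array(non_pad + pad, dtype=np.int32).reshape(batch_size, seq_len)

    # Gather only the non-padding rows: output has shape (total_tokens, hidden_size).
    total_tokens = int(sequence_token_count.sum())
    flat = x.reshape(batch_size * seq_len, hidden)
    output = flat[token_offset.ravel()[:total_tokens]]
    return output, token_offset
```

For example, with batch_size = 2, sequence_length = 3 and token counts [2, 3], the first sequence contributes offsets 0 and 1, the second contributes 3, 4 and 5, and the single padding position (offset 2) is appended last.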

 

Input parameters

 

specified_outputs_name : array, this parameter lets you manually assign custom names to the output tensors of a node.

 Graphs in : cluster, ONNX model architecture.

input (heterogeneous) – T : object, input tensor with shape (batch_size, sequence_length, hidden_size).
sequence_token_count (heterogeneous) – M : object, number of non-padding tokens in each sequence with shape (batch_size).

 Parameters : cluster,

 training? : boolean, whether the layer is in training mode (can store data for backward).
Default value “True”.
 lda coeff : float, defines the coefficient by which the loss derivative is multiplied before being passed to the previous layer during the backward pass.
Default value “1”.

 name (optional) : string, name of the node.

Output parameters

 Graphs out : cluster, ONNX model architecture.

output (heterogeneous) – T : object, output tensor with shape (total_tokens, hidden_size).
token_offset (heterogeneous) – M : object, offsets of the non-padding tokens followed by those of the padding tokens. Its shape is (batch_size, sequence_length).
cumulated_seq_len (heterogeneous) – M : object, cumulated sequence lengths. Its shape is (batch_size + 1).
max_seq_len (heterogeneous) – M : object, max sequence length without padding. Its shape is (1).
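The two auxiliary outputs can be sketched as follows, under the assumption (not stated explicitly above) that cumulated_seq_len is a prefix sum of sequence_token_count with a leading zero, and max_seq_len is the longest unpadded sequence. Variable names are illustrative:

```python
import numpy as np

# Number of non-padding tokens per sequence (example values).
sequence_token_count = np.array([2, 3, 1], dtype=np.int32)

# Prefix sum with a leading zero: shape (batch_size + 1,).
cumulated_seq_len = np.concatenate(
    [np.zeros(1, dtype=np.int32), np.cumsum(sequence_token_count, dtype=np.int32)]
)

# Longest sequence without padding: shape (1,).
max_seq_len = sequence_token_count.max(keepdims=True)
```

With this layout, cumulated_seq_len[b] and cumulated_seq_len[b + 1] delimit the rows of output belonging to sequence b.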

Type Constraints

T in (tensor(float), tensor(float16)) : Constrain input and output types to float tensors.

M in (tensor(int32)) : Constrain sequence_token_count and token_offset to integer types.

Example

All these examples are PNG snippets; you can drop a snippet onto the block diagram to add the depicted code to your VI (do not forget to install the Deep Learning library to run it).