WordConvEmbedding

Description

The WordConvEmbedding node takes a batch of word sequences and embeds each word into a vector.

 

Input parameters

 

specified_outputs_name : array, this parameter lets you manually assign custom names to the output tensors of the node.

 Graphs in : cluster, ONNX model architecture.

Sequence (heterogeneous) – T : object, specifies the batch of word sequences to embed.
W (heterogeneous) – T1 : object, specifies the convolution weights.
B (heterogeneous) – T1 : object, specifies the convolution bias.
C (heterogeneous) – T1 : object, specifies the character embedding vectors.

 Parameters : cluster,

char_embedding_size : integer, the embedding vector size for each character. If not provided, the size of the char embedding vectors (C) is used.
Default value “0”.
conv_window_size : integer, this operator applies convolution to each word from left to right, with a window equal to conv_window_size and a stride of 1. Taking the word ‘example’ with conv_window_size equal to 2, convolution is applied to [ex], [xa], [am], [mp]… If not provided, the first dimension of the conv kernel shape is used.
Default value “0”.
embedding_size : integer, the embedding vector size for each word. If not provided, the filter size of the conv weight (W) is used.
Default value “0”.
 training? : boolean, whether B should be transposed on its last two dimensions before the multiplication.
Default value “True”.
 lda coeff : float, defines the coefficient by which the loss derivative is multiplied before being sent to the previous layer (since the backward pass runs in reverse).
Default value “1”.

 name (optional) : string, name of the node.
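To make the roles of Sequence, W, B, C and the parameters above concrete, here is a minimal NumPy sketch of the general character-convolution word-embedding technique for a single word. It is illustrative only: the function name, shapes, activation (tanh) and max-pooling step are assumptions about the usual technique, and the node's exact memory layout may differ.

```python
import numpy as np

def word_conv_embed(char_ids, C, W, B, conv_window_size):
    # Hypothetical shapes (assumptions, not the node's exact layout):
    #   char_ids : (word_len,) int32 ids of the word's characters
    #   C        : (vocab_chars, char_embedding_size) char embeddings
    #   W        : (embedding_size, conv_window_size * char_embedding_size)
    #   B        : (embedding_size,) conv bias
    chars = C[char_ids]                        # (word_len, char_embedding_size)
    windows = [
        chars[i:i + conv_window_size].ravel()  # slide window left to right, stride 1
        for i in range(len(char_ids) - conv_window_size + 1)
    ]
    conv = np.stack(windows) @ W.T + B         # (num_windows, embedding_size)
    conv = np.tanh(conv)                       # assumed activation
    return conv.max(axis=0)                    # max-pool over positions -> word vector

rng = np.random.default_rng(0)
C = rng.normal(size=(128, 4))    # char_embedding_size = 4
W = rng.normal(size=(8, 2 * 4))  # embedding_size = 8, conv_window_size = 2
B = np.zeros(8)
word = np.frombuffer(b"example", dtype=np.uint8).astype(np.int32)
vec = word_conv_embed(word, C, W, B, conv_window_size=2)
print(vec.shape)  # (8,)
```

With conv_window_size = 2, the word ‘example’ yields the windows [ex], [xa], [am], [mp], [pl], [le] exactly as described above; each window is convolved with W, and the per-position results are pooled into one fixed-size vector per word regardless of word length.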

Output parameters

 

Y (heterogeneous) – T1 : object, output tensor of word embeddings.

Type Constraints

T in (tensor(int32)) : Constrain to tensor(int32).

T1 in (tensor(float)) : Constrain to tensor(float).

Example

All these examples are PNG snippets: you can drop a snippet onto the block diagram and the depicted code is added to your VI (do not forget to install the Deep Learning library to run it).