Summary
Description
Returns the summary of the model. The information can be retrieved as a cluster array or as text and saved to a file. For each layer it includes the layer type and name, the input and output shapes, and the predecessors and successors.

Input parameters
Model in : model architecture.
file_type : enum, type of the file on which the summary is written.
- None : returns the summary only in a cluster array.
- txt : returns the summary in a text file and cluster array. (default)
- csv : returns the summary in a comma-separated values (csv) file and cluster array.
summary_mode : enum, display mode of the summary.
- Simple : displays simple information (index, layer, input shape, output shape, predecessors, successors, parameters).
- Advanced : displays advanced information (index, layer, input shape, output shape, predecessors, successors, parameters, data format, init weight, weights shape).
- Complete : displays complete information (index, layer, input shape, output shape, predecessors, successors, parameters, data format, init weight, weights shape, boolean “training?”, boolean “update?”, boolean “store?”, loss derivative attenuation).
Output parameters
Model out : model architecture.
Summary : cluster array, one element per layer with the following fields.
Layer : string, type and name of the layer.
Input Shape : integer array, input size of the layer.
Output Shape : integer array, output size of the layer.
Successors : string array, the layer(s) connected to the output of the current layer.
Predecessors : string array, the layer(s) connected to the input of the current layer.
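For readers more comfortable with a textual notation, here is a minimal, hypothetical sketch of what one element of the Summary output carries in Simple mode. It is written in Python purely for illustration; the field names mirror the list above, and it is not the actual LabVIEW cluster definition.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical illustration of one element of the Summary array (Simple mode);
# the field names mirror the output parameters listed above.
@dataclass
class LayerSummary:
    layer: str               # type and name of the layer, e.g. "Dense (Dense1)"
    input_shape: List[int]   # input size of the layer
    output_shape: List[int]  # output size of the layer
    successors: List[str]    # layer(s) connected to the output of this layer
    predecessors: List[str]  # layer(s) connected to the input of this layer

# The full summary is then an array (list) of such records, one per layer.
ModelSummary = List[LayerSummary]
```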

Example
All these examples are PNG snippets: drop a snippet onto your block diagram and the depicted code is added to your VI (do not forget to install the HAIBAL library to run it).
Simple summary of the model in a text file

1 – Define Graph
We define the graph with one input and two Dense layers named Dense1 and Dense2.
2 – Summary
We use the “Summary” function to write the model information to a text file.
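Since the snippet itself is an image, here is an analogous text-based sketch of the same idea using Keras in Python. It is only an illustration of the concept, not the HAIBAL API: it builds a graph with one input and two Dense layers named Dense1 and Dense2, then writes the summary to a text file (the shapes and unit counts are arbitrary placeholders).

```python
import tensorflow as tf

# Graph with one input and two Dense layers named Dense1 and Dense2
# (the input shape and unit counts are arbitrary placeholders).
inputs = tf.keras.Input(shape=(16,), name="Input")
x = tf.keras.layers.Dense(32, activation="relu", name="Dense1")(inputs)
outputs = tf.keras.layers.Dense(10, name="Dense2")(x)
model = tf.keras.Model(inputs, outputs, name="example_model")

# Write the summary to a text file, similar to calling Summary with file_type = txt.
with open("summary.txt", "w") as f:
    model.summary(print_fn=lambda line, **kwargs: f.write(line + "\n"))
```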