Get All Loss Type
Description
Gets the loss information (loss type and axis) for each layer contained in the model.

Input parameters
Model in : model architecture.
Output parameters
Model out : model architecture.

index : integer, index of the layer.
name : string, name of the layer.
output_order : integer, output number.
loss_type : enum, name of the loss used by the layer.
loss_axis : string, the axis on which the loss performs its calculation.
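As a point of reference, here is a minimal Python sketch of the per-layer record these output parameters describe. It is purely illustrative: HAIBAL is used from the LabVIEW block diagram, and the Python names below are assumptions, not part of the library.

from dataclasses import dataclass

@dataclass
class LayerLossInfo:
    # Hypothetical mirror of the cluster returned for each layer.
    index: int         # index of the layer in the model
    name: str          # name of the layer
    output_order: int  # output number
    loss_type: str     # an enum in HAIBAL, e.g. "MeanSquare" or "BinaryCrossentropy"
    loss_axis: str     # axis on which the loss performs its calculation

# "Get All Loss Type" conceptually returns one such record per layer:
# get_all_loss_type(model) -> list[LayerLossInfo]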

Example
All these examples are PNG snippets: you can drop them onto the block diagram and the depicted code will be added to your VI (do not forget to install the HAIBAL library to run them).
Using the “Get All Loss Type” function

1 – Define Graph
We define two graphs, each with one input and two dense layers. In the first graph the dense layers are named Dense1 and Dense2; in the second, Dense3 and Dense4.
2 – Merge Function
We use the “Merge” function to merge the two graphs into a single model.
3 – Set Function
The “Set All Loss Type” function defines the loss applied to each graph: the “MeanSquare” loss on axis 1 for the first graph, and the “BinaryCrossentropy” loss on axis 0 for the second.
4 – Get Function
We use the “Get All Loss Type” function to retrieve all the losses applied to the model, as sketched below.
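The PNG snippet itself cannot be reproduced here, so the following self-contained Python sketch retraces the four steps above. The Graph and merge helpers are hypothetical stand-ins for the HAIBAL VIs, which are wired on a block diagram in the real library.

from dataclasses import dataclass, field

@dataclass
class Layer:
    name: str
    loss_type: str = ""  # filled in later by "Set All Loss Type"
    loss_axis: int = 0

@dataclass
class Graph:
    layers: list = field(default_factory=list)
    def add(self, name):
        self.layers.append(Layer(name))
        return self

def merge(*graphs):
    # Hypothetical "Merge": concatenates several graphs into one model.
    model = Graph()
    for g in graphs:
        model.layers.extend(g.layers)
    return model

# 1 - two graphs, each with one input and two dense layers
g1 = Graph().add("Input1").add("Dense1").add("Dense2")
g2 = Graph().add("Input2").add("Dense3").add("Dense4")

# 2 - merge the two graphs into a single model
model = merge(g1, g2)

# 3 - "Set All Loss Type": MeanSquare on axis 1 for the first graph,
#     BinaryCrossentropy on axis 0 for the second
for layer in g1.layers:
    layer.loss_type, layer.loss_axis = "MeanSquare", 1
for layer in g2.layers:
    layer.loss_type, layer.loss_axis = "BinaryCrossentropy", 0

# 4 - "Get All Loss Type": read back every layer's loss settings
for index, layer in enumerate(model.layers):
    print(index, layer.name, layer.loss_type, layer.loss_axis)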