
LogSoftmax

Description

The operator computes the log of softmax values for the given input : LogSoftmax(input, axis) = Log(Softmax(input, axis=axis))
The “axis” attribute indicates the dimension along which LogSoftmax will be performed. The output tensor has the same shape and contains the LogSoftmax values of the corresponding input.
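
To make the relationship above concrete, here is a minimal numerical sketch in plain Python/NumPy. It is for illustration only and is not the library's LabVIEW API; the log_softmax helper name below is hypothetical.

import numpy as np

def log_softmax(x, axis=0):
    # Subtract the per-slice maximum so exp() cannot overflow (numerically stable).
    shifted = x - np.max(x, axis=axis, keepdims=True)
    # log(softmax(x)) = shifted - log(sum(exp(shifted)))
    return shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))

x = np.array([[1.0, 2.0, 3.0],
              [1.0, 2.0, 3.0]])
out = log_softmax(x, axis=1)    # output has the same shape as the input
print(out)
print(np.exp(out).sum(axis=1))  # exp(LogSoftmax) is the Softmax: each slice sums to 1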

 

Input parameters

 

 specified_outputs_name : array, this parameter lets you manually assign custom names to the output tensors of a node.
 input (heterogeneous) – T : object, the input tensor of rank >= axis.

 Parameters : cluster, groups the following node parameters:

 axis : integer, describes the dimension LogSoftmax will be performed on. Negative value means counting dimensions from the back. Accepted range is [-r, r-1] where r = rank(input) (see the sketch after this parameter list).
Default value “0”.
 training? : boolean, whether the layer is in training mode (data needed for the backward pass can then be stored).
Default value “True”.
 lda coeff : float, coefficient by which the loss derivative is multiplied before being propagated to the previous layer (the backward pass traverses the layers in reverse order).
Default value “1”.

 name (optional) : string, name of the node.
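
As a complement to the axis description above, the sketch below (plain Python using SciPy's log_softmax for convenience, again not the library's LabVIEW API) shows the negative-axis convention: for a rank-r input, axis=-1 addresses the same dimension as axis=r-1.

import numpy as np
from scipy.special import log_softmax

x = np.random.rand(2, 3, 4)   # rank r = 3
r = x.ndim
# A negative axis counts from the back: -1 -> r-1, -2 -> r-2, ...
assert np.allclose(log_softmax(x, axis=-1), log_softmax(x, axis=r - 1))
print("axis=-1 and axis=r-1 give identical results")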

Output parameters

 

 output (heterogeneous) – T : object, the output values with the same shape as the input tensor.

Type Constraints

T in (tensor(bfloat16), tensor(double), tensor(float), tensor(float16)) : Constrain input and output types to float tensors.

Example

All these examples are PNG snippets; you can drop a snippet onto the block diagram and the depicted code will be added to your VI (do not forget to install the Deep Learning library to run it).