Welcome to our Support Center

Activations summary

ADD TO GRAPH

In this section you’ll find a list of all the “add to graph” activations available.

  ICON  SUMMARY
ELU Set up and add an ELU layer to the model during the graph definition step.
Exponential Set up and add an Exponential layer to the model during the graph definition step.
GELU Set up and add a GELU layer to the model during the graph definition step.
HardSigmoid Set up and add a Hard Sigmoid layer to the model during the graph definition step.
LeakyReLU Set up and add a Leaky ReLU layer to the model during the graph definition step.
Linear Set up and add a Linear layer to the model during the graph definition step.
PReLU Set up and add a PReLU layer to the model during the graph definition step.
ReLU Set up and add a ReLU layer to the model during the graph definition step.
SELU Set up and add a SELU layer to the model during the graph definition step.
Sigmoid Set up and add a Sigmoid layer to the model during the graph definition step.
SoftMax Set up and add a SoftMax layer to the model during the graph definition step.
SoftPlus Set up and add a SoftPlus layer to the model during the graph definition step.
SoftSign Set up and add a SoftSign layer to the model during the graph definition step.
Swish Set up and add a Swish layer to the model during the graph definition step.
TanH Set up and add a TanH layer to the model during the graph definition step.
ThresholdedReLU Set up and add a Thresholded ReLU layer to the model during the graph definition step.
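As a quick reference, the activations listed above follow standard textbook formulas. The sketch below reproduces a few of them in plain Python; these are the common mathematical definitions, not code taken from the tool itself, and parameter defaults such as `alpha` are illustrative only.

```python
import math

def relu(x):
    # Passes positive inputs through, zeroes out negatives
    return max(0.0, x)

def leaky_relu(x, alpha=0.3):
    # Like ReLU, but negatives are scaled by a small slope alpha
    return x if x >= 0 else alpha * x

def elu(x, alpha=1.0):
    # Smoothly saturates toward -alpha for large negative inputs
    return x if x >= 0 else alpha * (math.exp(x) - 1.0)

def sigmoid(x):
    # Squashes any real input into the interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def swish(x):
    # x scaled by its own sigmoid
    return x * sigmoid(x)

def softmax(xs):
    # Normalizes a vector into a probability distribution;
    # subtracting the max improves numerical stability
    m = max(xs)
    exps = [math.exp(v - m) for v in xs]
    s = sum(exps)
    return [e / s for e in exps]
```

For example, `softmax([1.0, 2.0, 3.0])` returns three positive values that sum to 1, with the largest weight on the last element.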

DEFINE

In this section you’ll find a list of all the “define” activations available (for use with the TimeDistributed layer).

  ICON  SUMMARY
ELU Define the ELU layer according to its parameters.
Exponential Define the Exponential layer according to its parameters.
GELU Define the GELU layer according to its parameters.
HardSigmoid Define the Hard Sigmoid layer according to its parameters.
LeakyReLU Define the Leaky ReLU layer according to its parameters.
Linear Define the Linear layer according to its parameters.
PReLU Define the PReLU layer according to its parameters.
ReLU Define the ReLU layer according to its parameters.
SELU Define the SELU layer according to its parameters.
Sigmoid Define the Sigmoid layer according to its parameters.
SoftMax Define the SoftMax layer according to its parameters.
SoftPlus Define the SoftPlus layer according to its parameters.
SoftSign Define the SoftSign layer according to its parameters.
Swish Define the Swish layer according to its parameters.
TanH Define the TanH layer according to its parameters.
ThresholdedReLU Define the Thresholded ReLU layer according to its parameters.
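The define-then-wrap pattern described above can be illustrated in plain Python: an activation is first “defined” from its parameters, and a TimeDistributed-style wrapper then applies that same defined function independently at every time step of a sequence. This is a conceptual sketch only; the function names (`make_leaky_relu`, `time_distributed`) are hypothetical and not the tool’s actual API.

```python
def make_leaky_relu(alpha=0.3):
    # "Define" step: bundle the parameter with the formula,
    # returning the configured activation as a callable
    def activation(x):
        return x if x >= 0 else alpha * x
    return activation

def time_distributed(fn, sequence):
    # Apply the same defined activation to each time step in turn
    # (sequence has shape: time steps x features)
    return [[fn(v) for v in step] for step in sequence]

act = make_leaky_relu(alpha=0.1)      # define the activation once
seq = [[-1.0, 2.0], [3.0, -4.0]]      # two time steps, two features each
out = time_distributed(act, seq)       # same activation reused per step
```

Defining the activation once and reusing it across time steps guarantees every step is transformed with identical parameters.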