
Activations summary

ADD TO GRAPH

This section lists all of the add-to-graph activations available. A short code sketch of what these blocks correspond to follows the list.

ELU: Set up and add an ELU layer to the model during the graph definition step.
Exponential: Set up and add an Exponential layer to the model during the graph definition step.
GELU: Set up and add a GELU layer to the model during the graph definition step.
HardSigmoid: Set up and add a HardSigmoid layer to the model during the graph definition step.
LeakyReLU: Set up and add a LeakyReLU layer to the model during the graph definition step.
Linear: Set up and add a Linear layer to the model during the graph definition step.
PReLU: Set up and add a PReLU layer to the model during the graph definition step.
ReLU: Set up and add a ReLU layer to the model during the graph definition step.
SELU: Set up and add a SELU layer to the model during the graph definition step.
Sigmoid: Set up and add a Sigmoid layer to the model during the graph definition step.
SoftMax: Set up and add a SoftMax layer to the model during the graph definition step.
SoftPlus: Set up and add a SoftPlus layer to the model during the graph definition step.
SoftSign: Set up and add a SoftSign layer to the model during the graph definition step.
Swish: Set up and add a Swish layer to the model during the graph definition step.
TanH: Set up and add a TanH layer to the model during the graph definition step.
ThresholdedReLU: Set up and add a ThresholdedReLU layer to the model during the graph definition step.
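
These block names match the standard Keras activation layers, so, as a rough illustration only (plain tensorflow.keras code, not this platform's own interface), an "add to graph" activation amounts to inserting an activation layer while the model graph is being defined:

```python
# Minimal Keras sketch (assumption: the blocks map onto tensorflow.keras layers)
# of add-to-graph activations: layers inserted during graph definition.
import tensorflow as tf
from tensorflow.keras import layers

inputs = tf.keras.Input(shape=(16,))
x = layers.Dense(32)(inputs)
x = layers.ReLU()(x)            # ReLU block
x = layers.Dense(32)(x)
x = layers.LeakyReLU()(x)       # LeakyReLU block (default negative slope)
x = layers.Dense(10)(x)
outputs = layers.Softmax()(x)   # SoftMax block as the output activation
model = tf.keras.Model(inputs, outputs)
model.summary()
```

Each activation sits in the graph as its own layer, so its parameters (for example the slope of LeakyReLU) stay visible and editable in the model definition.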

DEFINE

This section lists all of the define activations available (for use with the TimeDistributed layer). A short code sketch follows the list.

ELU: Define an ELU layer according to its parameters.
Exponential: Define an Exponential layer according to its parameters.
GELU: Define a GELU layer according to its parameters.
HardSigmoid: Define a HardSigmoid layer according to its parameters.
LeakyReLU: Define a LeakyReLU layer according to its parameters.
Linear: Define a Linear layer according to its parameters.
PReLU: Define a PReLU layer according to its parameters.
ReLU: Define a ReLU layer according to its parameters.
SELU: Define a SELU layer according to its parameters.
Sigmoid: Define a Sigmoid layer according to its parameters.
SoftMax: Define a SoftMax layer according to its parameters.
SoftPlus: Define a SoftPlus layer according to its parameters.
SoftSign: Define a SoftSign layer according to its parameters.
Swish: Define a Swish layer according to its parameters.
TanH: Define a TanH layer according to its parameters.
ThresholdedReLU: Define a ThresholdedReLU layer according to its parameters.
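
As a rough illustration only (again plain tensorflow.keras, not this platform's interface), a "define" activation corresponds to creating the activation layer from its parameters first and then handing it to TimeDistributed, which applies it at every timestep:

```python
# Minimal Keras sketch (assumption: the blocks map onto tensorflow.keras layers)
# of a defined activation reused inside a TimeDistributed wrapper.
import tensorflow as tf
from tensorflow.keras import layers

elu = layers.ELU(alpha=1.0)                  # activation defined from its parameters
inputs = tf.keras.Input(shape=(10, 16))      # (timesteps, features)
x = layers.TimeDistributed(layers.Dense(32))(inputs)
outputs = layers.TimeDistributed(elu)(x)     # pre-defined activation applied per timestep
model = tf.keras.Model(inputs, outputs)
model.summary()
```

The difference from the add-to-graph case is only where the layer is used: here the activation is defined on its own so another layer (TimeDistributed) can take it as a parameter, rather than being placed directly into the graph.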