
PRelu

Description

PRelu takes input data (Tensor) and a slope tensor as input, and produces one output data (Tensor) where the function f(x) = slope * x for x < 0, f(x) = x for x >= 0, is applied to the data tensor elementwise. This operator supports unidirectional broadcasting (the slope tensor must be unidirectionally broadcastable to the input tensor X); for more details please check Broadcasting in ONNX.
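As an illustration of the function and of the broadcasting rule, here is a minimal NumPy sketch (not the LabVIEW block itself); the shapes and values are arbitrary examples chosen so that slope has a smaller shape than X and is broadcast against it.

import numpy as np

def prelu(x, slope):
    # PRelu: f(x) = slope * x for x < 0, f(x) = x for x >= 0, elementwise;
    # slope is broadcast against x.
    return np.where(x < 0, slope * x, x)

# X has shape (2, 3); slope has the smaller shape (3,) and is
# unidirectionally broadcast to X (one slope value per column).
x = np.array([[-1.0, 0.0, 2.0],
              [-3.0, 4.0, -5.0]], dtype=np.float32)
slope = np.array([0.1, 0.2, 0.3], dtype=np.float32)
print(prelu(x, slope))
# [[-0.1  0.   2. ]
#  [-0.3  4.  -1.5]]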

 

Input parameters

 

specified_outputs_name : array, this parameter lets you manually assign custom names to the output tensors of a node.

 Graphs in : cluster, ONNX model architecture.

X (heterogeneous) – T : object, input tensor.
slope (heterogeneous) – T : object, slope tensor. The shape of slope can be smaller than that of the first input X; if so, its shape must be unidirectionally broadcastable to X.

 Parameters : cluster,

 training? : boolean, whether the layer is in training mode (can store data for backward).
Default value “True”.
 lda coeff : float, defines the coefficient by which the loss derivative will be multiplied before being sent to the previous layer (during the backward run, gradients flow from the last layer back to the first).
Default value “1”.

 name (optional) : string, name of the node.

Output parameters

 

 Y (heterogeneous) – T : object, output tensor (same size as X).

Type Constraints

T in (tensor(bfloat16), tensor(double), tensor(float), tensor(float16), tensor(int32), tensor(int64), tensor(uint32), tensor(uint64)) : Constrain input and output types to float/int tensors.

Example

All these examples are PNG snippets; you can drop a snippet onto the block diagram and the depicted code will be added to your VI (do not forget to install the Deep Learning library to run it).
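For reference outside LabVIEW, the sketch below builds an equivalent standalone ONNX PRelu node in Python and evaluates it with the onnx package's reference evaluator (assumes onnx >= 1.13); it is an illustrative equivalent of the operator, not the VI snippet itself.

import numpy as np
import onnx
from onnx import TensorProto, helper
from onnx.reference import ReferenceEvaluator  # available in onnx >= 1.13

# Minimal graph containing a single PRelu node: Y = PRelu(X, slope).
X = helper.make_tensor_value_info("X", TensorProto.FLOAT, [2, 3])
slope = helper.make_tensor_value_info("slope", TensorProto.FLOAT, [3])
Y = helper.make_tensor_value_info("Y", TensorProto.FLOAT, [2, 3])

node = helper.make_node("PRelu", inputs=["X", "slope"], outputs=["Y"])
graph = helper.make_graph([node], "prelu_example", [X, slope], [Y])
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 16)])
onnx.checker.check_model(model)

# Evaluate: slope has the smaller shape (3,) and is broadcast to X (2, 3).
ref = ReferenceEvaluator(model)
x = np.array([[-1.0, 0.0, 2.0], [-3.0, 4.0, -5.0]], dtype=np.float32)
s = np.array([0.1, 0.2, 0.3], dtype=np.float32)
(y,) = ref.run(None, {"X": x, "slope": s})
print(y)  # [[-0.1  0.   2. ]
          #  [-0.3  4.  -1.5]]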