
initializer

initializer : enum, initializer for the weights.

Initializers allow you to pre-specify an initialization strategy, encoded in the Initializer object, without knowing the shape and dtype of the variable being initialized.

 

Constant

Initializer that generates tensors with constant values.

GlorotNormal

The Glorot normal initializer, also called Xavier normal initializer. Draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(2 / (fan_in + fan_out)) where fan_in is the number of input units in the weight tensor and fan_out is the number of output units in the weight tensor.

GlorotUniform

The Glorot uniform initializer, also called Xavier uniform initializer. Draws samples from a uniform distribution within [-limit, limit], where limit = sqrt(6 / (fan_in + fan_out)) (fan_in is the number of input units in the weight tensor and fan_out is the number of output units).
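
As a rough illustration of the two Glorot rules above, here is a minimal NumPy sketch (not toolkit code) assuming a plain 2D Dense weight whose dimensions are fan_in and fan_out:

    import numpy as np

    def glorot_init(shape, distribution="uniform", rng=np.random.default_rng(0)):
        # For a 2D Dense weight, fan_in and fan_out are simply the two dimensions.
        fan_in, fan_out = shape
        if distribution == "uniform":
            # limit = sqrt(6 / (fan_in + fan_out))
            limit = np.sqrt(6.0 / (fan_in + fan_out))
            return rng.uniform(-limit, limit, size=shape)
        # stddev = sqrt(2 / (fan_in + fan_out)); a plain (untruncated) normal is used
        # here for brevity, whereas the description above specifies a truncated normal.
        stddev = np.sqrt(2.0 / (fan_in + fan_out))
        return rng.normal(0.0, stddev, size=shape)

    w = glorot_init((256, 128))
    print(w.std())  # close to sqrt(2 / (256 + 128)) ≈ 0.072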

HeNormal

Draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(2 / fan_in) where fan_in is the number of input units in the weight tensor.

HeUniform

Draws samples from a uniform distribution within [-limit, limit], where limit = sqrt(6 / fan_in) (fan_in is the number of input units in the weight tensor).
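
A similar sketch for the two He rules above, assuming a hypothetical Conv2D-style kernel laid out as (kernel_height, kernel_width, input_channels, output_channels), so that fan_in is the receptive-field size times the number of input channels (illustrative only, not toolkit code):

    import numpy as np

    def he_normal(kernel_shape, rng=np.random.default_rng(0)):
        # kernel_shape = (kh, kw, in_channels, out_channels)
        kh, kw, c_in, _ = kernel_shape
        fan_in = kh * kw * c_in          # receptive-field size times input channels
        stddev = np.sqrt(2.0 / fan_in)   # He normal rule (untruncated here for brevity)
        return rng.normal(0.0, stddev, size=kernel_shape)

    def he_uniform(kernel_shape, rng=np.random.default_rng(0)):
        kh, kw, c_in, _ = kernel_shape
        fan_in = kh * kw * c_in
        limit = np.sqrt(6.0 / fan_in)    # He uniform rule
        return rng.uniform(-limit, limit, size=kernel_shape)

    w = he_normal((3, 3, 64, 128))
    print(w.std())  # close to sqrt(2 / (3*3*64)) ≈ 0.059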

Identity

Initializer that generates the identity matrix. Only usable for generating 2D matrices.

LecunNormal

Draws samples from a truncated normal distribution centered on 0 with stddev = sqrt(1 / fan_in) where fan_in is the number of input units in the weight tensor.

LecunUniform

Draws samples from a uniform distribution within [-limit, limit], where limit = sqrt(3 / fan_in) (fan_in is the number of input units in the weight tensor).
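
Both LeCun variants target the same variance, 1 / fan_in: a uniform distribution on [-limit, limit] has variance limit^2 / 3, so limit = sqrt(3 / fan_in) matches the normal variant's stddev = sqrt(1 / fan_in). A quick NumPy check (illustrative only):

    import numpy as np

    rng = np.random.default_rng(0)
    fan_in = 512

    # Both LeCun variants target the same variance, 1 / fan_in.
    normal = rng.normal(0.0, np.sqrt(1.0 / fan_in), size=100_000)
    limit = np.sqrt(3.0 / fan_in)               # Var[U(-limit, limit)] = limit**2 / 3
    uniform = rng.uniform(-limit, limit, size=100_000)

    print(normal.var(), uniform.var(), 1.0 / fan_in)  # all close to 1/512 ≈ 0.00195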

Ones

Initializer that generates tensors initialized to 1.

Orthogonal

Initializer that generates an orthogonal matrix. If the shape of the tensor to initialize is two-dimensional, it is initialized with an orthogonal matrix obtained from the QR decomposition of a matrix of random numbers drawn from a normal distribution. If the matrix has fewer rows than columns then the output will have orthogonal rows. Otherwise, the output will have orthogonal columns.
If the shape of the tensor to initialize is more than two-dimensional, a matrix of shape (shape[0] * … * shape[n-2], shape[n-1]) is initialized, where n is the length of the shape vector. The matrix is subsequently reshaped to give a tensor of the desired shape.
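
A minimal NumPy sketch of the QR-based construction described above (not the toolkit's implementation):

    import numpy as np

    def orthogonal_init(shape, rng=np.random.default_rng(0)):
        # Flatten all leading dimensions into rows, keep the last dimension as columns.
        rows, cols = int(np.prod(shape[:-1])), shape[-1]
        flat = rng.normal(size=(max(rows, cols), min(rows, cols)))
        # QR decomposition of a random normal matrix yields an orthonormal Q.
        q, r = np.linalg.qr(flat)
        # Sign correction (a common convention so the result is well distributed).
        q *= np.sign(np.diag(r))
        if rows < cols:
            q = q.T          # fewer rows than columns -> orthogonal rows
        return q[:rows, :cols].reshape(shape)

    w = orthogonal_init((4, 4, 8))      # reshaped from a (16, 8) orthogonal matrix
    flat = w.reshape(16, 8)
    print(np.allclose(flat.T @ flat, np.eye(8)))  # True: columns are orthonormal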

RandomNormal

Draws samples from a normal distribution for the given parameters.

RandomUniform

Draws samples from a uniform distribution for the given parameters.

TruncatedNormal

Initializer that generates a truncated normal distribution. The values generated are similar to values from a RandomNormal initializer except that values more than two standard deviations from the mean are discarded and re-drawn.
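
A minimal NumPy sketch of this discard-and-redraw rule (illustrative, not the toolkit's implementation):

    import numpy as np

    def truncated_normal(shape, mean=0.0, stddev=1.0, rng=np.random.default_rng(0)):
        samples = rng.normal(mean, stddev, size=shape)
        # Redraw any value lying more than two standard deviations from the mean.
        out_of_range = np.abs(samples - mean) > 2.0 * stddev
        while np.any(out_of_range):
            samples[out_of_range] = rng.normal(mean, stddev, size=out_of_range.sum())
            out_of_range = np.abs(samples - mean) > 2.0 * stddev
        return samples

    w = truncated_normal((1000,), stddev=0.05)
    print(np.abs(w).max() <= 0.1)  # True: no sample lies beyond 2 * stddev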

VarianceScaling

Initializer that adapts its scale to the shape of its input tensors. With distribution = "truncated_normal" or "untruncated_normal", samples are drawn from a truncated/untruncated normal distribution with a mean of zero and a standard deviation (after truncation, if used) of stddev = sqrt(scale / n), where n is:

  • the number of input units in the weight tensor, if mode = "fan_in"
  • the number of output units, if mode = "fan_out"
  • the average of the numbers of input and output units, if mode = "fan_avg"

With distribution = "uniform", samples are drawn from a uniform distribution within [-limit, limit], where limit = sqrt(3 * scale / n).
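
A minimal NumPy sketch of this general rule, assuming a 2D weight; the named initializers above (Glorot, He, LeCun) correspond to particular scale/mode/distribution settings. The simple redraw loop here does not rescale to compensate for truncation, so it only approximates the documented post-truncation stddev:

    import numpy as np

    def variance_scaling(shape, scale=1.0, mode="fan_in",
                         distribution="truncated_normal", rng=np.random.default_rng(0)):
        # For a 2D weight; convolution kernels fold the spatial dims into fan_in/fan_out.
        fan_in, fan_out = shape
        n = {"fan_in": fan_in,
             "fan_out": fan_out,
             "fan_avg": (fan_in + fan_out) / 2.0}[mode]
        if distribution == "uniform":
            limit = np.sqrt(3.0 * scale / n)
            return rng.uniform(-limit, limit, size=shape)
        stddev = np.sqrt(scale / n)
        samples = rng.normal(0.0, stddev, size=shape)
        if distribution == "truncated_normal":
            # Redraw values beyond two standard deviations, as for TruncatedNormal.
            bad = np.abs(samples) > 2.0 * stddev
            while np.any(bad):
                samples[bad] = rng.normal(0.0, stddev, size=bad.sum())
                bad = np.abs(samples) > 2.0 * stddev
        return samples

    # Under these conventions: GlorotUniform ≈ scale=1, mode="fan_avg", distribution="uniform";
    # HeNormal ≈ scale=2, mode="fan_in"; LecunNormal ≈ scale=1, mode="fan_in".
    w = variance_scaling((256, 128), scale=2.0, mode="fan_in",
                         distribution="untruncated_normal")
    print(w.std())  # close to sqrt(2 / 256) ≈ 0.088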

Zeros

Initializer that generates tensors initialized to 0.

This parameter is used in the add_to_graph and define VIs of the BatchNormalization, Conv1D, Conv1DTranspose, Conv2D, Conv2DTranspose, Conv3D, Conv3DTranspose, Dense, DepthwiseConv2D, Embedding, GRU, LayerNormalization, LSTM, MultiHeadAttention, SeparableConv1D, SeparableConv2D, SimpleRNN, PReLU, ConvLSTM1DCell, ConvLSTM2DCell, ConvLSTM3DCell, GRUCell, LSTMCell, and SimpleRNNCell layers.
