
Adam

Description

Optimizer that implements the Adam algorithm. Adam is a stochastic gradient descent method based on adaptive estimation of the first-order and second-order moments of the gradients. Type : polymorphic.
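For reference, the textbook Adam update at step t can be written as follows (the formulation from the original Adam paper by Kingma and Ba); g_t is the gradient, theta the trained parameters, and lr, beta 1, beta 2 and epsilon correspond to the elements of the Parameters cluster described below. The node's internal implementation may differ in detail depending on the selected adam mode.

m_t = \beta_1 m_{t-1} + (1 - \beta_1)\, g_t
v_t = \beta_2 v_{t-1} + (1 - \beta_2)\, g_t^2
\hat{m}_t = m_t / (1 - \beta_1^t), \qquad \hat{v}_t = v_t / (1 - \beta_2^t)
\theta_t = \theta_{t-1} - \mathrm{lr} \cdot \hat{m}_t / (\sqrt{\hat{v}_t} + \epsilon)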

 

Input parameters

 

Model in : model architecture.

Parameters : cluster grouping the optimizer settings listed below (a Python configuration sketch follows the list).

learning rate : float, the learning rate.
beta 1 : float, the exponential decay rate for the 1st moment estimates.
beta 2 : float, the exponential decay rate for the 2nd moment estimates.
weight_decay : float, weight decay coefficient; if non-zero, weight decay is applied.
epsilon : float, a small constant for numerical stability.
adam mode : enum, compatibility mode that selects between different implementations of Adam depending on the framework. Options : PyTorch or HuggingFace.
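To illustrate how the cluster elements relate to the two adam mode options, here is a rough Python sketch using PyTorch. It is an illustration only, not code produced by this node, and the variable names are hypothetical; which framework-specific details (decoupled weight decay, bias correction, etc.) the node reproduces in each mode is not specified here.

import torch

# Hypothetical values mirroring the elements of the Parameters cluster above.
learning_rate, beta_1, beta_2 = 1e-3, 0.9, 0.999
epsilon, weight_decay = 1e-8, 0.0

model = torch.nn.Linear(10, 1)  # placeholder model for the example

# Rough analogue of adam mode = PyTorch: classic Adam, where weight decay
# is applied as an L2 penalty added to the gradient.
adam_pytorch = torch.optim.Adam(
    model.parameters(),
    lr=learning_rate,
    betas=(beta_1, beta_2),
    eps=epsilon,
    weight_decay=weight_decay,
)

# Rough analogue of adam mode = HuggingFace: an AdamW-style optimizer with
# decoupled weight decay, as commonly used in HuggingFace training code.
adam_huggingface = torch.optim.AdamW(
    model.parameters(),
    lr=learning_rate,
    betas=(beta_1, beta_2),
    eps=epsilon,
    weight_decay=weight_decay,
)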

Output parameters

 

Model out : model architecture.

Example

All these examples are PNG snippets: drop a snippet onto the block diagram and the depicted code is added to your VI (do not forget to install the Deep Learning library to run it).