class MXNet::Gluon::Trainer

Overview

Applies an Optimizer to a set of Parameters.

Trainer should be used together with Autograd.

Defined in:

mxnet/gluon/trainer.cr

Constructors

Instance Method Summary

Constructor Detail

def self.new(params, optimizer, **optimizer_params) #

Creates a new instance.

Parameters

  • params (ParameterDict) The set of parameters to optimize.
  • optimizer (Optimizer) The optimizer to use.
  • optimizer_params (NamedTuple) Keyword arguments to be passed to the optimizer's constructor. See each Optimizer subclass for the additional arguments it supports, beyond those common to all optimizers.

[View source]
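A minimal construction sketch. The Dense layer, the :sgd optimizer symbol, and the lr: keyword argument are illustrative assumptions for this example, not guaranteed by this page:

```crystal
require "mxnet"

# A small network whose parameters the trainer will optimize
# (hypothetical layer and initialization calls).
net = MXNet::Gluon::NN::Dense.new(1)
net.init

# Pass the parameter set, the optimizer, and any optimizer
# keyword arguments (forwarded to the optimizer's constructor).
trainer = MXNet::Gluon::Trainer.new(
  net.collect_params,
  :sgd,
  lr: 0.01
)
```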

Instance Method Detail

def learning_rate #

[View source]
def step(batch_size) #

Makes one step of parameter update.

This should be called after Autograd#backward and outside of Autograd.record.

Parameters

  • batch_size (Int) Batch size of the data processed. The gradient will be normalized by 1/batch_size. Set this to 1 if you normalized the loss manually with loss = mean(loss).

[View source]
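A hedged sketch of one training iteration. The net, loss_fn, data, and label names are assumptions standing in for a model, a loss function, and a batch of inputs; the gradient computation happens inside Autograd.record, while #step is called outside of it:

```crystal
batch_size = 32

MXNet::Autograd.record do
  # Forward pass and loss are recorded for differentiation.
  output = net.call(data)
  loss = loss_fn.call(output, label)
  # Compute gradients with respect to the parameters.
  loss.backward
end

# Outside of Autograd.record: apply one optimizer update,
# with gradients normalized by 1/batch_size.
trainer.step(batch_size)
```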
def update(batch_size) #

Makes one step of parameter update.

This should be called after Autograd#backward and outside of Autograd.record.

Parameters

  • batch_size (Int) Batch size of the data processed. The gradient will be normalized by 1/batch_size. Set this to 1 if you normalized the loss manually with loss = mean(loss).

[View source]
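A sketch of the manual-normalization case the parameter description mentions: if the loss is already averaged over the batch (here via a hypothetical .mean reduction), pass 1 so the gradients are not rescaled a second time. The net, loss_fn, data, and label names are assumptions:

```crystal
MXNet::Autograd.record do
  # Mean reduction already divides the loss by the batch size.
  loss = loss_fn.call(net.call(data), label).mean
  loss.backward
end

# batch_size = 1: gradients are applied without further scaling.
trainer.update(1)
```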
def weight_decay #

[View source]