chainer.Optimizer

class chainer.Optimizer[source]

Base class of all numerical optimizers.

This class provides basic features for all optimization methods. It optimizes parameters of a target link. The target link is registered via the setup() method, and then the update() method updates its parameters based on a given loss function.

Each optimizer implementation must be defined as a child class of Optimizer. It must override the update() method.

If the optimizer is based on a single gradient computation (like most first-order methods), it should inherit from GradientMethod, which adds features dedicated to first-order methods, including support for UpdateRule.

An Optimizer instance also supports hook functions. A hook function is registered via the add_hook() method. Each hook function is called in registration order, either before or after the actual parameter update (configurable). If the hook function has an attribute call_for_each_param whose value is True, the hook function is used as a hook function of all update rules (i.e., it is invoked for every parameter by passing the corresponding update rule and the parameter).
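
For illustration, here is a minimal sketch of a per-parameter hook. The class name ZeroSmallGrads and the threshold are hypothetical; the (rule, param) call signature is the one described above:

    import numpy as np
    from chainer import links as L
    from chainer import optimizers

    class ZeroSmallGrads(object):
        # Hypothetical hook: zeroes out tiny gradient entries, per parameter.
        name = 'ZeroSmallGrads'
        call_for_each_param = True  # invoked as hook(update_rule, param)
        timing = 'pre'              # run before the parameter update

        def __call__(self, rule, param):
            if param.grad is not None:
                param.grad[np.abs(param.grad) < 1e-8] = 0.0

    model = L.Linear(3, 2)
    optimizer = optimizers.SGD(lr=0.01).setup(model)
    optimizer.add_hook(ZeroSmallGrads())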

Variables
  • ~Optimizer.target – Target link object. It is set by the setup() method.

  • ~Optimizer.t – Number of update steps. It must be incremented by the update() method.

  • ~Optimizer.epoch – Current epoch. It is incremented by the new_epoch() method.

  • ~Optimizer.use_auto_new_epoch – Boolean flag to indicate if new_epoch() will be called by the updater. Updater should set this flag to True if it automatically calls new_epoch().

Methods

add_hook(hook, name=None, timing='auto')[source]

Registers a hook function.

The hook function is typically called right after the gradient computation, though the exact timing depends on the optimization method and on the timing attribute.

Parameters
  • hook (callable) – Hook function. If hook.call_for_each_param is True, the hook is called for each parameter, receiving the update rule and the parameter. Otherwise, it is called only once per iteration, receiving the optimizer.

  • name (str) – Name of the registration. If omitted, hook.name is used by default.

  • timing (str) – Specifies when the hook is called. If ‘auto’, the timing attribute of the hook decides the timing. If ‘pre’, the hook will be called before any updates. If ‘post’, the hook will be called after any updates.
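
As a usage sketch, reusing the optimizer from the hook example above (this assumes Chainer v5+, where the built-in hooks live in chainer.optimizer_hooks; in earlier versions they were in chainer.optimizer):

    from chainer import optimizer_hooks

    # Register weight decay under the name 'wd', forced to run before updates.
    optimizer.add_hook(optimizer_hooks.WeightDecay(1e-4), name='wd', timing='pre')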

call_hook(hook)[source]
call_hooks(timing='pre')[source]

Invokes hook functions in registration order.

check_nan_in_grads()[source]

Checks if there is a NaN in the gradients when dynamic loss scaling is used.

is_safe_to_update()[source]
loss_scaling(interval=1000, scale=None)[source]

Configures the loss scaling algorithm.

Parameters
  • interval (int) – Number of iterations until the scaling factor gets doubled. This is effective only when “dynamic” loss scaling is used.

  • scale (float) – Loss scaling factor. If None, “dynamic” loss scaling is used, otherwise “static” loss scaling is used.
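
A brief usage sketch, reusing the optimizer from the earlier example (the values are illustrative):

    # Dynamic loss scaling: the factor doubles every 1000 safe iterations.
    optimizer.loss_scaling(interval=1000, scale=None)

    # Static loss scaling with a fixed factor:
    # optimizer.loss_scaling(scale=128.0)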

new_epoch(auto=False)[source]

Starts a new epoch.

This method increments the epoch count. Note that if the optimizer depends on the epoch count, the user should call this method at the beginning of each epoch.

Parameters

auto (bool) – Should be True if this method is called by an updater. In this case, use_auto_new_epoch should be set to True by the updater.
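
For example, a hand-written training loop (without an updater) would call it like this (train_batches is a hypothetical iterable of minibatches):

    for epoch in range(10):
        for batch in train_batches:
            ...  # compute the loss and call optimizer.update()
        optimizer.new_epoch()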

remove_hook(name)[source]

Removes a hook function.

Parameters

name (str) – Registered name of the hook function to remove.
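
For example, to remove the hook registered as 'wd' in the add_hook() sketch above:

    optimizer.remove_hook('wd')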

serialize(serializer)[source]

Serializes or deserializes the optimizer.

It only saves or loads the following things:

  • Optimizer states

  • Global states (t and epoch)

It does not save or load the parameters of the target link. They should be saved or loaded separately.

Parameters

serializer (AbstractSerializer) – Serializer or deserializer object.
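
In practice this method is usually invoked indirectly through the chainer.serializers helpers rather than called directly. A sketch, reusing the model and optimizer from the earlier example:

    from chainer import serializers

    # Save the optimizer state (t, epoch, per-parameter states) ...
    serializers.save_npz('optimizer.npz', optimizer)
    # ... and the target link's parameters separately.
    serializers.save_npz('model.npz', model)

    # Restore both later.
    serializers.load_npz('model.npz', model)
    serializers.load_npz('optimizer.npz', optimizer)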

set_loss_scale(loss_scale)[source]

Sets the loss scaling factor.

setup(link)[source]

Sets a target link and initializes the optimizer states.

The given link is set to the target attribute. This method also prepares the optimizer state dictionaries corresponding to all parameters in the link hierarchy. Any existing states are discarded.

Parameters

link (Link) – Target link object.

Returns

The optimizer instance.

Note

As of v4.0.0, this function returns the optimizer instance itself so that you can instantiate and setup the optimizer in one line, e.g., optimizer = SomeOptimizer().setup(link).

update(lossfun=None, *args, **kwds)[source]

Updates the parameters.

This method updates the parameters of the target link. Its behavior differs depending on whether lossfun is given or not.

If lossfun is given, this method typically clears the gradients, calls the loss function with the given extra arguments, and calls the backward() method of its output to compute the gradients. The actual implementation might call lossfun more than once.

If lossfun is not given, this method assumes that the gradients of all parameters are already computed. An implementation that requires multiple gradient computations might raise an error in this case.

In both cases, this method invokes the update procedure for all parameters.

Parameters
  • lossfun (callable) – Loss function. You can specify one of the built-in loss functions, or your own loss function. It should not be a loss function with parameters (i.e., a Link instance). The function must accept arbitrary arguments and return one Variable object that represents the loss (or objective) value. The returned value must be a Variable derived from the input Variable objects. lossfun can be omitted for single-gradient-based methods; in this case, this method assumes that the gradient arrays are already computed.

  • args – Arguments for the loss function.

  • kwds – Keyword arguments for the loss function.
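
A sketch of both modes, reusing the model and optimizer from the hook example above (the loss function and data are illustrative):

    import numpy as np
    import chainer.functions as F

    x = np.random.rand(4, 3).astype(np.float32)
    t = np.random.randint(0, 2, size=4).astype(np.int32)

    def lossfun(x, t):
        return F.softmax_cross_entropy(model(x), t)

    # Mode 1: pass the loss function; update() clears the gradients,
    # runs backward, and updates the parameters.
    optimizer.update(lossfun, x, t)

    # Mode 2: compute the gradients yourself, then call update() with no arguments.
    model.cleargrads()
    loss = lossfun(x, t)
    loss.backward()
    optimizer.update()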

update_loss_scale()[source]
__eq__(value, /)

Return self==value.

__ne__(value, /)

Return self!=value.

__lt__(value, /)

Return self<value.

__le__(value, /)

Return self<=value.

__gt__(value, /)

Return self>value.

__ge__(value, /)

Return self>=value.

Attributes

epoch = 0
t = 0
target = None
use_auto_new_epoch = False