chainer.optimizer_hooks.GradientNoise

class chainer.optimizer_hooks.GradientNoise(eta, noise_func=<function exponential_decay_noise>)[source]

Optimizer/UpdateRule hook function for adding gradient noise.

This hook function simply adds noise generated by noise_func to the gradient. By default, it adds time-dependent annealed Gaussian noise to the gradient at every training step:

\[g_t \leftarrow g_t + N(0, \sigma_t^2)\]

where

\[\sigma_t^2 = \frac{\eta}{(1+t)^\gamma}\]

with \(\eta\) selected from {0.01, 0.3, 1.0} and \(\gamma = 0.55\).
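For illustration, a minimal usage sketch follows. Only add_hook and GradientNoise are taken from the API documented here; the model, optimizer, and hyperparameter values are arbitrary placeholders.

    import chainer
    from chainer import optimizer_hooks, optimizers

    model = chainer.links.Linear(10, 2)   # placeholder model
    optimizer = optimizers.SGD(lr=0.01)   # any optimizer works; SGD is arbitrary
    optimizer.setup(model)

    # Register the hook: noise drawn by noise_func is added to every
    # parameter gradient before each update (timing defaults to 'pre').
    optimizer.add_hook(optimizer_hooks.GradientNoise(eta=0.01))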

Parameters
  • eta (float) – Parameter \(\eta\) that controls the scale of the noise; with the default noise function, the noise variance is \(\sigma_t^2 = \eta / (1+t)^{0.55}\).

  • noise_func (function) – Noise generating function. The default, exponential_decay_noise, produces the time-dependent annealed Gaussian noise described above (a sketch of a custom function follows the Variables list below).

Variables
  • ~optimizer_hooks.GradientNoise.timing (string) – Specifies when this hook should be called by the Optimizer/UpdateRule. Valid values are ‘pre’ (before any updates) and ‘post’ (after any updates).

  • ~optimizer_hooks.GradientNoise.call_for_each_param (bool) – Specifies whether this hook is called for each parameter (True) or only once per optimizer to which it is registered (False). Users are not expected to change this value from its default, True.

New in version 4.0.0: The timing parameter.
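
To illustrate the interface a custom noise_func would need to satisfy, here is a minimal sketch modeled on the default exponential_decay_noise. The (xp, shape, dtype, hook, opt) call signature is an assumption inferred from that default rather than a documented contract, so verify it against the implementation before relying on it.

    import numpy as np

    def annealed_gaussian_noise(xp, shape, dtype, hook, opt):
        # Hypothetical noise function mirroring the default behaviour:
        # sigma_t^2 = eta / (1 + t)^0.55, with t taken from the update rule.
        std = np.sqrt(hook.eta / np.power(1 + opt.t, 0.55))
        return xp.random.normal(0, std, shape).astype(dtype)

    # It could then be passed in place of the default:
    # optimizer.add_hook(
    #     optimizer_hooks.GradientNoise(eta=0.01, noise_func=annealed_gaussian_noise))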

Methods

__call__(rule, param)[source]

Call self as a function.

__eq__(value, /)

Return self==value.

__ne__(value, /)

Return self!=value.

__lt__(value, /)

Return self<value.

__le__(value, /)

Return self<=value.

__gt__(value, /)

Return self>value.

__ge__(value, /)

Return self>=value.

Attributes

call_for_each_param = True
name = 'GradientNoise'
timing = 'pre'