Optimizers
- class chainer.optimizers.AdaDelta(rho=0.95, eps=1e-06)
  Zeiler's ADADELTA.
  See: http://www.matthewzeiler.com/pubs/googleTR2012/googleTR2012.pdf
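A minimal scalar sketch of the ADADELTA rule from Zeiler's paper (for illustration only; the function name and loop are not Chainer's implementation, which applies the same rule elementwise to parameter arrays):

```python
import math

def adadelta_update(x, grad, msg, msdx, rho=0.95, eps=1e-06):
    # msg:  running average of squared gradients, E[g^2]
    # msdx: running average of squared updates,   E[dx^2]
    msg = rho * msg + (1 - rho) * grad ** 2
    dx = -math.sqrt((msdx + eps) / (msg + eps)) * grad
    msdx = rho * msdx + (1 - rho) * dx ** 2
    return x + dx, msg, msdx

# Drive f(x) = x^2 (gradient 2x) toward its minimum at 0.
x, msg, msdx = 1.0, 0.0, 0.0
for _ in range(50):
    x, msg, msdx = adadelta_update(x, 2 * x, msg, msdx)
```

Note that ADADELTA has no learning-rate parameter: the ratio of the two running averages sets the step scale automatically, which is why the constructor takes only `rho` and `eps`.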
- class chainer.optimizers.Adam(alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-08)
  Adam optimization algorithm.
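The Adam rule keeps bias-corrected running estimates of the gradient's first and second moments. A scalar sketch with the constructor's default hyperparameters (illustrative only, not Chainer's kernel):

```python
import math

def adam_update(x, grad, m, v, t, alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-08):
    m = beta1 * m + (1 - beta1) * grad          # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2     # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                # bias correction (t starts at 1)
    v_hat = v / (1 - beta2 ** t)
    return x - alpha * m_hat / (math.sqrt(v_hat) + eps), m, v

# Minimize f(x) = x^2 (gradient 2x); steps are roughly alpha in size
# while the gradient direction stays consistent.
x, m, v = 1.0, 0.0, 0.0
for t in range(1, 2001):
    x, m, v = adam_update(x, 2 * x, m, v, t)
```

`alpha` is the step size; `beta1` and `beta2` are the exponential decay rates of the two moment estimates, and `eps` guards the division.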
- class chainer.optimizers.NesterovAG(lr=0.01, momentum=0.9)
  Nesterov's Accelerated Gradient.
  The update is reformulated so that each step is a linear combination of the current velocity and gradient contributions, avoiding an explicit lookahead gradient evaluation.
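A scalar sketch of one such reformulation of NAG, where the lookahead is folded into fixed coefficients on the velocity and the gradient at the current point (the exact coefficients below are an assumption about this variant, not a copy of Chainer's kernel):

```python
def nesterov_update(x, grad, v, lr=0.01, momentum=0.9):
    # Velocity update, then a parameter step whose coefficients fold the
    # usual lookahead gradient into the current velocity and gradient.
    v = momentum * v - lr * grad
    x += momentum * momentum * v - (1 + momentum) * lr * grad
    return x, v

# Minimize f(x) = x^2 (gradient 2x); momentum accelerates the descent.
x, v = 1.0, 0.0
for _ in range(200):
    x, v = nesterov_update(x, 2 * x, v)
```

The practical benefit of the reformulation is that the gradient is only ever needed at the stored parameter value, so the optimizer plugs into the same gradient-computation loop as plain momentum SGD.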
- class chainer.optimizers.RMSpropGraves(lr=0.0001, alpha=0.95, momentum=0.9, eps=0.0001)
  Alex Graves's RMSprop.
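Graves's RMSprop variant tracks running averages of both the gradient and its square, normalizes by an estimate of the gradient's variance, and adds momentum. A scalar sketch with the constructor's defaults (illustrative, not Chainer's implementation):

```python
import math

def rmsprop_graves_update(x, grad, n, g, delta,
                          lr=0.0001, alpha=0.95, momentum=0.9, eps=0.0001):
    n = alpha * n + (1 - alpha) * grad ** 2   # mean of squared gradients
    g = alpha * g + (1 - alpha) * grad        # mean gradient
    # n - g**2 estimates the gradient variance, so steps shrink where
    # gradients are noisy and grow where they are consistent.
    delta = momentum * delta - lr * grad / math.sqrt(n - g ** 2 + eps)
    return x + delta, n, g, delta

# Minimize f(x) = x^2 (gradient 2x).
x, n, g, delta = 1.0, 0.0, 0.0, 0.0
for _ in range(500):
    x, n, g, delta = rmsprop_graves_update(x, 2 * x, n, g, delta)
```

Compared to plain RMSprop, subtracting the squared mean gradient (`g ** 2`) centers the second-moment estimate, and the `delta` term adds classical momentum on top of the normalized step.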