chainer.training.ParallelUpdater

class chainer.training.ParallelUpdater(iterator, optimizer, converter=<function concat_examples>, models=None, devices=None, loss_func=None)[source]

Implementation of a parallel GPU Updater.

This is an implementation of Updater that uses multiple GPUs. It behaves similarly to StandardUpdater. The update routine is modified to support data-parallel computation on multiple GPUs in one machine. It is based on synchronous parallel SGD: it parallelizes the gradient computation over a mini-batch, and updates the parameters only in the main device.

Parameters:
- iterator – Dataset iterator for the training dataset. It can also be a dictionary that maps strings to iterators. If this is just an iterator, then the iterator is registered by the name 'main'.
- optimizer – Optimizer to update parameters. It can also be a dictionary that maps strings to optimizers. If this is just an optimizer, then the optimizer is registered by the name 'main'.
- converter – Converter function to build input arrays. Each batch extracted by the main iterator is split equally between the devices and then passed with the corresponding device option to this function. concat_examples() is used by default.
- models – Dictionary of models. The main model should be the same model attached to the 'main' optimizer.
- devices – Dictionary of devices to which the training data is sent. The devices should be arranged in a dictionary with the same structure as models.
- loss_func – Loss function. The model is used as a loss function by default.
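The synchronous parallel SGD routine described above can be sketched in plain NumPy. This is an illustrative toy, not Chainer code: the "devices" are just names, and the linear least-squares model and `parallel_update` helper are invented for the example. The point is the structure of one step: the mini-batch is split equally between devices, per-device gradients are computed independently, then accumulated and applied only to the main copy of the parameters.

```python
import numpy as np

def parallel_update(params, batch, devices, lr=0.1):
    """One synchronous data-parallel SGD step (toy sketch, not Chainer API).

    The mini-batch is split equally between `devices`; each shard's gradient
    is computed independently, then summed, and the update is applied only
    to the single (main-device) copy of the parameters.
    """
    x, y = batch
    shards_x = np.array_split(x, len(devices))
    shards_y = np.array_split(y, len(devices))

    grad = np.zeros_like(params)
    for sx, sy in zip(shards_x, shards_y):  # one iteration per "device"
        pred = sx @ params
        # gradient of mean squared error, scaled by the full batch size
        grad += 2.0 * sx.T @ (pred - sy) / len(x)

    return params - lr * grad  # parameters updated only on the main device

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0])
x = rng.normal(size=(64, 2))
y = x @ true_w
w = np.zeros(2)
for _ in range(200):
    w = parallel_update(w, (x, y), devices=('main', 'second'))
print(np.allclose(w, true_w, atol=1e-3))
```

Because the shard gradients are summed before the update, the result after each step is identical to single-device SGD on the full mini-batch; only the gradient computation is parallelized.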
Methods

finalize()[source]

Finalizes the updater object.

This method calls the finalize method of each iterator that this updater has. It is called at the end of training loops.

get_all_optimizers()[source]

Gets a dictionary of all optimizers for this updater.

Returns: Dictionary that maps names to optimizers.
Return type: dict

get_iterator(name)[source]

Gets the dataset iterator of given name.

Parameters: name (str) – Name of the dataset iterator.
Returns: Corresponding dataset iterator.
Return type: Iterator

get_optimizer(name)[source]

Gets the optimizer of given name.

Parameters: name (str) – Name of the optimizer.
Returns: Corresponding optimizer.
Return type: Optimizer
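The getters above rely on the registration rule stated in the parameter list: a bare iterator or optimizer is stored under the name 'main'. A minimal stand-in class (purely illustrative, not Chainer's actual implementation) shows that behavior:

```python
class ToyUpdater:
    """Toy stand-in illustrating name registration in an updater
    (illustrative only; not Chainer's real ParallelUpdater)."""

    def __init__(self, iterator, optimizer):
        # A bare (non-dict) argument is wrapped into a dict keyed by 'main'.
        if not isinstance(iterator, dict):
            iterator = {'main': iterator}
        if not isinstance(optimizer, dict):
            optimizer = {'main': optimizer}
        self._iterators = iterator
        self._optimizers = optimizer

    def get_iterator(self, name):
        return self._iterators[name]

    def get_optimizer(self, name):
        return self._optimizers[name]

    def get_all_optimizers(self):
        # Return a copy so callers cannot mutate internal state.
        return dict(self._optimizers)

updater = ToyUpdater(iterator=iter([1, 2, 3]), optimizer=object())
print(sorted(updater.get_all_optimizers()))  # → ['main']
```

Passing dictionaries instead (e.g. multiple optimizers under distinct names) makes the same getters return the entry for whichever name is requested.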