chainer.force_backprop_mode

chainer.force_backprop_mode()

Make a context manager which enables back-propagation.

When you want to enable back-propagation inside a no_backprop_mode() context, use this context manager. A Variable created in this context always has a computational graph, unless overridden by a deeper context. Using this context manager outside of a no_backprop_mode() context changes nothing.

In the following example, y has a computational graph and calling backward() on y will compute and accumulate the gradients of the variables in the graph, in this case only x.

>>> import numpy as np
>>> import chainer
>>> x = chainer.Variable(np.array([1,], np.float32))
>>> with chainer.no_backprop_mode():
...     with chainer.force_backprop_mode():
...         y = x + 1
>>> y.backward()
>>> x.grad
array([1.], dtype=float32)
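
The override also works the other way around: a no_backprop_mode() nested deeper than force_backprop_mode() wins, so no graph is constructed. A minimal sketch of that case, mirroring the example above (the exact output assumes the default NumPy backend):

>>> x = chainer.Variable(np.array([1,], np.float32))
>>> with chainer.force_backprop_mode():
...     with chainer.no_backprop_mode():
...         y = x + 1
>>> y.backward()
>>> x.grad is None
True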

Note

chainer.force_backprop_mode() implicitly applies ChainerX’s counterpart chainerx.force_backprop_mode(), but not vice versa. Also, setting the enable_backprop configuration does not affect ChainerX.
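
For code that does not use ChainerX, this context manager behaves like temporarily setting the enable_backprop configuration to True via chainer.using_config(). A minimal sketch of that configuration-based equivalent (which, per the note above, would not apply to ChainerX arrays):

>>> x = chainer.Variable(np.array([1,], np.float32))
>>> with chainer.using_config('enable_backprop', False):
...     with chainer.using_config('enable_backprop', True):
...         y = x + 1
>>> y.backward()
>>> x.grad
array([1.], dtype=float32)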

See also

See chainer.no_backprop_mode() for details on the disabled back-propagation mode.