# chainer.functions.log_softmax

chainer.functions.log_softmax(x, axis=1)

Channel-wise log-softmax function.

This function computes the logarithm of the softmax along the given axis (the second axis, axis=1, by default). Let $$c = (c_1, c_2, \dots, c_D)$$ be a slice of x along that axis. For each slice $$c$$, it computes the logarithm of the function $$f(c)$$ defined as

$$f(c) = \frac{\exp(c)}{\sum_{d} \exp(c_d)}.$$

This function is mathematically equivalent to log(softmax(x)) but is numerically more stable.
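As a rough sketch of how such a computation can be made stable, the per-slice maximum can be subtracted before exponentiating (the log-sum-exp trick). The helper log_softmax_ref below is written purely for illustration and is not Chainer's actual implementation:

>>> import numpy as np
>>> import chainer.functions as F
>>> def log_softmax_ref(x, axis=1):
...     # Subtract the per-slice maximum so that exp() cannot overflow,
...     # then use log softmax(x) = x - max - log(sum(exp(x - max))).
...     m = x.max(axis=axis, keepdims=True)
...     return x - m - np.log(np.exp(x - m).sum(axis=axis, keepdims=True))
...
>>> c = np.array([[0., 1., 2.]], np.float32)
>>> np.allclose(log_softmax_ref(c), F.log_softmax(c).array)
True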

Note

log(softmax(x)) may cause underflow when elements of x are very small, because softmax(x) may return 0. The log_softmax method is numerically more stable.
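To make this concrete (an illustrative sketch with assumed values, reusing the np and F aliases from the sketch above), a very negative logit drives softmax to exactly zero in float32, so its logarithm becomes -inf, while log_softmax stays finite:

>>> small = np.array([[0., -1000.]], np.float32)
>>> bool(np.isinf(F.log(F.softmax(small)).array).any())
True
>>> bool(np.isinf(F.log_softmax(small).array).any())
False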

Parameters:

- x (Variable or numpy.ndarray or cupy.ndarray) – Input variable. An $$n$$-dimensional ($$n \geq 2$$) float array.
- axis (int) – The axis along which the softmax is to be computed.

Returns: Output variable. An $$n$$-dimensional ($$n \geq 2$$) float array with the same shape as x.

Return type: Variable

Example

>>> import numpy as np
>>> import chainer.functions as F
>>> x = np.array([[0, 1, 2], [0, 2, 4]], np.float32)
>>> x
array([[0., 1., 2.],
       [0., 2., 4.]], dtype=float32)
>>> F.log_softmax(x).array
array([[-2.407606  , -1.4076059 , -0.4076059 ],
       [-4.1429315 , -2.1429315 , -0.14293146]], dtype=float32)
>>> np.allclose(F.log_softmax(x).data, F.log(F.softmax(x)).data)
True
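
As a further usage sketch (not part of the original example, reusing the same setup), the axis argument selects which axis is normalized; applying log_softmax along the last axis of an all-zero array yields log(1/4) in every position, since all logits in each slice are equal:

>>> x3 = np.zeros((2, 3, 4), np.float32)
>>> F.log_softmax(x3, axis=2).shape
(2, 3, 4)
>>> np.allclose(F.log_softmax(x3, axis=2).array, np.log(1 / 4))
True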