mxnet.npx.activation

activation(data, act_type='relu', **kwargs)

Applies an activation function element-wise to the input.
The following activation functions are supported:
- log_sigmoid: \(y = log(\frac{1}{1 + exp(-x)})\)
- mish: \(y = x * tanh(log(1 + exp(x)))\)
- relu: Rectified Linear Unit, \(y = max(x, 0)\)
- sigmoid: \(y = \frac{1}{1 + exp(-x)}\)
- tanh: Hyperbolic tangent, \(y = \frac{exp(x) - exp(-x)}{exp(x) + exp(-x)}\)
- softrelu: Soft ReLU, or SoftPlus, \(y = log(1 + exp(x))\)
- softsign: \(y = \frac{x}{1 + abs(x)}\)
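
Example

A minimal usage sketch, assuming MXNet 2.x where the np/npx namespaces are available; the specific input values are illustrative:

from mxnet import np, npx

npx.set_np()  # enable NumPy-compatible array semantics

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])

out_relu = npx.activation(x, act_type='relu')          # max(x, 0) element-wise
out_sigmoid = npx.activation(x, act_type='sigmoid')    # 1 / (1 + exp(-x)) element-wise
out_softrelu = npx.activation(x, act_type='softrelu')  # log(1 + exp(x)) element-wise

Each call returns a new array with the same shape as the input, with the chosen activation applied to every element.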