mx.symbol.Activation
Description
Applies an activation function element-wise to the input.
The following activation functions are supported:
relu: Rectified Linear Unit, \(y = \max(x, 0)\)
sigmoid: \(y = \frac{1}{1 + \exp(-x)}\)
tanh: Hyperbolic tangent, \(y = \frac{\exp(x) - \exp(-x)}{\exp(x) + \exp(-x)}\)
softrelu: Soft ReLU (SoftPlus), \(y = \log(1 + \exp(x))\)
softsign: \(y = \frac{x}{1 + |x|}\)
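The formulas above can be checked numerically outside of MXNet. A minimal NumPy sketch (illustrative only, not part of the mx.symbol API) implementing each activation element-wise:

```python
import numpy as np

# Element-wise activations as defined above (NumPy sketch, not MXNet code).
def relu(x):
    return np.maximum(x, 0)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def tanh(x):
    return (np.exp(x) - np.exp(-x)) / (np.exp(x) + np.exp(-x))

def softrelu(x):  # also known as SoftPlus
    return np.log(1 + np.exp(x))

def softsign(x):
    return x / (1 + np.abs(x))

x = np.array([-2.0, 0.0, 3.0])
print(relu(x))  # [0. 0. 3.]
```

All five functions map any real input into a bounded or half-bounded range, which is why they are applied element-wise without changing the input's shape.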
Usage
mx.symbol.Activation(...)
Arguments

Argument | Description
---|---
`data` | NDArray-or-Symbol. The input array.
`act.type` | {'relu', 'sigmoid', 'softrelu', 'softsign', 'tanh'}, required. Activation function to be applied.
`name` | string, optional. Name of the resulting symbol.
Value
out
The result `mx.symbol`.
Link to Source Code: http://github.com/apache/incubator-mxnet/blob/1.6.0/src/operator/nn/activation.cc#L168