mxnet.npx.activation

activation(data, act_type='relu', **kwargs)

Applies an activation function element-wise to the input.

The following activation functions are supported:

  • log_sigmoid: \(y = \log\left(\frac{1}{1 + \exp(-x)}\right)\)

  • mish: \(y = x \cdot \tanh(\log(1 + \exp(x)))\)

  • relu: Rectified Linear Unit, \(y = \max(x, 0)\)

  • sigmoid: \(y = \frac{1}{1 + \exp(-x)}\)

  • tanh: Hyperbolic tangent, \(y = \frac{\exp(x) - \exp(-x)}{\exp(x) + \exp(-x)}\)

  • softrelu: Soft ReLU, also known as SoftPlus, \(y = \log(1 + \exp(x))\)

  • softsign: \(y = \frac{x}{1 + |x|}\)
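
For instance, each activation is selected by name through act_type. The following is a minimal usage sketch, not taken from the original reference; it assumes the MXNet 2.x np/npx interface with npx.set_np() enabled:

    from mxnet import np, npx

    npx.set_np()  # enable NumPy-compatible array semantics

    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])

    # Apply several supported activations to the same input.
    for act in ('relu', 'sigmoid', 'tanh', 'softrelu', 'softsign'):
        print(act, npx.activation(x, act_type=act))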

Parameters
  • data (NDArray) – The input array.

  • act_type ({'log_sigmoid', 'mish', 'relu', 'sigmoid', 'softrelu', 'softsign', 'tanh'}, required) – Activation function to be applied.

Returns

out – The result of applying the chosen activation function element-wise to data; it has the same shape as the input.

Return type

NDArray or list of NDArrays
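
Example

The sketch below is not part of the original reference; it assumes npx.set_np() has been called, and cross-checks two activations against their closed-form definitions listed above:

    from mxnet import np, npx

    npx.set_np()

    x = np.array([-1.0, 0.0, 1.0])

    # softrelu (SoftPlus): y = log(1 + exp(x))
    y = npx.activation(x, act_type='softrelu')
    print(np.abs(y - np.log(1 + np.exp(x))).max())  # ~0 up to float rounding

    # softsign: y = x / (1 + |x|)
    y = npx.activation(x, act_type='softsign')
    print(np.abs(y - x / (1 + np.abs(x))).max())  # ~0 up to float rounding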