
    AI::MXNet::Gluon::NN::Activation - Applies an activation function to input.

DESCRIPTION

    Applies an activation function to input.

    Parameters
    ----------
    activation : str
        Name of the activation function to use.
        See the Activation operator in AI::MXNet::NDArray for available
        choices ('relu', 'sigmoid', 'tanh', 'softrelu', 'softsign').

    Input shape:
        Arbitrary.

    Output shape:
        Same shape as input.
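
    A minimal usage sketch (assuming the named-argument constructor and
    the overloaded block-call syntax of the Perl Gluon port; output
    values are illustrative):

        use AI::MXNet qw(mx);
        use AI::MXNet::Gluon::NN qw(nn);

        # Wrap the built-in 'relu' operator in a layer object
        my $act = nn->Activation(activation => 'relu');
        $act->initialize();

        my $x = mx->nd->array([[-2, -1, 0, 1, 2]]);
        my $y = $act->($x);      # output has the same shape as $x
        print $y->aspdl;         # [[0 0 0 1 2]]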

    AI::MXNet::Gluon::NN::LeakyReLU - Leaky version of a Rectified Linear Unit.

DESCRIPTION

    Leaky version of a Rectified Linear Unit.

    It allows a small gradient when the unit is not active:
    f(x) = alpha * x for x < 0, and f(x) = x for x >= 0.

    Parameters
    ----------
    alpha : float
        Slope coefficient for the negative half axis. Must be >= 0.
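
    A short sketch (alpha => 0.1 is an illustrative value, not a
    default):

        use AI::MXNet qw(mx);
        use AI::MXNet::Gluon::NN qw(nn);

        # Negative inputs are scaled by alpha instead of being zeroed
        my $leaky = nn->LeakyReLU(alpha => 0.1);
        $leaky->initialize();

        my $x = mx->nd->array([[-10, -1, 0, 1]]);
        print $leaky->($x)->aspdl;   # [[-1 -0.1 0 1]]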

    AI::MXNet::Gluon::NN::PReLU - Parametric leaky version of a Rectified Linear Unit.

DESCRIPTION

    Parametric leaky version of a Rectified Linear Unit.
    https://arxiv.org/abs/1502.01852

    The slope of the negative half axis is learned during training
    rather than fixed: f(x) = alpha * x for x < 0, where alpha is a
    learnable parameter.

    Parameters
    ----------
    alpha_initializer : Initializer
        Initializer for the learnable alpha (negative-slope) parameter.
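
    A sketch of overriding the alpha initializer (Constant(0.25)
    mirrors the Python Gluon default; the mx->init->Constant(value =>
    ...) spelling is an assumption about the Perl port):

        use AI::MXNet qw(mx);
        use AI::MXNet::Gluon::NN qw(nn);

        # alpha starts at 0.25 and is then updated by the optimizer
        my $prelu = nn->PReLU(
            alpha_initializer => mx->init->Constant(value => 0.25)
        );
        $prelu->initialize();

        my $x = mx->nd->array([[-4, 0, 4]]);
        print $prelu->($x)->aspdl;   # [[-1 0 4]]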

    AI::MXNet::Gluon::NN::ELU - Exponential Linear Unit (ELU)

DESCRIPTION

    Exponential Linear Unit (ELU)
        "Fast and Accurate Deep Network Learning by Exponential Linear Units", Clevert et al, 2016
        https://arxiv.org/abs/1511.07289
        Published as a conference paper at ICLR 2016

    Parameters
    ----------
    alpha : float
        The alpha parameter as described by Clevert et al., 2016:
        f(x) = alpha * (exp(x) - 1) for x < 0, and f(x) = x for x >= 0.
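
    A short sketch showing the saturation of negative inputs
    (alpha => 1 is an illustrative choice):

        use AI::MXNet qw(mx);
        use AI::MXNet::Gluon::NN qw(nn);

        # f(x) = alpha * (exp(x) - 1) for x < 0, f(x) = x otherwise
        my $elu = nn->ELU(alpha => 1.0);
        $elu->initialize();

        my $x = mx->nd->array([[-1, 0, 1]]);
        print $elu->($x)->aspdl;     # approximately [[-0.6321 0 1]]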

    AI::MXNet::Gluon::NN::SELU - Scaled Exponential Linear Unit (SELU)

DESCRIPTION

    Scaled Exponential Linear Unit (SELU)
    "Self-Normalizing Neural Networks", Klambauer et al., 2017
    https://arxiv.org/abs/1706.02515

    SELU is an ELU scaled by fixed constants (scale ~= 1.0507,
    alpha ~= 1.6733) chosen so that activations are pushed toward zero
    mean and unit variance. It takes no parameters.
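
    A sketch of SELU inside a small Sequential stack (layer sizes are
    illustrative):

        use AI::MXNet qw(mx);
        use AI::MXNet::Gluon::NN qw(nn);

        my $net = nn->Sequential();
        $net->add(nn->Dense(64));
        $net->add(nn->SELU());       # SELU takes no parameters
        $net->initialize();

        my $x = mx->nd->ones([2, 32]);
        my $y = $net->($x);
        print join(',', @{ $y->shape }), "\n";   # 2,64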

    AI::MXNet::Gluon::NN::Swish - Swish Activation function

DESCRIPTION

    Swish activation function, from "Searching for Activation
    Functions" (Ramachandran et al., 2017).
        https://arxiv.org/pdf/1710.05941.pdf

    Parameters
    ----------
    beta : float
        The beta coefficient in swish(x) = x * sigmoid(beta * x).
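
    A short sketch (beta => 1 recovers the SiLU form x * sigmoid(x);
    output values are illustrative):

        use AI::MXNet qw(mx);
        use AI::MXNet::Gluon::NN qw(nn);

        # swish(x) = x * sigmoid(beta * x)
        my $swish = nn->Swish(beta => 1.0);
        $swish->initialize();

        my $x = mx->nd->array([[-1, 0, 1]]);
        print $swish->($x)->aspdl;   # approximately [[-0.2689 0 0.7311]]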