NAME
AI::ActivationFunctions - Activation functions for neural networks in Perl
VERSION
Version 0.01
ABSTRACT
Activation functions for neural networks in Perl
SYNOPSIS
use AI::ActivationFunctions qw(relu prelu sigmoid);
my $result = relu(-5); # returns 0
my $prelu_result = prelu(-2, 0.1); # returns -0.2
# Array version works too
my $array_result = relu([-2, -1, 0, 1, 2]); # returns [0, 0, 0, 1, 2]
DESCRIPTION
This module provides activation functions commonly used in neural networks and machine learning. It covers basic functions such as ReLU and sigmoid, as well as more recent functions such as ELU, Swish, and GELU.
FUNCTIONS
Basic Functions
relu($input)
Rectified Linear Unit. Returns max(0, $input).
prelu($input, $alpha=0.01)
Parametric ReLU. Returns $input if $input > 0, else $alpha * $input.
leaky_relu($input)
Leaky ReLU with alpha=0.01.
sigmoid($input)
Sigmoid function: 1 / (1 + exp(-$input)).
tanh($input)
Hyperbolic tangent function.
softmax(\@array)
Softmax function. Normalizes an array of values into a probability distribution that sums to 1.
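A few more worked calls to the basic functions. The printed values follow from the formulas above; the softmax return type is assumed here to mirror relu's array-reference handling (an arrayref for arrayref input):

use AI::ActivationFunctions qw(relu prelu leaky_relu sigmoid tanh softmax);

print relu(3);            # 3
print leaky_relu(-10);    # -0.1  (alpha = 0.01)
print prelu(-10, 0.2);    # -2
print sigmoid(0);         # 0.5
print tanh(0);            # 0
my $probs = softmax([1, 1, 1]);  # assumed arrayref, roughly [0.333, 0.333, 0.333]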
Advanced Functions
elu($input, $alpha=1.0)
Exponential Linear Unit. Returns $input if $input > 0, else $alpha * (exp($input) - 1).
swish($input)
Swish activation function: $input * sigmoid($input).
gelu($input)
Gaussian Error Linear Unit (used in transformers like BERT, GPT).
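The descriptions above correspond to the standard textbook definitions. The plain-Perl sketch below shows those conventional formulas for reference (the tanh approximation is one common way to compute GELU); the helper names (elu_ref, swish_ref, gelu_ref) are illustrative and not necessarily the module's exact implementation:

use POSIX qw(tanh);

# ELU: identity for positive inputs, alpha * (exp(x) - 1) otherwise
sub elu_ref   { my ($x, $alpha) = @_; $alpha //= 1.0; return $x > 0 ? $x : $alpha * (exp($x) - 1) }

# Swish: x * sigmoid(x)
sub swish_ref { my ($x) = @_; return $x / (1 + exp(-$x)) }

# GELU, tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
sub gelu_ref  {
    my ($x) = @_;
    my $pi = 4 * atan2(1, 1);
    return 0.5 * $x * (1 + tanh(sqrt(2 / $pi) * ($x + 0.044715 * $x**3)));
}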
Derivatives
relu_derivative($input)
Derivative of ReLU for backpropagation.
sigmoid_derivative($input)
Derivative of sigmoid for backpropagation.
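As a quick illustration of how the derivatives slot into backpropagation, here is a sketch of one gradient-descent step for a single sigmoid neuron with squared-error loss. Only the documented functions are used; the variable names and learning rate are made up for the example, and sigmoid_derivative is assumed to take the pre-activation value, as its signature suggests:

use AI::ActivationFunctions qw(sigmoid sigmoid_derivative);

my ($w, $b) = (0.5, 0.0);   # weight and bias
my ($x, $y) = (2.0, 1.0);   # training input and target
my $lr      = 0.1;          # learning rate

my $z    = $w * $x + $b;    # pre-activation
my $pred = sigmoid($z);     # forward pass

# Squared-error loss: 0.5 * (pred - y)^2
# dL/dw = (pred - y) * sigmoid'(z) * x,  dL/db = (pred - y) * sigmoid'(z)
my $delta = ($pred - $y) * sigmoid_derivative($z);
$w -= $lr * $delta * $x;
$b -= $lr * $delta;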
EXPORT
By default nothing is exported. You can export specific functions:
use AI::ActivationFunctions qw(relu prelu); # specific functions
use AI::ActivationFunctions qw(:basic); # basic functions
use AI::ActivationFunctions qw(:all); # all functions
SEE ALSO
PDL - Perl Data Language for numerical computing
AI::TensorFlow::Libtensorflow - Perl bindings for the TensorFlow C library
AI::MXNet - Perl interface to Apache MXNet
AUTHOR
Your Name <your.email@example.com>
LICENSE
This library is free software; you can redistribute it and/or modify it under the same terms as Perl itself.