NAME

Neural::Basic - Copylefted artificial neural networks extension

DESCRIPTION

The Neural module provides access to the GANN artificial neural network library, which was designed especially to work with Perl. This makes it possible to do high-level neural network operations from inside Perl. The interface consists of several classes of Perl objects, described below.

For information about the design, see Math::Gann.

CLASSES

Neural::Pars

Neural::Pars is a convenience class that holds suitably sized vectors for a given network. The most important user-accessible members are 'Wei' and 'DWei', which give the weights and the gradient of the error w.r.t. the weights (when doing backprop), respectively. The other members are usually not important for the user.

Methods:

new(net)

Net must be a reference to a valid Neural::Net.

rand_weights(min,max)

Randomize all the weights in the network. Min and max are optional arguments, defaulting to -0.8 and 0.8.

asstr

Return a string representation of this structure. Useful for debugging.
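
As a quick, hedged illustration using only the methods above (assuming the module loads as use Neural, and that $net is a Neural::Net that has already been built and finalized as described under Neural::Net below):

    # Allocate suitably sized parameter vectors for an existing net.
    my $pars = Neural::Pars->new($net);

    # Randomize the weights; the range here overrides the
    # default of -0.8 .. 0.8.
    $pars->rand_weights(-0.5, 0.5);

    print $pars->asstr;    # dump the structure for debugging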

Neural::Net

Neural::Net is the class that does most of the actual work: it contains the network topology and is used to do forward and backpropagation.

Before the network can be used, its topology has to be constructed by creating the groups of neurons one by one and specifying how each group connects to the others.

Methods:

new(ninputs)

Creates a new, empty net.

new_load(file,args)

Loads a network from a file. The file is actually Perl code that creates the network; args are passed verbatim to the function define_network in the file, which should return a reference to the network.
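
A hedged sketch of what such a file might look like; the group-creation calls needed to build a real topology are not documented here, so that step is left as a comment:

    # mynet.pl -- load with: Neural::Net->new_load("mynet.pl", @args)
    sub define_network {
        my @args = @_;              # passed verbatim from new_load()
        my $net = Neural::Net->new(2);
        # ... create the neuron groups and their connections here ...
        $net->finalize();           # see the note on finalize() below
        return $net;                # must return a network reference
    }
    1;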

activate(in,unitin,unitout,out,wei)

activate_p(in,out,pars)

Does forward propagation through the network. in is the input vector, and out receives the result. activate_p performs the same operation, taking a Neural::Pars object in place of the separate vector arguments.

backactivate(in,unitin,unitout,out,wei,dwei,dunitout,target)

Does back-propagation through the network. target is the desired output and dwei receives the gradient of the error surface w.r.t. the weights.

backprop(in,unitin,unitout,out,wei,dwei,dunitout,target)

Just a wrapper that calls both of the preceding functions and does some error checking on the sizes of the vectors.

inputsize(), outputsize(), nunits(), nweights(), ngroups()

Return the number of inputs, outputs, units, weights and groups in the network, respectively.

asstr

Returns a string describing the network's structure in plain text. Useful for debugging, mostly.

The current implementation is still a little rough: finalize() must be called before the network will work.
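
For example, a minimal sketch of setting up and inspecting a net (the group-creation step is elided, as above):

    my $net = Neural::Net->new(3);          # three inputs
    # ... create groups and connections ...
    $net->finalize();                       # required before use

    printf "in=%d out=%d units=%d weights=%d groups=%d\n",
        $net->inputsize, $net->outputsize,
        $net->nunits, $net->nweights, $net->ngroups;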

Neural::Examples

Neural::Examples stores a set of example patterns and can be used to evaluate the error on a network based on them.

Methods:

new(inputsize, outputsize)

Returns a new example set.

addexample(in,out)

Adds a new example. The parameters are references to arrays.

nexamples()

Returns the number of examples in the set.

get_example(n)

Returns a list of three Math::DVector objects giving the input, target and last output of the example, respectively. All three are modifiable!

evalfun(pars)

Evaluates the network with the parameters pars, looping through the examples. Adds the gradient to the element 'DWei' of pars.
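
A hedged sketch of evaluating a 2-input, 1-output net on the XOR patterns (assuming $net has been built and finalized; the return value of evalfun is assumed here to be the accumulated error):

    my $ex = Neural::Examples->new(2, 1);
    $ex->addexample([0, 0], [0]);       # arguments are array references
    $ex->addexample([0, 1], [1]);
    $ex->addexample([1, 0], [1]);
    $ex->addexample([1, 1], [0]);

    my $pars = Neural::Pars->new($net);
    $pars->rand_weights;

    # Loop over the examples; the gradient is accumulated
    # into the 'DWei' member of $pars.
    my $err = $ex->evalfun($pars);
    printf "error over %d examples: %g\n", $ex->nexamples, $err;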

Neural::Minimizer

Neural::Minimizer completes the set of components needed for a simple backpropagation neural net simulation. The minimizer is completely generic: it is given only the current parameters (weights), the value of the function to be minimized and the gradient, and it outputs the new parameters to be tried. The minimizer consists of two objects, one that handles the connection to the neural network and one that minimizes an arbitrary vector.

The interface to the Neural::Min_* classes is simple; the following functions must be defined:

set_net(net)

Set the network whose error is to be minimized to net.

go

Start the minimization.

set_direction(obj)

Set the direction method to obj. The object must define the method get_dir(w,f,dw,pdir) where w are the current weights, f is the function value, dw is the gradient vector and pdir is the previous search direction. The method should return the direction by modifying the vector pdir.

set_initstep(obj)

Set the initial step method to obj. The object must have a method get_istep(w,f,dw,dir,pstep), which should return a value indicating success or failure. pstep is the previous step value; the new step should be returned by modifying it.

set_step(obj)

Set the step method to obj. The method is get_step(w,f,dw,dir,istep,step). The step should be returned in step. For line search methods, the interface should be expanded a little to allow evaluation of some points on the line.

set_conv(obj)

Set the convergence test method to obj. The method is get_conv(w,f,dw,pf) where pf is the previous function value.

set_change_fn(obj,n)

Every n rounds, call obj->change_ex(pars). This function may alter something in the example set, for example in order to do temporal difference learning.
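
A hedged sketch of such an object (only the change_ex hook is documented; the actual alteration is left abstract):

    package My::Changer;

    sub new { bless {}, shift }

    # Called every n rounds with the current parameters.
    sub change_ex {
        my ($self, $pars) = @_;
        # ... alter the example set here, e.g. for temporal
        #     difference learning ...
    }

    # Later, on the minimizer:
    # $min->set_change_fn(My::Changer->new, 10);   # every 10 rounds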

reset

Reset the minimizer.

The direction, initstep and step objects must also have the following methods:

alloc(state,n)

Allocate the internal state vectors to contain room for n parameters.

reset(state,minimizer)

Reset the internal state held by the minimizer. The minimizer argument is given so that the object can query it for the number of parameters, etc.

get_paramlist

Return the hash table mapping each parameter name to [value, help].

set_param(param,val)

Set the parameter param to val.
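
A hedged sketch of a minimal user-supplied direction object implementing this interface (vector element access is written array-style here as an assumption; the real arguments are Math::DVector objects):

    package My::SteepestDir;

    sub new { bless {}, shift }

    # get_dir(w, f, dw, pdir): steepest descent is simply the
    # negative gradient; return it by modifying pdir in place.
    sub get_dir {
        my ($self, $w, $f, $dw, $pdir) = @_;
        $pdir->[$_] = -$dw->[$_] for 0 .. $#$dw;
    }

    sub alloc         { }             # no internal state vectors needed
    sub reset         { }             # nothing to reset
    sub get_paramlist { return {} }   # no tunable parameters
    sub set_param     { }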

Direction methods.

Neural::Dir_Steepest

Steepest descent. No parameters.

Neural::Dir_Momentum

Momentum descent. One parameter, alpha, which designates the attenuation of the momentum.

Initial step methods.

Neural::IStep_Fixed

Fixed step of length epsilon.

Step size methods.

Neural::Step_None

Do nothing (just take a step of the size given by the initial step).

Convergence methods.

Neural::Conv_Accept

Converge when the function value is low enough (use this if you know you can fit the data).
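
Putting the pieces together, a hedged end-to-end sketch: Neural::Min_Simple is a hypothetical stand-in for a concrete Neural::Min_* class, the new() constructors are assumed, and 'epsilon' is the IStep_Fixed parameter named above.

    my $min = Neural::Min_Simple->new;       # hypothetical Min_* class

    $min->set_net($net);                     # net whose error to minimize

    $min->set_direction(Neural::Dir_Steepest->new);

    my $istep = Neural::IStep_Fixed->new;
    $istep->set_param('epsilon', 0.05);      # fixed step length
    $min->set_initstep($istep);

    $min->set_step(Neural::Step_None->new);  # keep the initial step
    $min->set_conv(Neural::Conv_Accept->new);

    $min->go;                                # run the minimization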

BUGS

Lots of them.

AUTHOR

Tuomas J. Lukka (Tuomas.Lukka@Helsinki.FI)
