
NAME

AI::NNFlex - A customisable neural network simulator

SYNOPSIS

 use AI::NNFlex;

 my $network = AI::NNFlex->new(config parameter=>value);

 $network->add_layer(nodes=>x,activationfunction=>'function');

 $network->init(); 



 use AI::NNFlex::Dataset;

 my $dataset = AI::NNFlex::Dataset->new([
                        [INPUTARRAY],[TARGETOUTPUT],
                        [INPUTARRAY],[TARGETOUTPUT]]);

 my $sqrError = 10;

 while ($sqrError > 0.01)
 {
        $sqrError = $dataset->learn($network);
 }

 $network->lesion({'nodes'=>PROBABILITY,'connections'=>PROBABILITY});

 $network->dump_state(filename=>'badgers.wts');

 $network->load_state(filename=>'badgers.wts');

 my $outputsRef = $dataset->run($network);

DESCRIPTION

 AI::NNFlex is intended to be a highly flexible, modular NN framework.
 It's written entirely in native Perl, so there are essentially no
 prerequisites. The following modular divisions are made:

        * NNFlex.pm
                the core module. Contains methods to construct and
                lesion the network

        * feedforward.pm
                the network type module. Feedforward is the only type
                currently defined, but others may be created and
                imported at runtime

        * <learning>.pm
                the learning algorithm. Currently the options are
                        backprop - standard vanilla backprop
                        momentum - backprop with momentum


        * <activation>.pm
                node activation function. Currently the options are
                tanh, linear & sigmoid.

        * Dataset.pm
                methods for constructing a set of input/output data
                and applying to a network.

 The code should be simple enough to use for teaching purposes, but a
 simpler, standalone implementation of a backprop network is included
 in the example file bp.pl. This is derived from Phil Brierley's freely
 available Java code at www.philbrierley.com.

 AI::NNFlex leans towards teaching NN and cognitive modelling
 applications. Future modules are likely to include more
 biologically plausible nets like DeVries & Principe's
 Gamma model.

CONSTRUCTOR

AI::NNFlex

 new ( parameter => value );
        
        randomweights=>MAXIMUM VALUE FOR INITIAL WEIGHT

        learningalgorithm=>The AI::NNFlex module to import for
                training the net

        networktype=>The AI::NNFlex module to import for flowing
                activation

        debug=>[LIST OF CODES FOR MODULES TO DEBUG]

        learningrate=>the learning rate of the network

        momentum=>the momentum value (momentum learning only)
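
 For example, a feedforward network using momentum learning might be
 constructed like this (the parameter values shown are illustrative
 only):

 my $network = AI::NNFlex->new(
                networktype=>'feedforward',
                learningalgorithm=>'momentum',
                learningrate=>0.2,
                momentum=>0.6,
                randomweights=>1);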

AI::NNFlex::Dataset

 new ( [[INPUT VALUES],[OUTPUT VALUES],[INPUT VALUES],[OUTPUT VALUES],..])

INPUT VALUES

 These should be comma-separated values. They can be applied
 to the network with ::run or ::learn.

OUTPUT VALUES

 These are the intended or target output values, comma-separated.
 These will be used by ::learn.
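
 For example, a dataset for the XOR problem (as used in xor.pl in the
 examples directory) might look like this:

 my $dataset = AI::NNFlex::Dataset->new([
                        [0,0],[0],
                        [0,1],[1],
                        [1,0],[1],
                        [1,1],[0]]);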

METHODS

 This is a short list of the main methods. For details of all
 available methods, please see the individual POD pages below and
 in the individual imported modules.

AI::NNFlex

add_layer

 Syntax:

 $network->add_layer(   nodes=>NUMBER OF NODES IN LAYER,
                        persistentactivation=>RETAIN ACTIVATION BETWEEN PASSES,
                        decay=>RATE OF ACTIVATION DECAY PER PASS,
                        randomactivation=>MAXIMUM STARTING ACTIVATION,
                        threshold=>NYI,
                        activationfunction=>"ACTIVATION FUNCTION",
                        randomweights=>MAX VALUE OF STARTING WEIGHTS);

init

 Syntax:

 $network->init();

 Initialises connections between nodes, sets initial weights and
 loads external components.
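
 For example, a small three-layer network for the XOR problem might be
 built and initialised like this (node counts and activation functions
 are illustrative):

 $network->add_layer(nodes=>2, activationfunction=>"tanh");
 $network->add_layer(nodes=>2, activationfunction=>"tanh");
 $network->add_layer(nodes=>1, activationfunction=>"linear");

 $network->init();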

lesion

 $network->lesion({'nodes'=>PROBABILITY,'connections'=>PROBABILITY})

 Damages the network.

 PROBABILITY

 A value between 0 and 1, denoting the probability of a given node
 or connection being damaged.

 Note: this method may be called on a per-network, per-node or
 per-layer basis using the appropriate object.
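
 For example, to lesion the whole network with a 10% chance of damage
 to each node and a 5% chance of damage to each connection:

 $network->lesion({'nodes'=>0.1,'connections'=>0.05});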

AI::NNFlex::Dataset

learn

 $dataset->learn($network)

 'Teaches' the network the dataset using the network's defined learning
 algorithm.
 Returns the squared error (sqrError).
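
 A typical training loop calls learn repeatedly until the returned error
 falls below a chosen threshold, for example:

 my $sqrError = 10;
 my $epoch = 0;
 while ($sqrError > 0.01)
 {
        $sqrError = $dataset->learn($network);
        print "epoch ".++$epoch.": error = $sqrError\n";
 }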

run

 $dataset->run($network)

 Runs the dataset through the network and returns a reference to an array of
 output patterns.
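
 For example, to print the network's output for each pattern in the
 dataset (assuming each output pattern is itself an array reference):

 my $outputsRef = $dataset->run($network);

 foreach my $pattern (@$outputsRef)
 {
        print join(",",@$pattern)."\n";
 }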

EXAMPLES

 See the code in ./examples. For any given version of NNFlex, xor.pl will
 contain the latest functionality.

PREREQs

 None. NNFlex should run OK on any version of Perl 5 or later.

ACKNOWLEDGEMENTS

 Phil Brierley, for his excellent free Java code, which solved my
 backprop problem

 Dr Martin Le Voi, for help with concepts of NN in the early stages

 Dr David Plaut, for help with the project that this code was originally
 intended for.

 Graciliano M. Passos, for suggestions and improved code (see SEE ALSO).

SEE ALSO

 AI::NNEasy - developed by Graciliano M. Passos.
 Shares some common code with NNFlex. It is much faster, and more
 suitable for backprop projects with large datasets.

TODO

 Lots of things:

 clean up the perldocs some more
 write gamma modules
 write BPTT modules
 write a perceptron learning module
 speed it up
 write a tk gui

CHANGES

 v0.11 introduces the lesion method, PNG support in the draw module,
  and datasets.
 v0.12 fixes a bug in reinforce.pm and adds a reflector in feedforward->run
  to make $network->run($dataset) work.
 v0.13 introduces the momentum learning algorithm and fixes a bug that
  allowed training to proceed even if the node activation function module
  couldn't be loaded.
 v0.14 fixes momentum and backprop so they are no longer restricted to tanh
  hidden units only.
 v0.15 fixes a bug in feedforward and reduces the debug overhead.
 v0.16 changes some underlying addressing of weights, to simplify and speed
  up the code.

COPYRIGHT

 Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program
 is free software; you can redistribute it and/or modify it under the same
 terms as Perl itself.

CONTACT

 charlesc@nnflex.g0n.net