NAME
AI::NNFlex::Reinforce - A very simple experimental NN module
SYNOPSIS
    use AI::NNFlex::Reinforce;

    my $network = AI::NNFlex::Reinforce->new(config parameter=>value);

    $network->add_layer(nodes=>x, activationfunction=>'function');

    $network->init();

    use AI::NNFlex::Dataset;

    my $dataset = AI::NNFlex::Dataset->new([
        [INPUTARRAY], [TARGETOUTPUT],
        [INPUTARRAY], [TARGETOUTPUT]]);

    my $sqrError = 10;

    for (1..100)
    {
        $dataset->learn($network);
    }

    $network->lesion({'nodes'=>PROBABILITY, 'connections'=>PROBABILITY});

    $network->dump_state(filename=>'badgers.wts');

    $network->load_state(filename=>'badgers.wts');

    my $outputsRef = $dataset->run($network);

    my $outputsRef = $network->output(layer=>2, round=>1);
DESCRIPTION
Reinforce is a very simple NN module. It's mainly included in this distribution to provide an example of how to subclass AI::NNFlex to write your own NN modules. The training method strengthens any connections that are active during the run pass.
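As a conceptual illustration of that idea, the sketch below shows Hebbian-style strengthening of a single weight. It is not the module's actual implementation; the variable names and the update rule are assumptions made for the example.

    # Conceptual sketch only: strengthen a connection in proportion to the
    # activation that flowed through it during the run pass. The names
    # $from_activation and $learning_rate are illustrative, not AI::NNFlex internals.
    sub reinforce_weight
    {
        my ($weight, $from_activation, $learning_rate) = @_;
        # Connections that were active during the run pass get stronger.
        $weight += $from_activation * $learning_rate if $from_activation > 0;
        return $weight;
    }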
CONSTRUCTOR
AI::NNFlex::Reinforce
new ( parameter => value );
    randomweights => MAXIMUM VALUE FOR INITIAL WEIGHT
    fixedweights  => WEIGHT TO USE FOR ALL CONNECTIONS
    debug         => [LIST OF CODES FOR MODULES TO DEBUG]
    learningrate  => the learning rate of the network
    round         => 0 or 1 - 1 sets the network to round output values to the nearest of 1, -1 or 0
The following parameters are optional: randomweights, fixedweights, debug and round.
(Note: if randomweights is not specified, the network will default to random initial weights between 0 and 1.)
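For example, a constructor call might look like the following sketch; the learning rate and weight range are illustrative values, not recommendations.

    # A minimal sketch: construct a Reinforce network with assumed,
    # illustrative parameter values.
    use AI::NNFlex::Reinforce;

    my $network = AI::NNFlex::Reinforce->new(
        learningrate  => 0.1,   # assumed example value
        randomweights => 1,     # initial weights drawn from 0..1
        round         => 1);    # round outputs to nearest of 1, -1 or 0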
METHODS
This is a short list of the main methods implemented in AI::NNFlex. Subclasses may implement other methods.
AI::NNFlex
add_layer
Syntax:
    $network->add_layer( nodes=>NUMBER OF NODES IN LAYER,
                         persistentactivation=>RETAIN ACTIVATION BETWEEN PASSES,
                         decay=>RATE OF ACTIVATION DECAY PER PASS,
                         randomactivation=>MAXIMUM STARTING ACTIVATION,
                         threshold=>NYI (not yet implemented),
                         activationfunction=>"ACTIVATION FUNCTION",
                         randomweights=>MAX VALUE OF STARTING WEIGHTS);
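A sketch of a small two-input, one-output network follows. The layer sizes and the activation function names ('tanh', 'linear') are assumptions for illustration; use whichever activation functions your AI::NNFlex installation provides.

    # Illustrative only: a 2-2-1 network.
    $network->add_layer(nodes=>2, activationfunction=>'tanh');
    $network->add_layer(nodes=>2, activationfunction=>'tanh');
    $network->add_layer(nodes=>1, activationfunction=>'linear');

    $network->init();   # connect the layers (see init below)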
init
Syntax:
$network->init();
Initialises connections between nodes, sets initial weights and loads external components. The base AI::NNFlex init method implements connections backwards and forwards from each node in each layer to each node in the preceding and following layers.
lesion
$network->lesion ({'nodes'=>PROBABILITY,'connections'=>PROBABILITY})
Damages the network.
PROBABILITY
A value between 0 and 1, denoting the probability of a given node or connection being damaged.
Note: this method may be called on a per network, per node or per layer basis using the appropriate object.
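For instance, the probabilities below are arbitrary illustrative values:

    # Damage roughly 10% of nodes and 20% of connections (illustrative values)
    $network->lesion({'nodes'=>0.1, 'connections'=>0.2});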
AI::NNFlex::Dataset
learn
$dataset->learn($network)
'Teaches' the network the dataset using the network's defined learning algorithm. Returns the sum of squared errors (sqrError).
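A training loop might look like the following sketch; the error threshold and iteration cap are assumed values.

    # Sketch: repeat learning passes until the squared error is small
    # or an (assumed) iteration limit is reached.
    my $sqrError = 10;
    my $count    = 0;
    while ($sqrError > 0.01 && $count < 1000)
    {
        $sqrError = $dataset->learn($network);
        $count++;
    }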
run
$dataset->run($network)
Runs the dataset through the network and returns a reference to an array of output patterns.
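A sketch of printing the results, assuming each returned pattern is itself an array reference of node outputs:

    # Sketch: print each output pattern returned by run()
    my $outputsRef = $dataset->run($network);
    foreach my $pattern (@$outputsRef)
    {
        print join(", ", @$pattern), "\n";
    }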
EXAMPLES
See the code in ./examples. For any given version of NNFlex, xor.pl will contain the latest functionality.
PREREQs
None. NNFlex::Reinforce should run OK on any version of Perl 5 or above.
ACKNOWLEDGEMENTS
Phil Brierley, for his excellent free Java code, which solved my backprop problem.
Dr Martin Le Voi, for help with concepts of NN in the early stages
Dr David Plaut, for help with the project that this code was originally intended for.
Graciliano M. Passos for suggestions & improved code (see SEE ALSO).
Dr Scott Fahlman, whose very readable paper 'An empirical study of learning speed in backpropagation networks' (1988) has driven many of the improvements made so far.
SEE ALSO
AI::NNFlex
AI::NNFlex::Backprop
AI::NNFlex::Dataset
COPYRIGHT
Copyright (c) 2004-2005 Charles Colbourn. All rights reserved. This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.
CONTACT
charlesc@nnflex.g0n.net