NAME

AI::LibNeural - Perl extension for libneural

SYNOPSIS

  use AI::LibNeural;

  my $nn = AI::LibNeural->new( 2, 4, 1 );

  # teach it the logical AND
  $nn->train( [ 0, 0 ], [ 0.05 ], 0.0000000005, 0.2 );
  $nn->train( [ 0, 1 ], [ 0.05 ], 0.0000000005, 0.2 );
  $nn->train( [ 1, 0 ], [ 0.05 ], 0.0000000005, 0.2 );
  $nn->train( [ 1, 1 ], [ 0.95 ], 0.0000000005, 0.2 );

  my $result = $nn->run( [ 1, 1 ] );
  # result should be ~ 0.95
  $result = $nn->run( [ 0, 1 ] );
  # result should be ~ 0.05

  $nn->save('and.mem');

ABSTRACT

  Perl bindings for the libneural C++ neural network library.

DESCRIPTION

Provides accessors for the libneural library as a Perl object. libneural is a C++ library that implements a feed-forward back-propagation neural network. The interface is extremely simple and should take no more than a few minutes to master given a reasonable knowledge of back-propagation neural networks.

FUNCTIONS

$nn = AI::LibNeural->new()

Creates an empty AI::LibNeural object. It should only be used when the load method will be called soon after.
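A minimal sketch of this pattern, reusing the 'and.mem' file written in the SYNOPSIS:

  my $nn = AI::LibNeural->new();
  $nn->load('and.mem');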

$nn = AI::LibNeural->new(FILENAME)

Creates a new AI::LibNeural object from the supplied memory file.
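This is equivalent to the new()/load() pair shown above. For example, with the file written in the SYNOPSIS:

  my $nn = AI::LibNeural->new('and.mem');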

$nn = AI::LibNeural->new(INPUTS,HIDDENS,OUTPUTS)

Creates a new AI::LibNeural object with INPUTS input nodes, HIDDENS hidden nodes, and OUTPUTS output nodes.
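For example, the network used in the SYNOPSIS:

  # 2 input nodes, 4 hidden nodes, 1 output node
  my $nn = AI::LibNeural->new( 2, 4, 1 );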

$nn->train([I1,I2,...],[O1,O2,...],MINERR,TRAINRATE)

Completes a training cycle for the given inputs I1-IN, with the expected results O1-OM, where N is the number of input nodes and M is the number of output nodes. MINERR is the mean squared error at the output that you wish to achieve. TRAINRATE is the learning rate to be used.
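As a sketch, the logical OR gate could be taught the same way the AND gate is taught in the SYNOPSIS (the MINERR and TRAINRATE values here are illustrative, not prescribed):

  my $nn = AI::LibNeural->new( 2, 4, 1 );

  my @patterns = (
      [ [ 0, 0 ], [ 0.05 ] ],
      [ [ 0, 1 ], [ 0.95 ] ],
      [ [ 1, 0 ], [ 0.95 ] ],
      [ [ 1, 1 ], [ 0.95 ] ],
  );

  # one training cycle per pattern: inputs, expected outputs, MINERR, TRAINRATE
  $nn->train( $_->[0], $_->[1], 0.0000000005, 0.2 ) for @patterns;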

(O1,O2,...) = $nn->run([I1,I2,...])

Calculates the corresponding outputs (O1-OM) for the given inputs (I1-IN) based on the previous training. It should only be called after the network has been suitably trained.
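A sketch in list context, assuming the OR network trained above (for a single-output network the returned list holds one value):

  my ( $out ) = $nn->run( [ 1, 0 ] );
  # $out should be close to 0.95 for the OR network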

NUM = $nn->get_layersize(WHICH)

Retrieves the number of nodes at the specified layer, WHICH. WHICH should be one of ALL, INPUT, HIDDEN, OUTPUT. Useful mainly when a network has been loaded from a file.
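A sketch, assuming the layer constants are imported via the all tag described under EXPORT TAGS:

  use AI::LibNeural qw(:all);

  my $nn = AI::LibNeural->new('and.mem');   # a previously saved network

  my $inputs  = $nn->get_layersize(INPUT);
  my $hiddens = $nn->get_layersize(HIDDEN);
  my $outputs = $nn->get_layersize(OUTPUT);
  my $total   = $nn->get_layersize(ALL);    # all three layers combined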

STATUS = $nn->load(FILENAME)
STATUS = $nn->save(FILENAME)

Loads and saves, respectively, the 'memory' (node configuration and weights) of the network. FILENAME should be the location of the file in which the memory is stored/retrieved.
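A sketch of a save/load round trip (treating the returned status as a boolean is an assumption, not documented behavior):

  $nn->save('and.mem') or warn "save failed";

  my $copy = AI::LibNeural->new();
  $copy->load('and.mem') or die "load failed";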

EXPORT

None by default.

EXPORT TAGS

all
  Exports the following layer constants, which are used with get_layersize (see the import example after this list):
ALL
  The total number of nodes on all three layers
INPUT
  The number of nodes on the input layer
HIDDEN
  The number of nodes on the hidden layer
OUTPUT
  The number of nodes on the output layer

AUTHOR

Ross McFarland <rmcfarla at neces dot com>

SEE ALSO

perl(1), and the libneural documentation.

LICENSE

This is based on code that I adapted from other modules I found in the distant past. If you are the original author and you recognize this code, let me know and you will be credited.

Copyright (C) 2003 by Ross McFarland

This library is free software; you can redistribute it and/or modify it under the terms of the GNU Library General Public License as published by the Free Software Foundation; either version 2 of the License, or (at your option) any later version.

This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Library General Public License for more details.

You should have received a copy of the GNU Library General Public License along with this library; if not, write to the Free Software Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA 02111-1307 USA.