Search results for "module:AI::NeuralNet::Simple"

AI::NeuralNet::Simple - An easy-to-use backprop neural net. River stage zero No dependents

The Disclaimer Please note that the following information is terribly incomplete. That's deliberate. Anyone familiar with neural networks is going to laugh themselves silly at how simplistic the following information is and the astute reader will not...

OVID/AI-NeuralNet-Simple-0.11 - 18 Nov 2006 15:53:01 UTC

AI::PSO - Module for running the Particle Swarm Optimization algorithm River stage zero No dependents

Particle Swarm Optimization is an optimization algorithm designed by Russell Eberhart and James Kennedy from Purdue University. The algorithm itself is based on the emergent behavior among societal groups, ranging from the marching of ant...

KYLESCH/AI-PSO-0.86 - 25 Nov 2006 03:49:50 UTC
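As a rough illustration of the particle swarm technique this module implements (a Python sketch of the generic algorithm, not AI::PSO's Perl API; all names and parameter values here are illustrative assumptions):

```python
import random

def pso(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimal particle swarm optimization sketch: minimize f over dim dimensions."""
    # Start particles at random positions in [lo, hi] with zero velocity.
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # each particle's best-seen position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity blends inertia, pull toward the particle's own best,
                # and pull toward the swarm's best ("emergent" group behavior).
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Minimize the sphere function; the swarm converges near the origin.
random.seed(0)
best, best_val = pso(lambda p: sum(x * x for x in p), dim=2)
```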

AI::NNEasy - Define, learn and use easy Neural Networks of different types using a portable code in Perl and XS. River stage zero No dependents

The main purpose of this module is to create easy Neural Networks with Perl. The module was designed so it can be extended to multiple network types, learning algorithms, and activation functions. This architecture was first based on the module AI::NNFlex,...

GMPASSOS/AI-NNEasy-0.06 - 17 Jan 2005 02:25:07 UTC

AI::NeuralNet::Hopfield - A simple Hopfield Network Implementation. River stage zero No dependents

LEPREVOST/AI-NeuralNet-Hopfield-0.1 - 05 Mar 2013 04:27:59 UTC
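A Hopfield network stores bipolar patterns in a symmetric weight matrix (Hebbian outer products) and recalls a stored pattern from a corrupted cue by repeated threshold updates. A minimal Python sketch of that idea (illustrative only; this is not the module's Perl API):

```python
def train(patterns):
    """Hebbian weights: w[i][j] sums x_i * x_j over patterns; no self-connections."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def recall(w, state, steps=5):
    """Repeatedly update each neuron to the sign of its weighted input."""
    n = len(state)
    s = list(state)
    for _ in range(steps):
        for i in range(n):
            h = sum(w[i][j] * s[j] for j in range(n))
            s[i] = 1 if h >= 0 else -1
    return s

stored = [1, 1, -1, -1, 1, -1, 1, -1]
w = train([stored])
noisy = [-1, 1, -1, -1, 1, -1, 1, -1]   # first bit flipped
recovered = recall(w, noisy)
# recovered == stored: the flipped bit is pulled back to the stored pattern
```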

AI::NeuralNet::Mesh - An optimized, accurate neural network Mesh. River stage zero No dependents

AI::NeuralNet::Mesh is an optimized, accurate neural network Mesh. It was designed with accuracy and speed in mind. This network model is very flexible. It will allow for classic binary operation or any range of integer or floating-point inputs you ca...

JBRYAN/AI-NeuralNet-Mesh-0.44 - 14 Sep 2000 20:56:21 UTC

AI::NeuralNet::BackProp - A simple back-prop neural net that uses the Delta rule and Hebb's rule. River stage zero No dependents

AI::NeuralNet::BackProp implements a neural network similar to a feed-forward, back-propagation network, learning via a mix of a generalization of the Delta rule and a dissection of Hebb's rule. The actual neurons of the network are implemented via the A...

JBRYAN/AI-NeuralNet-BackProp-0.89 - 17 Aug 2000 07:21:47 UTC
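The Delta rule the blurb above refers to adjusts each weight in proportion to the output error, the activation derivative, and the input. A small Python sketch of a single sigmoid unit learning logical AND with the Delta rule (an illustrative stand-in, not this module's Perl code; the learning rate and epoch count are arbitrary assumptions):

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(1)
w = [random.uniform(-0.5, 0.5) for _ in range(2)]   # two input weights
b = random.uniform(-0.5, 0.5)                       # bias
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

lr = 0.5
for _ in range(5000):
    for x, target in data:
        out = sigmoid(w[0] * x[0] + w[1] * x[1] + b)
        # Delta rule: weight change = lr * error * f'(net) * input,
        # where f'(net) = out * (1 - out) for the sigmoid.
        delta = (target - out) * out * (1.0 - out)
        for i in range(2):
            w[i] += lr * delta * x[i]
        b += lr * delta

# Rounded outputs now match the AND truth table.
preds = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in data]
```

AND is linearly separable, so a single unit suffices; the back-propagation generalization applies the same error-times-derivative idea layer by layer for deeper nets.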
6 results (0.029 seconds)