http://www.perlmonks.org?node_id=1217641

A small implementation of an Artificial Neural Network using Hopfield neurons, synapses, and a simple training system:
unit module ann;
use ann::HopfieldSynaps;

class HopfieldNeuron is export {
    has @.inputsynapses;
    has @.outputsynapses;
    has $.input;

    method BUILD($y1 = 1000000.rand) {
        $.input = $y1;
    }

    method fire() {
        ### with training update weights
        loop (my $i = 0; $i < @.inputsynapses.elems; $i++) {
            if (@.inputsynapses[$i].weight * @.inputsynapses[$i].outputneuron.input >= 0) {
                @.inputsynapses[$i].outputneuron.input = 1;
            }
            else {
                @.inputsynapses[$i].outputneuron.input = 0;
            }
        }
    }
}

unit module ann;
use ann::HopfieldNeuron;

class HopfieldSynaps is export {
    has $.weight;
    has $.inputneuron;
    has $.outputneuron;

    method BUILD($inputneuron, $outputneuron, $y1 = 1000000.rand) {
        $.weight = $y1;
    }
};

unit module ann;
use ann::HopfieldNeuron;
use ann::HopfieldSynaps;

class HopfieldNN is export {
    has @.neurons;

    method BUILD($size) {
        @.neurons = ();
        loop (my $n = 0; $n < $size; $n++) {
            push(@.neurons, HopfieldNeuron.new());
        }
        loop (my $m = 0; $m < $size; $m++) {
            loop (my $j = 0; $j < $size; $j++) {
                push(@.neurons[$j].inputsynapses, HopfieldSynaps.new());
                @.neurons[$j].inputsynapses[$j].outputneuron = @.neurons[$m];
            }
        }
        loop (my $i = 0; $i < $size; $i++) {
            loop (my $j = 0; $j < $size; $j++) {
                push(@.neurons[$j].outputsynapses, HopfieldSynaps.new());
                @.neurons[$j].outputsynapses[$j].outputneuron = @.neurons[$i];
            }
        }
    }

    ### repeat this to train the network
    method start(@inputs) {
        ### the inputs length is less than the full neuron list
        ### the first neurons made in the constructor are the inputs
        ### of the network
        loop (my $i = 0; $i < @inputs.elems; $i++) {
            @.neurons[$i].input = @inputs[$i];
        }
        loop (my $j = 0; $j < @.neurons.elems; $j++) {
            @.neurons[$j].fire();
        }
    }

    method start2(@inputs) {
        ### without any training, first neurons are for the input pattern
        loop (my $n = 0; $n < @inputs.elems; $n++) {
            @.neurons[$n].input = @inputs[$n];
        }
        loop (my $i = 0; $i < @.neurons.elems; $i++) {
            loop (my $j = 0; $j < @.neurons.elems; $j++) {
                loop (my $k = 0; $k < @.neurons.elems; $k++) {
                    if ($k == $j) { next; };
                    @.neurons[$i].inputsynapses[$j].weight +=
                        (2 * @.neurons[$i].inputsynapses[$j].outputneuron.input - 1)
                        * (2 * @.neurons[$i].inputsynapses[$k].outputneuron.input - 1);
                }
            }
        }
    }
};
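The weight update in start2 is the classic Hebbian rule for a Hopfield network: each 0/1 unit state s is mapped to 2s - 1 (that is, to -1 or +1), the product of the two mapped states is added to the connecting weight, and self-connections are skipped; fire() then thresholds each unit's weighted input at zero. A minimal sketch of those same two steps in plain Python, for comparison (the function names and the synchronous update schedule are choices of this sketch, not part of the post):

```python
def train(patterns, size):
    """Hebbian rule as in start2: w[j][k] += (2*s_j - 1) * (2*s_k - 1)."""
    w = [[0.0] * size for _ in range(size)]
    for p in patterns:
        for j in range(size):
            for k in range(size):
                if j == k:
                    continue          # no self-connections, like the k == j 'next' above
                w[j][k] += (2 * p[j] - 1) * (2 * p[k] - 1)
    return w

def recall(w, state, steps=5):
    """Threshold rule as in fire(): a unit becomes 1 if its weighted input is >= 0."""
    size = len(state)
    s = list(state)
    for _ in range(steps):            # synchronous updates, repeated a few times
        s = [1 if sum(w[j][k] * s[k] for k in range(size)) >= 0 else 0
             for j in range(size)]
    return s

pattern = [1, 0, 1, 0, 1, 0]
w = train([pattern], len(pattern))
noisy = [1, 0, 1, 0, 1, 1]            # one bit flipped
print(recall(w, noisy))               # -> [1, 0, 1, 0, 1, 0]
```

Storing a single pattern and presenting a one-bit-corrupted copy recovers the stored pattern, which is the associative-recall behaviour the start/start2 methods are after.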

Re: Hopfield Neural Network in perl6
by liz (Monsignor) on Jun 30, 2018 at 07:45 UTC

    This looks pretty cool!

    If you allow me to make some remarks: it looks like a very C-ish/Perl 5-ish piece of Perl 6 code.

    And I think there is one error in it:

    method BUILD($inputneuron, $outputneuron, $y1 = 1000000.rand) {
        $.weight = $y1;
    }

    The $.weight is equivalent to self.weight. You can only use the accessor as a mutator if the attribute is marked is rw. So either you can change the assignment to $!weight = $y1 (directly accessing the attribute), or you can add is rw to the attribute declaration: has $.weight is rw. The choice is really whether you want the weight to be assignable from the outside or not and/or you want subclasses to be able to define their own "weight" method or not.
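    For comparison with how other languages spell this distinction, here is a rough Python analogy (Python, not Perl 6; the class names are invented for illustration): a property with no setter behaves like the default read-only $.weight accessor, writing the underlying attribute inside __init__ plays the role of $!weight = $y1, and adding a setter is roughly what is rw gives you.

```python
class ReadOnlySynapse:
    def __init__(self, weight):
        self._weight = weight          # direct attribute write, like $!weight = $y1

    @property
    def weight(self):                  # reader only, like the default $.weight accessor
        return self._weight

class RwSynapse:
    def __init__(self, weight):
        self._weight = weight

    @property
    def weight(self):
        return self._weight

    @weight.setter                     # a setter is roughly 'has $.weight is rw'
    def weight(self, value):
        self._weight = value

s = ReadOnlySynapse(0.5)
# s.weight = 1.0                       # would raise AttributeError: no setter
r = RwSynapse(0.5)
r.weight = 2.0                         # fine: the attribute is mutable from outside
```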

    With regards to C-ism / Perl 5-isms: it feels to me that the following loop

    loop (my $i = 0; $i < @.inputsynapses.elems; $i++) {
        if (@.inputsynapses[$i].weight * @.inputsynapses[$i].outputneuron.input >= 0) {
            @.inputsynapses[$i].outputneuron.input = 1;
        }
        else {
            @.inputsynapses[$i].outputneuron.input = 0;
        }
    }

    could be simplified to:

    .outputneuron.input = +(.weight * .outputneuron.input >= 0) for @.inputsynapses;

    This will iterate over all of the inputsynapses without needing to index. The +( ) is what transforms a Boolean True / False to 1 or 0.

    Hope this helps