AI::NNEasy to quickly set up a Neural Network using just Perl and XS.

by gmpassos (Priest)
on Jan 14, 2005 at 23:42 UTC ( [id://422423] )

Fellow monks,

Some days ago I posted a node asking for advice on Neural Networks with Perl. The feedback didn't give me new ideas, so I started to set up my own NN. Now I want to share my experience here in the monastery:

First I used the new module AI::NNFlex, which was the only one that worked fine on Win32 and handled a real job well. But after testing other modules like AI::NeuralNet::Simple, I saw that an easier interface and more speed were needed.

So I started to write AI::NNEasy (http://search.cpan.org/~gmpassos/AI-NNEasy-0.03/). To do that I took the sources of AI::NNFlex, which are pure Perl, and rewrote them with some optimizations, especially in the access to HASH keys, since it is an OO module with a lot of attribute access across many objects. I also fixed some node references that were using the object reference address as the ID to identify the node objects, which made it impossible to serialize them, since the object address changes on each process execution.
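
A classic example of that kind of hash-access optimization is hoisting a repeated nested lookup out of a hot loop into a lexical (a generic illustration of the technique, not NNEasy's actual code; process() here is a hypothetical helper):

    ## Slow: $this->{layers} is resolved again on every iteration.
    for my $i ( 0 .. $#{ $this->{layers} } ) {
      process( $this->{layers}[$i] ) ;
    }

    ## Faster: fetch the reference once, then reuse the lexical.
    my $layers = $this->{layers} ;
    for my $i ( 0 .. $#$layers ) {
      process( $layers->[$i] ) ;
    }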

To write all of that quickly and add XS support, I used Class::HPLOO, which enables this kind of syntax:

    class Foo {

      ## Object initializer:
      sub Foo (%args) {
        $this->{bar} = $args{baz} ;
      }

      ## a Perl function:
      sub bar ($x , $y) {
        $this->add($x , $y) ;
      }

      ## a C function that will be turned into XS:
      sub[C] int add( int x , int y ) {
        int res = x + y ;
        return res ;
      }

    }
The code above shows how easy it is to set up a class in Perl, and as a plus we can write C functions directly in the class body, just as we write normal Perl subs.

After rewriting all the NN code, I started to analyze which methods use the most CPU, using Devel::DProf. With that I found the subs that needed to be turned into XS:

    AI::NNEasy::NN::tanh
    AI::NNEasy::NN::feedforward::run
    AI::NNEasy::NN::backprop::hiddenToOutput
    AI::NNEasy::NN::backprop::hiddenOrInputToHidden
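
For anyone who wants to reproduce this kind of profiling, the Devel::DProf workflow is just two commands (train.pl here is a hypothetical driver script that exercises the NN):

    perl -d:DProf train.pl   ## run under the profiler; writes tmon.out
    dprofpp                  ## report the subs sorted by time spent
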
For example, the tanh function is called more than 30,000 times while the NN is learning a set of inputs, so I wrote 2 versions of the same function in the class:
    class AI::NNEasy::NN[0.01] {

      ...

      *tanh = \&tanh_c ;

      sub tanh_pl ($value) {
        if    ($value > 20)  { return 1 ;}
        elsif ($value < -20) { return -1 ;}
        else {
          my $x = exp($value) ;
          my $y = exp(-$value) ;
          return ($x-$y)/($x+$y) ;
        }
      }

      sub[C] double tanh_c ( SV* self , double value ) {
        if      ( value > 20 )  { return 1 ;}
        else if ( value < -20 ) { return -1 ;}
        else {
          double x = Perl_exp(value) ;
          double y = Perl_exp(-value) ;
          double ret = (x-y)/(x+y) ;
          return ret ;
        }
      }

      ...

    }
Finally, after making the NN work 10 times faster with these XS functions, I added a more intuitive interface to the module, plus a winner algorithm that gives us the right output, and not just a decimal number near the real value of the output.
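
The winner step, in a minimal sketch (an illustration of the idea, not NNEasy's internal code):

    ## Map a raw NN output to the nearest declared output type.
    sub winner {
      my ( $raw , @types ) = @_ ;
      my ($best) = sort { abs($raw - $a) <=> abs($raw - $b) } @types ;
      return $best ;
    }

    print winner( 0.87 , 0 , 1 ) , "\n" ;   ## prints 1, the nearest type to 0.87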

So now we can set up a NN with a simple OO interface, without needing to write our own learning algorithms and output analyzer. Also, we don't need to care (but can) about the hidden layers, since NNEasy will calculate them for us; we just need to pass the number of inputs and outputs:

    use AI::NNEasy ;

    my $nn = AI::NNEasy->new(
      'xor.nne' , ## file to save the NN.
      [0,1] ,     ## Output types of the NN.
      0.1 ,       ## Maximal error for output.
      2 ,         ## Number of inputs.
      1 ,         ## Number of outputs.
    ) ;

    ## Our set of inputs and outputs to learn:
    my @set = (
      [0,0] => [0],
      [0,1] => [1],
      [1,0] => [1],
      [1,1] => [0],
    );

    ## learn the inputs:
    $nn->learn_set( \@set ) ;

    ## Save the NN:
    $nn->save ;

    ## Use the NN:
    my $out = $nn->run_get_winner([0,0]) ;
    print "0 0 => @$out\n" ; ## 0 0 => 0

    my $out = $nn->run_get_winner([0,1]) ;
    print "0 1 => @$out\n" ; ## 0 1 => 1

Now I can work on the real project that made me research NNs. The project will be used to analyze texts on complex photos and identify them automatically. Right now I have 60% accuracy, which is already a good result, since I started to work on the NN part only this week.

So, thanks to CPAN, to Inline::C, to AI::NNFlex, and to all the resources that are out there for free, open to be changed and improved. Now AI::NNEasy is there too. ;-P

Graciliano M. P.
"Creativity is the expression of liberty".

Replies are listed 'Best First'.
Re: AI::NNEasy to quickly set up a Neural Network using just Perl and XS.
by Ovid (Cardinal) on Jan 15, 2005 at 00:13 UTC

    Bravo! I wrote AI::NeuralNet::Simple as a "starter" NN module, but I had always wanted to do more with it (different activation functions, different number of layers, etc.) Now I don't have to because I can reach for yours :)

    Cheers,
    Ovid

    New address of my CGI Course.

      Enjoy the sources. Now I'm planning to make it possible to set up a NN based not on layers, but only on the node connections, so we will be able to set up any type of NN.

      A good approach was to just serialize the Perl object of the NN with Storable, so if we create a complex NN it can be saved like any other NN.
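
      For example, Storable can freeze the whole NN object to a file and bring it back later (a minimal sketch of the approach, not NNEasy's actual save/load code):

          use Storable qw(store retrieve) ;

          store( $nn , 'xor.nne' ) ;        ## serialize the NN object to disk
          my $nn2 = retrieve('xor.nne') ;   ## restore an identical object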

      Graciliano M. P.
      "Creativity is the expression of liberty".

Re: AI::NNEasy to quickly set up a Neural Network using just Perl and XS.
by BrowserUk (Patriarch) on Jan 14, 2005 at 23:51 UTC

    Congratulations! On both the module, and your write-up. Both deserve huge praise.


    Examine what is said, not who speaks.
    Silence betokens consent.
    Love the truth but pardon error.
      Thank you for the nice reply! ;-P

      Graciliano M. P.
      "Creativity is the expression of liberty".

        One thing. There are a few warnings being issued from Class::HPLOO:

        Scalar value @ret[0] better written as $ret[0] at c:/Perl/site/lib/Class/HPLOO.pm line 784, <STDIN> line 2.
        "my" variable $class masks earlier declaration in same scope at c:/Perl/site/lib/Class/HPLOO.pm line 993, <STDIN> line 2.
        Scalar value @ret[0] better written as $ret[0] at c:/Perl/site/lib/Class/HPLOO.pm line 1102, <STDIN> line 2.
        "my" variable $fh masks earlier declaration in same scope at c:/Perl/site/lib/Class/HPLOO.pm line 1320, <STDIN> line 2.

        And also from your sample program:

        P:\test>422423
        "my" variable $out masks earlier declaration in same scope at P:\test\422423.pl line 31.
        0 0 => 0
        0 1 => 1
Re: AI::NNEasy to quickly set up a Neural Network using just Perl and XS.
by g0n (Priest) on Jan 15, 2005 at 12:25 UTC
    Hi,

    Just got back home and read your email & posting (I work away all week). NNEasy is nicely done. The XS code speeds it up by 10x, you say?

    If it's OK, I'll take a closer look at your code and see what might be adaptable to NNFlex. I'm particularly interested in the speed improvements & the UI changes - the NNFlex UI is very unfriendly at the moment.

    I wrote new methods for NNFlex yesterday to lesion networks BTW, so if you have a need for that you might want to take a look at 0.11 when I upload it & adapt the extra code.

    Nice to see the formerly rather quiet AI namespace come back to life so quickly - don't you just love Perl!

    charles.

      Yes, I'm interested in these new methods. And if you want any help porting resources to NNFlex, just ask.

      Perl rox! ;-P

      Graciliano M. P.
      "Creativity is the expression of liberty".

        OK, I've uploaded AI-NNFlex-0.11 to CPAN, with support for datasets a little bit like the Xerion approach. It makes the UI quite a bit friendlier - if you look at ./examples/xor_with_datasets.pl you'll see what I mean. PNG support has been put into ::draw and the lesioning method has been implemented. You can now damage the network probabilistically on a network-wide, layer or node basis.

        I've cleaned up some of the nastiest perldoc sections as well.

        Quick question - I haven't done any work on the XS issue. I've never encountered XS before - does it have any prerequisites? Like a C compiler?

        Charles.

Re: AI::NNEasy to quickly set up a Neural Network using just Perl and XS.
by g0n (Priest) on Feb 22, 2005 at 16:36 UTC
    Speeding things up

    I know this is a rather late addition to this topic, but I've been doing a lot of work on nnflex since it was first posted on CPAN at the beginning of Jan:

    gmpassos, if you're still working on this you might want to think about implementing a momentum term in backprop.
    Momentum works by adding a fraction (often about half, but you'll find the right value for your network by trial and error) of the weight change from the previous learning pass to the weight change for the current pass. That way, when the network is a long way from converging, the weight changes are quite large, and they become progressively smaller as the network nears convergence.
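
    In code, the update looks roughly like this (a generic sketch of the technique, not the actual momentum.pm code; the variable names are illustrative):

        ## Standard backprop delta plus a momentum term.
        my $delta = $learning_rate * $gradient
                  + $momentum * $previous_delta ;   ## e.g. $momentum = 0.5
        $weight         += $delta ;
        $previous_delta  = $delta ;                 ## remember for the next pass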

    My experiments with XOR suggest an improvement in learning speed (measured in the number of passes) of up to 5-fold, and improved network stability (because your network is less likely to get stuck in local minima).

    You can get the code from the momentum.pm module in nnflex.

    I hope that's useful.

    c

    VGhpcyBtZXNzYWdlIGludGVudGlvbmFsbHkgcG9pbnRsZXNz
      Yes, I'm still working on AI::NNEasy. Actually I had stopped working on it to speed up another project, but this week I got back to the photo analysis project, where I have plans to build a new NN for the new vector format for the shapes in the image. Just send me an e-mail with your new things and I will put them in NNEasy and write the XS functions.

      Graciliano M. P.
      "Creativity is the expression of liberty".

Re: AI::NNEasy to quickly set up a Neural Network using just Perl and XS.
by jplindstrom (Monsignor) on Feb 22, 2005 at 21:54 UTC
    I'm a complete newbie when it comes to AI, so when you write "The project will be used to analyze texts on complex photos and identify them automatically.", what is it that you do here?

    Can you explain a little about the problem you try to solve, and how you use the NN to solve it?

    /J

Re: AI::NNEasy to quickly set up a Neural Network using just Perl and XS.
by keymon (Beadle) on Jan 19, 2005 at 15:25 UTC
    Just a minor nit: I tried installing AI::NNEasy via CPAN, and it complained about "Class::HPLOO" not being found. Could you put it in as a dependency, so that CPAN will download and install Class::HPLOO first (if not installed)?
      Actually this is a bug in CPAN.pm, since it's already there:
      use Class::HPLOO::MakeMaker ;

      WriteMakefile(
        'NAME'         => 'AI::NNEasy' ,
        'VERSION_FROM' => 'lib/AI/NNEasy.pm' ,
        'PREREQ_PM'    => {
          'Class::HPLOO' => 0.21 ,
          'Inline'       => 0.44 ,
        } ,
        ($] >= 5.005 ?
          ( ABSTRACT_FROM => 'lib/AI/NNEasy.pm',
            AUTHOR        => 'Graciliano M. P. <gmpassos@cpan.org>' )
          : ()
        ),
      );

      Graciliano M. P.
      "Creativity is the expression of liberty".

        How do you expect CPAN.pm to get past:

        use Class::HPLOO::MakeMaker ;

        to be able to run the code that tells it that it needs Class::HPLOO?

        I looked, and CPAN.pm parses the Makefile so you should try including a pre-built Makefile in your distribution (that will get replaced). I also checked CPANPLUS.pm and it also parses Makefiles to find prerequisites.

        I don't know if including a Makefile will be enough to fix this problem, but you shouldn't just blithely blame the circular dependencies you've created with Class::HPLOO on "a bug in CPAN.pm".

        Please uninstall your copy of Class::HPLOO, download your module(s) that use it from CPAN, and test that they install correctly.

        - tye        
