Re^5: Huge perl binary file
by perl-diddler (Friar) on Aug 01, 2012 at 06:21 UTC
That depends on what the module is... (and what you mean by "best").
In your example above, you 'use' all the modules, which reads them in up front, but then you sleep in a while loop, never touching those modules again.
Perl code isn't like binary modules, since it can be executed, modified, and executed again.
It is possible, but I very much doubt that all of the read-only text parts of a module are stored in one area where they could be mapped to a read-only, copy-on-write memory segment.
I'd say your best and maybe easiest bet for keeping your module copies in memory is to create a dir in /dev/shm/. (I needed a space to store some tmp info that I could examine; later, shm's usage was removed and I used straight pipes, but I had a master process that forked off 'n' copies of itself to do queries on a large file list in rpm.
I wanted to let them all dump their results into tmp files and exit when done -- that way the parent wouldn't have to multiplex the streams, which would have created contention in my code, with the parent and children fighting over the lock.
So instead, I created a tmpdir in /dev/shm for the tmp files -- no memory contention... and the great thing was I could examine all of the intermediate results!)
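A minimal shell sketch of that pattern (the worker count and filenames are illustrative, not the original rpm-query code): each forked worker writes its results to its own file under a tmpdir, and the parent just waits and reads, so nothing contends for a shared pipe or lock:

```shell
# Workers dump results into per-worker tmp files; parent collects afterward.
# mktemp -d is used so the sketch runs anywhere; on Linux you'd pass -p /dev/shm
# to put the dir on the RAM-backed tmpfs.
TMPD=$(mktemp -d)
for i in 1 2 3; do
  (
    # ... each child would do its real query here ...
    echo "result from worker $i" > "$TMPD/out.$i"
  ) &
done
wait                          # parent blocks until every child has exited
results=$(cat "$TMPD"/out.*)  # the intermediate files are all inspectable
echo "$results"
rm -r "$TMPD"
```

Because each child owns its own output file, there is no locking at all; the parent only touches the files after `wait` returns.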
So -- if you REALLY need to keep something in memory -- put a tmp dir in there and create a perl /lib tree with your needed modules.
On my machine, /usr/lib/perl5 -- ALL OF IT (vendor, site, and a few archived previous releases) -- only takes up ~446M. That's less than .5G; on a modern machine, not a major dent... depends on how important it is to keep things in memory!