PerlMonks
Re^4: Huge perl binary file

by MisterBark (Novice)
on Jul 13, 2012 at 04:40 UTC


in reply to Re^3: Huge perl binary file
in thread Huge perl binary file

So the best approach would be to always have a sleeping Perl process with the most frequently used modules loaded via use? :)
#!/usr/bin/perl
use ....;
use ....;
use ....;
while (1) { sleep(60); }
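For what it's worth, a process that merely sleeps only keeps its *own* copy of the modules resident; to let other work benefit from the preloading, the usual trick is to fork workers off the preloaded parent so the compiled module code is shared copy-on-write. A minimal sketch, where the heavy module is a placeholder for whatever you actually load:

```perl
#!/usr/bin/perl
# Sketch of a preloading parent ("zygote"): compile heavy modules once,
# then fork a child per job so the compiled code is shared copy-on-write.
use strict;
use warnings;

# use Some::Heavy::Module;   # hypothetical -- preload real dependencies here

sub spawn_worker {
    my ($job) = @_;
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child starts with everything the parent compiled already in memory.
        print "worker $$ handling job $job\n";
        exit 0;
    }
    return $pid;                # parent gets the child's pid back
}

my @pids = map { spawn_worker($_) } 1 .. 3;
waitpid $_, 0 for @pids;        # reap the workers
```

This is how preforking servers (and things like mod_perl) get cheap per-request startup: the expensive compilation happens once, before the fork.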

Replies are listed 'Best First'.
Re^5: Huge perl binary file
by mbethke (Hermit) on Jul 13, 2012 at 21:27 UTC
    That's basically what these "office quickstarter" thingies do, but it's not a good idea generally. If you have enough RAM, the stuff you want to load quickly will likely be in the buffer cache anyway. If you don't, forcing it to stay resident will just slow down other things.
Re^5: Huge perl binary file
by perl-diddler (Chaplain) on Aug 01, 2012 at 06:21 UTC
    That depends on what the module is...(and what you mean by "best")....

    In your example above, you 'use' all the modules, which reads them in initially, but then you sleep in a while loop, never touching those modules again.

    Perl code isn't like a binary module, since it can be executed, then modified and executed again.

    It is possible, but I very much doubt, that all of the read-only text parts of a module are stored in one area where they could be mapped to a read-only, copy-on-write memory segment.

    I'd say your best and maybe easiest bet would be to create a dir in /dev/shm/ and keep your module copies there. (I once needed a space to store some tmp info that I could examine later -- shm's usage was eventually removed in favor of straight pipes -- but I had a master process that forked off 'n' copies of itself to run queries on a large file list in rpm.

    I wanted to let them all dump their results into tmp files and exit when done, so the parent wouldn't have to multiplex the streams, which would have created contention in my code between the parent and children fighting over the lock.

    So instead, I created a tmpdir in /dev/shm for the tmp files -- no memory contention, and the great thing was I could examine all of the intermediate results!)

    So -- if you REALLY need to keep something in memory -- put a tmp dir in there and create a perl /lib tree with your needed modules.

    On my machine, /usr/lib/perl5 -- ALL OF IT (vendor, site, and a few archived previous releases) -- only takes up ~446M. That's less than 0.5G; on a modern machine, not a major dent. It depends on how important it is to keep things in memory!
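    The steps above can be sketched in a few lines. This is only an illustration of the /dev/shm idea, not a tested recipe: the source tree path and directory name are assumptions, and you'd point the copy at wherever your modules really live.

```perl
#!/usr/bin/perl
# Sketch of the /dev/shm trick: mirror a module tree into a tmpfs-backed
# directory and search it before the on-disk trees.
use strict;
use warnings;
use File::Path 'make_path';

sub setup_ram_lib {
    my ($dir) = @_;
    make_path($dir) unless -d $dir;     # create the RAM-backed lib dir
    unshift @INC, $dir;                 # search it before the disk trees
    return $dir;
}

if (-d '/dev/shm') {
    my $ram_lib = setup_ram_lib("/dev/shm/perl-lib-$<");   # per-user dir
    # Populate it once (hypothetical source tree -- adjust to taste):
    # system('cp', '-a', '/usr/lib/perl5/site_perl', $ram_lib) == 0
    #     or die "copy into $ram_lib failed";
}
```

    Other scripts can then pick the tree up with `PERL5LIB=/dev/shm/perl-lib-UID` in the environment, or a matching `use lib` line.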
