http://www.perlmonks.org?node_id=955034

cnd has asked for the wisdom of the Perl Monks concerning the following question:

I want to run a dozen simultaneous fork()'d perl scripts (each with its own individual processor affinity on a multi-CPU host). I want all of them to have efficient access to a large pool of mostly-static shared memory.
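
The fork-and-pin part I think I can manage; roughly what I have in mind is the sketch below (assuming the CPAN module Sys::CpuAffinity for the pinning, and a placeholder do_work routine). It's the shared data underneath that I can't figure out.

    use strict;
    use warnings;
    use Sys::CpuAffinity;    # assumption: CPAN module for CPU pinning

    my $workers = 12;
    my @kids;

    for my $cpu (0 .. $workers - 1) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {                                 # child process
            Sys::CpuAffinity::setAffinity($$, [ $cpu ]); # pin this worker to one CPU
            do_work($cpu);
            exit 0;
        }
        push @kids, $pid;                                # parent remembers the pid
    }
    waitpid($_, 0) for @kids;

    sub do_work { my ($cpu) = @_; print "worker $$ on cpu $cpu\n" }  # placeholder body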

For example, I want every script to be able to do this:-

print $shared{'hugedata'};

and this:-

$shared{'totalrequests'}++;

but with only one copy of all that data living in memory.
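
For comparison, IPC::Shareable on CPAN already gives exactly that tied-hash syntax across fork()'d processes (a sketch using its documented tie interface, with placeholder data), but it freezes/thaws the values with Storable on every access, which is the overhead I'm trying to avoid:

    use strict;
    use warnings;
    use IPC::Shareable;

    # One SysV shared-memory segment, identified by a short "glue" string.
    tie my %shared, 'IPC::Shareable', 'data', { create => 1, destroy => 1 }
        or die "could not tie shared hash";

    $shared{hugedata}      = "...big mostly-static blob...";   # placeholder data
    $shared{totalrequests} = 0;

    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {                        # a child worker
        print $shared{hugedata};
        (tied %shared)->shlock;             # lock around the read-modify-write
        $shared{totalrequests}++;
        (tied %shared)->shunlock;
        exit 0;
    }
    waitpid($pid, 0);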

I specifically do not want to shuffle copies of stuff around, or to go serializing/unserializing everything all the time.

Is this possible?

If not - how hard do you think it would be to extract the variable-handling functions from the perl source, and compile them into some kind of .xs loadable module?