Efficient shared memory - possible? how??
by cnd (Novice) on Feb 20, 2012 at 10:47 UTC
cnd has asked for the wisdom of the Perl Monks concerning the following question:
I want to run a dozen simultaneous fork()'d Perl scripts (each with its own individual processor affinity on a multi-CPU host). I want all of them to have efficient access to a large pool of mostly-static shared memory.
For example, I want every script to be able to do this:
but with only one copy of all that living in memory.
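One way something like this can work without any explicit sharing machinery is the kernel's copy-on-write behaviour of fork(): build the large structure once in the parent, then fork the workers and only read it. Below is a hedged, hypothetical sketch (the hash name and sizes are made up, assuming a Linux-like fork()); note the caveat that Perl's reference counting writes to a value's header even on read access, so it can dirty otherwise-shared pages and the sharing is imperfect in practice.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical example data: one large, mostly-static hash built
# once in the parent before forking.
my %big = map { $_ => "value-$_" } 1 .. 100_000;

my @pids;
for my $worker (1 .. 4) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: plain read access. The OS shares the parent's pages
        # copy-on-write; no serialization, no IPC module. (Perl's
        # refcount updates may still fault some pages to private copies.)
        my $val = $big{ $worker * 10_000 };
        exit 0;
    }
    push @pids, $pid;
}
waitpid( $_, 0 ) for @pids;
```

The trade-off is that this only shares data created before the fork and offers no cheap way for children to publish changes back, which is why it suits a mostly-static pool.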
I specifically do not want to shuffle copies of stuff around, or to go serializing/unserializing everything all the time.
Is this possible?
If not, how hard do you think it would be to extract the variable-handling functions out of the perl source and compile them into some kind of .xs loadable module?