Ok. I've tried it and it works great, but with one key problem. It appears that when JSON encodes the parameters, it does so "by value". In other words, if my function call is supposed to take an argument by reference to a scalar, with the intent of modifying it, this method does not work.
Now is a good time to review perlipc.
The only way to modify values within a Perl process is to do it within that one Perl process. Either you use threads, or you set up an API to do that modification for you.
There are other approaches, like shared memory or debuggers, but until you've understood the limitations of IPC, it doesn't make sense to bring out the heavy guns.
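To see why an API (or threads) is needed, here is a minimal sketch showing that a forked child gets its own copy of the parent's data; nothing it assigns is visible back in the parent:

```perl
use strict;
use warnings;

my $counter = 1;

my $pid = fork();
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {
    # Child: this modifies the child's *copy* of $counter only.
    $counter = 99;
    exit 0;
}

waitpid($pid, 0);
print "parent still sees: $counter\n";   # prints 1, not 99
```

Whatever mechanism the processes use to talk (pipe, socket, shared memory), the actual modification of a Perl variable has to happen inside the process that owns it.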
Inter-process communication works with bytes -- values -- you cannot pass references between processes; you have to serialize them to bytes.
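A short sketch of what that serialization round-trip implies, using JSON::PP (in core since Perl 5.14): the receiving side gets a brand-new structure, so mutating it cannot touch the sender's original.

```perl
use strict;
use warnings;
use JSON::PP qw(encode_json decode_json);

my $original = { count => 1 };
my $bytes    = encode_json($original);   # what actually crosses the pipe/socket
my $copy     = decode_json($bytes);      # a brand-new structure on "the other side"

$copy->{count} = 99;                     # "modifying the argument"...
print $original->{count}, "\n";          # ...leaves the original untouched: 1
```

This is exactly the OP's symptom: the callee modified a perfectly good hashref, just not the one the caller was holding.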
That's exactly the point I was glossing over when I recommended a dispatch table at the receiving end and a JSON producer at the sending end.
The OP doesn't have to use JSON, but it's convenient and well understood. The point is that the OP cannot simply pass references around; he has to pass plain old data. So a JSON packet that looks something like this:
'{"action":"tupitar","params":["this","that","other"]}'
...would be easy to interpret at the receiving end. And the dispatch table avoids the mess of going directly to symbolic subrefs.
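A minimal sketch of that receiving end, assuming a hypothetical `tupitar` handler just for illustration: decode the packet, then look the action up in a dispatch table instead of calling a symbolic subref.

```perl
use strict;
use warnings;
use JSON::PP qw(decode_json);

# Dispatch table: action names map to real coderefs, so an attacker-supplied
# "action" can never name an arbitrary subroutine.
my %dispatch = (
    tupitar => sub { join '|', @_ },   # placeholder handler for the sketch
);

my $packet  = '{"action":"tupitar","params":["this","that","other"]}';
my $request = decode_json($packet);

my $handler = $dispatch{ $request->{action} }
    or die "unknown action: $request->{action}";

my $result = $handler->( @{ $request->{params} } );
print "$result\n";   # this|that|other
```

Unknown actions fail loudly at the lookup, which is the safety property the dispatch table buys over `&{$request->{action}}`.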
Things get ugly if he's got to send huge amounts of data to the subroutines. But even there, an additional layer of indirection can help; instead of sending data as a parameter, send a link to the data; a filename, a row ID from a database, or whatever. But sending a Perl reference is an indirection that will lose its magic as it gets passed around. ...just like you wouldn't pass a C pointer to local data from one process to another.
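The extra layer of indirection might look like this sketch (the `process_blob` action name is hypothetical): the sender writes the bulk data to a file and puts only the filename in the packet; the receiver opens that file itself.

```perl
use strict;
use warnings;
use JSON::PP qw(encode_json decode_json);
use File::Temp qw(tempfile);

# Sending end: park the big data in a file, ship only a link to it.
my ($fh, $path) = tempfile();
print {$fh} "lots of data\n" x 3;
close $fh;

my $packet = encode_json({ action => 'process_blob', params => [$path] });

# Receiving end: follow the link instead of expecting the data inline.
my $request = decode_json($packet);
my ($file)  = @{ $request->{params} };
open my $in, '<', $file or die "can't open $file: $!";
my $lines = () = <$in>;   # count the lines we received
print "received $lines lines via $file\n";
```

The same shape works with a database row ID or a URL; the point is that a filename survives serialization, while a Perl reference does not.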