PerlMonks
Multiple Perl files sharing a single socket - is it possible/sensible?

by ljamison (Sexton)
on Dec 01, 2015 at 21:52 UTC ( [id://1149083] )

ljamison has asked for the wisdom of the Perl Monks concerning the following question:

Greetings Monks! I'm new as a user but not new to PM or programming (it has helped countless times in recent weeks during some tough projects!) and I am hoping that the wealth of knowledge here can assist with a problem I can't seem to wrap my head around!

I am trying to create a workflow of sorts which uses socket connections to relay extracted MySQL data from localhost to a LAN server and vice versa. I was successfully able to create each individual .pl file (8 files in all) and extract information through each file as necessary.

The part I am stuck on concerns the actual socket connection used to relay the data to the server. My concern is creating a bottleneck if (in the worst case) all 8 files were to try sending data over the socket at the same time. Is it possible/sensible to create a separate file just containing socket information and allow that socket file to handle the relay for all files? If so, how could it be achieved? Is there another method that makes more sense?


Replies are listed 'Best First'.
Re: Multiple Perl files sharing a single socket - is it possible/sensible?
by NetWallah (Canon) on Dec 02, 2015 at 00:33 UTC
    Your process and communication structure is not clear, calling for more questions than answers.

    Are the 8 perl files running in separate perl instances (independent), or multiple threads of a parent process ? Do they open separate socket handles ?

    Do they (independently, in parallel?) send data to the same end point(s)?

    How is the receiving end dealing with this ?

    Is data sent via a PUSH or PULL or POLL ?

    To reduce complexity, a message queue based implementation may be necessary.
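    A message queue here needn't mean heavyweight middleware. As a rough illustration (not the poster's design; the directory layout and sub names are made up), a spool-directory queue in Perl can be as small as this:

    ```perl
    use strict;
    use warnings;
    use File::Copy qw(move);
    use Time::HiRes qw(time);

    # Enqueue: write to a temp name, then rename, so a reader never
    # sees a partially written message (rename is atomic on one filesystem).
    sub enqueue {
        my ($dir, $msg) = @_;
        my $name = sprintf '%.6f-%d', time(), $$;   # sortable, unique-ish
        open my $fh, '>', "$dir/$name.tmp" or die "open: $!";
        print {$fh} $msg;
        close $fh or die "close: $!";
        move("$dir/$name.tmp", "$dir/$name.msg") or die "move: $!";
        return "$name.msg";
    }

    # Dequeue: pick the oldest *.msg file, read it, delete it.
    sub dequeue {
        my ($dir) = @_;
        my @files = sort glob("$dir/*.msg");
        return undef unless @files;
        open my $fh, '<', $files[0] or die "open: $!";
        my $msg = do { local $/; <$fh> };   # slurp
        close $fh;
        unlink $files[0];
        return $msg;
    }
    ```

    The tmp-then-rename trick is what makes this safe with several writers and one reader on the same box.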


Re: Multiple Perl files sharing a single socket - is it possible/sensible?
by FreeBeerReekingMonk (Deacon) on Dec 02, 2015 at 00:31 UTC

    mmm.. so you are trying to create a server process that accepts data as fast as possible and saves it to files on the hard disk, to be picked up later by a slower process that processes them. And the problem is concurrency (no more than 8 clients connecting at the same time). Would copying with scp (with ssh keys, so you do not need a password, for example) not do? This is a bit of what MQ does.


    Now, usually you schedule each server a bit differently, so that the results are not all sent at the same time. If the data is not being processed right away, pre-compress it before sending. Maybe add a handshake to your protocol, for example: "can't talk now, busy, come back in 10 seconds" (5 seconds + a random number). Caveat: verification of received data, CRCs, retries, etc. will have to be added for robustness if you roll your own.
    There are many client/server examples, including many small webservers, like httpi. Study those. As you will see, the server listens on a port, but as soon as a connection is made it forks a child process (and it counts the children; when there are more than x, it denies new connections). The forked child, on its own connected socket, can receive data and process it at leisure, or, as you are asking, save it to a file (as $$.tmp, then rename it to $$.dat once closed, or move it into a directory from which a single process will pick it up).
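    A minimal sketch of that forking pattern (the port, spool directory, and child limit are placeholders, not anything taken from httpi):

    ```perl
    use strict;
    use warnings;
    use IO::Socket::INET;
    use File::Copy qw(move);
    use POSIX qw(WNOHANG);

    my $MAX_CHILDREN = 8;
    my $children     = 0;

    # Reap finished children so the count stays accurate.
    $SIG{CHLD} = sub {
        while ((my $pid = waitpid(-1, WNOHANG)) > 0) { $children-- }
    };

    # Write to $$.tmp first, then rename, so the slow picker-upper
    # process never sees a half-written file.
    sub save_message {
        my ($dir, $data) = @_;
        open my $fh, '>', "$dir/$$.tmp" or die "open: $!";
        print {$fh} $data;
        close $fh or die "close: $!";
        move("$dir/$$.tmp", "$dir/$$.dat") or die "move: $!";
    }

    sub run_server {
        my ($port, $dir) = @_;
        my $listen = IO::Socket::INET->new(
            LocalPort => $port, Listen => 10, Reuse => 1,
        ) or die "listen: $!";
        while (my $client = $listen->accept) {
            if ($children >= $MAX_CHILDREN) { close $client; next }  # shed load
            my $pid = fork // die "fork: $!";
            if ($pid) { $children++; close $client; next }           # parent
            local $/;                                                # child: slurp
            save_message($dir, scalar <$client>);
            exit 0;
        }
    }

    run_server(@ARGV) if @ARGV;   # e.g. perl server.pl 7070 /var/spool/ems
    ```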

    Scenario 2: many clients on a local LAN, and the server is far away. You want a proxy of some sort, and only the proxy makes contact with that remote server?

    Scenario 3: you actually mean "named pipes" when you talk about a "separate file just containing socket information", and you are asking whether multiple processes can write to it, to be read by one process and sent remotely. Be careful with the latter: named pipes get full, and other processes cannot write in between (or you get jumbled data). So you would just move the bottleneck to the local server. In this case, use local files and start an upload for each file. (Or do all 8 files need to arrive in a certain order on the server?)



    What I would choose:
    With the limited information given, I would go for running 8 scheduled scripts that extract different information from your single client (in parallel or sequentially), writing the data to a directory. Once all 8 are done, start an uploader that makes one connection to the server. The upload could use scp -C to protect the content while it travels the network. The destination file name is unique, for example containing the hostname. After sending the secure-copy file, a second file is sent containing the MD5, so the server can tell whether the data is corrupted, and send an email with an error message if something is wrong with that file. Doing it like this lets you follow each step and debug easily by looking inside directories. Keep it simple, but as always, TIMTOWTDI.
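    The MD5-sidecar step of that plan can be sketched like this (sub names and file layout are invented; the scp destination is a placeholder):

    ```perl
    use strict;
    use warnings;
    use Digest::MD5;

    # Write $data to $file plus an MD5 sidecar next to it, so the
    # receiving side can verify the transfer arrived intact.
    sub write_with_md5 {
        my ($file, $data) = @_;
        open my $fh, '>', $file or die "open: $!";
        print {$fh} $data;
        close $fh or die "close: $!";

        open my $in, '<', $file or die "open: $!";
        binmode $in;
        my $md5 = Digest::MD5->new->addfile($in)->hexdigest;
        close $in;

        open my $side, '>', "$file.md5" or die "open: $!";
        print {$side} "$md5  $file\n";    # md5sum-compatible format
        close $side or die "close: $!";
        return $md5;
    }

    # -C compresses in transit; ssh keys, not passwords, as suggested above.
    sub upload {
        my ($file, $dest) = @_;           # $dest like 'user@ems:/incoming/'
        system('scp', '-C', $file, "$file.md5", $dest) == 0
            or die "scp failed: $?";
    }
    ```

    On the server, `md5sum -c` against the sidecar file is then enough to decide whether to accept the upload or send the error mail.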

      Using <hr>'s instead of blank lines to separate your paragraphs makes your post horrible to read.


Re: Multiple Perl files sharing a single socket - is it possible/sensible?
by Anonymous Monk on Dec 01, 2015 at 23:02 UTC
    Is it possible/sensible to create a separate file just containing socket information and allow that socket file to handle the relay for all files? If so, how could it be achieved?
    That reminds me of a recent thread; something like that should probably do what you want.
Re: Multiple Perl files sharing a single socket - is it possible/sensible?
by ljamison (Sexton) on Dec 02, 2015 at 14:09 UTC

    I apologize for the lack of clarification in my original question. Let me explain a bit further and clarify to get rid of any confusion:

    This workflow will be used in a library environment. Our library is migrating to Koha, which is built on Perl and MySQL and will be hosted by a third party off-site. My task is to interface Koha (for the remainder of this reply, the ILS, for integrated library system) with a local on-site server (EMS) which runs a machine we use to store and pick books from (read: a replacement for bookshelves).

    As an example, a person would log in to the Koha library catalog and click "Place Hold" on something. That click would trigger a request file (request.pl), which collects contextual information about the request through MySQL extraction, then relays that information via socket to EMS for the request to be completed.

    EMS has been designed by a third-party to accept Socket (SOCK_STREAM) connections over TCP protocol. EMS is set up to process data in the form of single line "messages". A robust ACK/NAK mechanism is in place to guarantee success or failure of a message which is handled by EMS.

    Parts of the actual interface connection are confidential, however I will try to answer the posed questions as best I can in hopes it will assist with a solid solution.

    To answer NetWallah's questions:

    1. The EMS server handles incoming requests in a FIFO manner. Both sides (ILS and EMS) are to have a SEND and RECEIVE function. The EMS side already has both configured but it is up to me to create them for the ILS side.
      • "The SEND task begins by attempting to connect to a socket. Once a connection is established then messages can be sent. After a message is sent, SEND should wait for an acknowledgement before sending the next message. If any errors occur, the socket should be closed and attempt to establish a new connection (on the same IP:Port). If the connection fails, SEND should periodically attempt to establish a connection."
      • "The RECEIVE task begins by creating a socket, binding to the socket and listening for a connection. After receiving a valid connection request RECEIVE will accept the connection and can begin receiving/responding to messages. RECEIVE will be responsible for preserving the message before sending the acknowledgement. If the connection is lost, RECEIVE should return to listening for a socket connection."
    2. Currently, I have been trying to work with 8 separate Perl files (messages), however all 8 could potentially be combined into one if I am able to access the necessary sub as needed.
    3. Though it would not necessarily happen constantly, it is more likely than not that more than one message may be triggered simultaneously.
    4. All 8 messages would send their data to one end point (via IP Address) on only one port.
    5. The receiving end (EMS) also has a SEND and RECEIVE task.
      • For the EMS SEND task: if we make an on-site local change, EMS uses SEND to notify the ILS of the update. (The change is made locally; SEND opens a socket, sends a message to the ILS to notify it of the update, waits for a response saying the message was accepted/rejected, then closes the socket.)
      • For the EMS RECEIVE task: the ILS opens a socket and sends a request message for an item to EMS; EMS sends a response back to the ILS saying the request was accepted/rejected; EMS then sends a status message telling the ILS whether the item is available or not; the ILS responds that the status message was accepted/rejected, then closes the socket.
    6. And I'm impressed that you guessed right about it being message based!
    7. Lastly (to give a clearer picture), when the ILS RECEIVE task is active, it will make a SQL query to UPDATE the database.

    I hope that I provided enough clarification to clear up the confusion!
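      For what it's worth, the quoted SEND behaviour (connect, send one line, wait for an acknowledgement, reconnect on any error, and retry the connection periodically) might be sketched roughly like this. The host, port, retry interval, and ACK token are placeholders, since the real message formats are confidential:

      ```perl
      use strict;
      use warnings;
      use IO::Socket::INET;

      my $ACK = "ACK";   # placeholder; the real ack token is not public

      # Messages are single lines, so frame by stripping embedded
      # newlines and appending exactly one.
      sub frame_message { my ($msg) = @_; $msg =~ s/\n//g; return "$msg\n" }
      sub is_ack        { my ($reply) = @_; return defined $reply && $reply =~ /^\Q$ACK\E/ }

      sub send_loop {
          my ($host, $port, @messages) = @_;
          my $sock;
          while (@messages) {
              # (Re)connect, retrying periodically as the spec requires.
              until ($sock and $sock->connected) {
                  $sock = IO::Socket::INET->new(
                      PeerAddr => $host, PeerPort => $port, Proto => 'tcp',
                  ) or sleep 5;
              }
              my $msg = $messages[0];
              my $ok = eval {
                  print {$sock} frame_message($msg) or die "write: $!";
                  my $reply = <$sock>;               # wait for the ack
                  die "no ack" unless is_ack($reply);
                  1;
              };
              if ($ok) { shift @messages }            # acked: next message
              else     { close $sock; undef $sock }   # error: reconnect, resend
          }
          close $sock if $sock;
      }
      ```

      Because the unacknowledged message stays at the head of the list, a reconnect automatically resends it, which matches the "close and establish a new connection on any error" wording above.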

      It is entirely possible for the server side to be ready for multiple client connections. I would guess EMS has no problem if all eight of your client scripts connect to it at once. Is there some order which needs to be enforced on these messages, or are they fine to arrive in any order?

        I amend my statement regarding "all 8 messages at once", since most messages only go one way. Like so:

        • Response message - RECEIVE to SEND task on both sides (ILS and EMS will handle this message - contains code notifying that message was accepted or rejected by socket)
        • Ping message - SEND to RECEIVE task (this acts as the "keep alive" mechanism) Pings are sent from both sides at 40-second intervals.
        • Add message - ILS to EMS only
        • Delete message - ILS to EMS only
        • Return message - ILS to EMS only
        • Request message - ILS to EMS only
        • Status Check message - ILS to EMS only
        • Status message - EMS to ILS only

        As a visual representation, a typical transaction would follow this order:

        ILS                  EMS
        ---                  ---
        Request          -->
                         <-- Response (request accepted/rejected)
                         <-- Status (requested item available/unavailable)
        Response         -->
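        In code, the ILS side of that transaction might look roughly like this (the message texts are placeholders standing in for the confidential formats):

        ```perl
        use strict;
        use warnings;

        # One transaction, from the ILS side, following the message order above.
        sub place_request {
            my ($sock, $request) = @_;
            print {$sock} "$request\n";            # Request -->
            my $response = <$sock>;                # <-- Response (accepted/rejected)
            return undef unless defined $response && $response =~ /accept/i;
            my $status = <$sock>;                  # <-- Status (available/unavailable)
            print {$sock} "ACK\n";                 # Response --> (placeholder ack)
            return $status;
        }
        ```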

      Too bad you already started; there is some software that helps. For example, if you can stand the way Dancer2 is programmed (node.js for Perl), Dancer2::Plugin::Queue will let you receive many messages and put them into a single queue, which can then be polled and processed.

      Then there is Message Queue in Perl and some nice abstractions on CPAN, maybe worth looking into.
      I noted no encryption requirements...

        That is correct...there is no encryption on these messages.

        I would be open to software solutions to the problem, except that it would be more cumbersome than necessary to handle and maintain both the MySQL data extraction AND the message socket. I would be able to do it if we were hosting this Koha instance locally, but it will be hosted by a third party.

Re: Multiple Perl files sharing a single socket - is it possible/sensible?
by ljamison (Sexton) on Apr 03, 2017 at 19:00 UTC

    Thank you for all the help on this problem! I have made progress with this issue. For completeness of nodes, I'm linking the solution to all of my related nodes. Please see the solution here.

Node Type: perlquestion [id://1149083]
Front-paged by Corion