PerlMonks  

Re^3: Highly efficient variable-sharing among processes

by zwon (Abbot)
on Aug 29, 2016 at 20:14 UTC (node #1170740)


in reply to Re^2: Highly efficient variable-sharing among processes
in thread Highly efficient variable-sharing among processes

Shared memory is a way of interprocess communication

Replies are listed 'Best First'.
Re^4: Highly efficient variable-sharing among processes
by BrowserUk (Pope) on Aug 29, 2016 at 20:51 UTC

    Correction:

    Shared memory can be a way of interprocess communication

    But the processes sharing memory do not have to communicate anything between them; the memory remains shared and readable by one without the other being in any way aware of the read.

    To both processes, the memory is part of their own address space, and neither need communicate with the other in order to use it.

    No communication; no IPC.
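    The point can be sketched concretely. Here is a minimal sketch (in Python for brevity, though this is a Perl forum; it assumes a Unix-like system, since os.fork is POSIX-only): the parent maps an anonymous shared page, fills it once, and the child reads it. No pipe, socket, or signal is involved, and the parent is never told that the read happened.

```python
import mmap
import os

# Anonymous mapping; on POSIX the default flags are MAP_SHARED,
# so the pages survive fork() as genuinely shared memory.
buf = mmap.mmap(-1, 4096)
buf[:12] = b"lookup table"   # parent fills the memory once

pid = os.fork()
if pid == 0:
    # The child reads the shared page directly.  The parent receives
    # no message, signal, or notification that this read occurred.
    assert buf[:12] == b"lookup table"
    os._exit(0)
os.waitpid(pid, 0)
```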


    With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority". I knew I was on the right track :)
    In the absence of evidence, opinion is indistinguishable from prejudice.
      I think what is being discussed is copy-on-write. If the forked process doesn't write to the data, no copy is made; many processes can read the same data without separate copies ever being created.
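      A short sketch of the copy-on-write behaviour described above (Python, assuming a Unix-like system): the parent builds a lookup table once, forks, and the child performs lookups with no up-front copy of the data. One caveat for CPython specifically: reference-count updates dirty some pages, so the sharing is not perfect, but pages the child never touches stay shared.

```python
import os

# Parent builds the table once; after fork() the child sees the same
# physical pages until either side writes to them (copy-on-write).
table = {i: i * i for i in range(100_000)}

pid = os.fork()
if pid == 0:
    # Pure lookups: the kernel makes no up-front copy of `table`.
    assert table[42] == 1764
    os._exit(0)
os.waitpid(pid, 0)
```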

      PS: I looked on the internet for a machine of this size. It is indeed possible to get a 198 GB memory machine from Dell; perhaps specialty vendors offer more. But even if one machine can do it, a "hot fail" backup machine would seem to be necessary. I still think this thing is so big and mission-critical that an N+1 network would be better.

        I think what is being discussed is copy-on-write.

        Why do you think that? And who do you think is discussing that?

        The OP mentions only "*other* processes to be able to do lookups". No mention of anybody writing to anything, by him or anyone else.


      But the processes sharing memory do not have to communicate anything between them; the memory remains shared and readable by one without the other being in any way aware of the read.

      One process has to fill the memory with data in order for the other to read it. The fact that it isn't aware of the reads is not significant; communication doesn't have to be two-way. My TV can receive a signal without making the TV tower aware of it, and it's still communication.
