PerlMonks  

Re: A (memory) poor man's hash

by Paulster2 (Priest)
on Nov 21, 2003 at 18:20 UTC ( #308993 )


in reply to A (memory) poor man's <strike>hash</strike> lookup table.

Two things:

First:

They are simple to use

Being the newbie that I am, I still find hashes a little daunting. I understand what they are and basically how they are used, but implementing them is another story. I have successfully modified others' writs and made some simple ones of my own. I guess getting the info in is a lot easier than getting it out. So while others may feel that they are easy to use, I haven't gotten there yet. In other words, I guess all of this comes down to opinion. I know that hashes are the lifeblood of Perl, so I'd better start figuring them out.
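For what it's worth, here is about the smallest round trip there is: getting info into a hash and back out again. (A sketch only; the %ext table and its entries are made up for illustration.)

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Getting the info in: keys map to values.
my %ext = (
    pl => 'Perl script',
    pm => 'Perl module',
    t  => 'test file',
);

# Getting the info out: look up by key, or test with exists() first.
print "pl is a $ext{pl}\n";
print "pm is known\n" if exists $ext{pm};

# Or walk everything that went in:
for my $key (sort keys %ext) {
    print "$key => $ext{$key}\n";
}
```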

Second: While hashes may have a tendency to soak up a lot of memory, they don't hold it for long. I haven't done the math, but I would bet that if you put a dollar amount on the time wasted using other methods versus the bog you might impose on other programs running on the same system, you would find you save a ton of money using hashes. How many clock cycles do you waste waiting on inefficient approaches to do their thing?

Bottom line to my statement is by looking at the big picture, you may be surprised!

Paulster2

PS: I ++ you anyway, even though I don't agree. Mainly because I found it a stimulating writ.


Re: Re: A (memory) poor man's hash
by hardburn (Abbot) on Nov 21, 2003 at 18:43 UTC

    I still find hashes a little daunting.

    BrowserUK (if I may presume) was speaking in relative terms. Hashes are quite easy compared to (for example) Perl's advanced data structures, object system, or closures.

    While hashes may have a tendency to soak up a lot of memory, they don't hold it for long.

    The problem is that perl usually doesn't free memory back to the OS until the process exits (though it will reuse that memory for other things). This is particularly a problem for mod_perl, where the process sticks around as long as Apache is up (potentially years, if your sysadmin is negligent about security patches ;).

    ----
    I wanted to explore how Perl's closures can be manipulated, and ended up creating an object system by accident.
    -- Schemer

    : () { :|:& };:

    Note: All code is untested, unless otherwise stated

      MaxRequestsPerChild... or maybe Apache::SizeLimit

      ------------
      :Wq
      Not an editor command: Wq
      ...(though it will reuse that memory for other things). This is particularly a problem for mod_perl, where the process...

      One approach that I always use when a request is going to be resource-consuming is to let the child process die after the request is served. You can do this with $r->child_terminate. The extra CPU for forking a new child process is a lot less than the CPU you would lose when the server starts swapping.

      Liz
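      In handler form, that might look like the sketch below (untested, in keeping with the sig above; My::HeavyHandler and build_huge_report() are hypothetical names for a mod_perl 1 setup):

      ```perl
      package My::HeavyHandler;
      use strict;
      use warnings;
      use Apache::Constants qw(OK);

      sub handler {
          my $r = shift;

          $r->content_type('text/plain');
          $r->send_http_header;
          $r->print( build_huge_report() );   # the memory-hungry part

          # Flag this child to exit once the response is out the door;
          # Apache forks a fresh, lean replacement, so the bloat never
          # lingers in the pool.
          $r->child_terminate;

          return OK;
      }

      1;
      ```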
