PerlMonks
Re^3: map-like hash iterator

by Aristotle (Chancellor)
on Nov 06, 2002 at 21:26 UTC ( #210915=note )


in reply to Re: Re: map-like hash iterator
in thread map-like hash iterator

The idea was to be able to use map to build the return list. Obviously that makes little sense in void context, which is all your for proposal would be able to provide.

undefs are actually less wasteful than real values: my @x = (undef) x 10_000_000; consumes 130MB for me, while my @x = ('a' x 10) x 10_000_000; nearly hits the 500MB mark.

Of course, if you're not in void context and actually intent on returning the resulting list from processing a 10,000,000-key hash, you'll have to be able to fit that list in memory anyway. Since you're throwing around big chunks of data, memory can't be a huge concern; otherwise you'd be walking the hash manually and chewing the bits more carefully.
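For reference, walking the hash manually might look like this minimal sketch; the small %big_hash here is just a stand-in for the 10,000,000-key hash, and each() processes one pair at a time without ever building an intermediate list the size of the hash:

```perl
use strict;
use warnings;

# Stand-in for the huge hash.
my %big_hash = map { "key$_" => $_ * 2 } 1 .. 5;

# each() yields one (key, value) pair per call, so memory use
# stays constant no matter how many keys the hash holds.
my $sum = 0;
while ( my ( $k, $v ) = each %big_hash ) {
    $sum += $v;    # "chew the bits" pair by pair
}
print "$sum\n";    # 2+4+6+8+10 = 30
```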

You can't have your cake and eat it - you can't be using an iterator when you're concerned about memory usage.

Makeshifts last the longest.


Re: Re^3: map-like hash iterator
by jdporter (Canon) on Nov 06, 2002 at 21:32 UTC
    Of course, if you're not in void context and actually intent on returning the resulting list from processing a 10,000,000 key hash, you'll have to be able to fit that in memory anyway.
    Not necessarily always the case, though. The callback routine might never return anything -- except one time when it returns one thing. A jillion-key hash in, a one-element list out.

    Just like map.
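    A minimal sketch of that point: map's BLOCK can return the empty list for every key except the one of interest, so a huge hash in yields a one-element list out (the hash and key names here are made up for illustration):

```perl
use strict;
use warnings;

# Toy stand-in for the jillion-key hash.
my %h = ( a => 1, b => 2, needle => 42, c => 3 );

# Returning () contributes nothing to map's result list,
# so only the matching key produces an element.
my @hits = map { $_ eq 'needle' ? $h{$_} : () } keys %h;

print "@hits\n";    # prints: 42
```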


    UPDATE

    You can't have your cake and eat it - you can't be using an iterator when you're concerned about memory usage.
    That is patently false. In fact, the built-in hash iterator (each) is all about efficiency -- in both space and time.

    There is no reason why iterators can't be efficient.

      That would be grep ;-) Still, that can be pretty wasteful. Why iterate over all 10,000,000 records even when the single record of interest is found at the very beginning? The iterators do not offer any early bailing mechanism.

      You have to look at the larger picture.

      Iterators mainly offer convenience. There are very few situations where iterators are useful under big efficiency concerns (be it memory or time), in the absence of lazy lists. Even so, you can't go wrong with the explicit loop construct.

      This ain't Ruby. :-) Perl 6 will, however, have lazy lists. (Is there anything Perl 6 won't fix? :-))

      Makeshifts last the longest.

        Why iterate over all 10,000,000 records even when the single record of interest is found at the very beginning? The iterators do not offer any early bailing mechanism.
        False. die works just fine.
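        A minimal sketch of the die-based bail-out; iterate_hash() here is a hypothetical callback-style iterator invented for illustration, not a real module:

```perl
use strict;
use warnings;

# Hypothetical helper: call $cb once per (key, value) pair.
sub iterate_hash {
    my ( $href, $cb ) = @_;
    while ( my ( $k, $v ) = each %$href ) {
        $cb->( $k, $v );
    }
}

my %h = ( a => 1, needle => 42, b => 2 );

# die from inside the callback aborts the iteration immediately;
# the surrounding eval catches the exception carrying the result.
eval {
    iterate_hash( \%h, sub {
        my ( $k, $v ) = @_;
        die { value => $v } if $k eq 'needle';
    } );
};
my $result = ref $@ eq 'HASH' ? $@->{value} : undef;
print defined $result ? "$result\n" : "not found\n";    # prints: 42
```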
        Iterators mainly offer convenience.
        And this is Perl. Sounds like a perfect match.

        Anyway, if you want to argue about it, let's meet in the chatterbox some time. Thank you for your comments.
