PerlMonks  

Re: Re: map-like hash iterator

by jdporter (Canon)
on Nov 06, 2002 at 20:51 UTC ( #210901=note )


in reply to Re: map-like hash iterator
in thread map-like hash iterator

Yeah... but if the hash has 10_000_000 keys, it's probably no better to have a list of 10_000_000 undefs than a list of 10_000_000 actual key values.

Maybe this instead:

    my $n = keys %$h;
    for ( my $i = 0; $i < $n; $i++ ) {
        local( $a, $b ) = each %$h;
        $c->();
    }

Replies are listed 'Best First'.
Re^3: map-like hash iterator
by Aristotle (Chancellor) on Nov 06, 2002 at 21:26 UTC

    The idea was to be able to use map to build the return list. Obviously that makes little sense in void context, which is all your for proposal can provide.
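    A minimal sketch of the map-like iterator under discussion (the name hashmap and its exact interface are assumptions for illustration, not the thread's actual code): the callback sees each pair in $a/$b, and its return values are collected just as map collects its block's results.

        use strict;
        use warnings;

        # Hypothetical map-like hash iterator: walks the hash with each,
        # exposes the current pair in $a/$b, and gathers the callback's
        # return values into the result list.
        sub hashmap {
            my ( $code, $hash ) = @_;
            my @result;
            while ( my ( $k, $v ) = each %$hash ) {
                local ( $a, $b ) = ( $k, $v );
                push @result, $code->();
            }
            return @result;
        }

        my %h = ( one => 1, two => 2, three => 3 );
        my @pairs = hashmap( sub { "$a=$b" }, \%h );
        # @pairs now holds "one=1", "two=2", "three=3" in hash order

    In list context this builds the full result list in memory, which is exactly the cost being argued about below.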

    undefs are actually less wasteful than actual values: my @x = (undef) x 10_000_000; consumes 130MB for me, while my @x = ('a' x 10) x 10_000_000; nearly hits the 500MB mark.

    Of course, if you're not in void context and actually intent on returning the resulting list from processing a 10,000,000 key hash, you'll have to be able to fit that in memory anyway. Since you're throwing around big chunks of data, memory can't be a huge concern, otherwise you'd be walking the hash manually and chewing the bits more carefully.

    You can't have your cake and eat it - you can't be using an iterator when you're concerned about memory usage.

    Makeshifts last the longest.

      Of course, if you're not in void context and actually intent on returning the resulting list from processing a 10,000,000 key hash, you'll have to be able to fit that in memory anyway.
      Not necessarily always the case, though. The callback routine might never return anything -- except one time when it returns one thing. A jillion-key hash in, a one-element list out.

      Just like map.


      UPDATE

      You can't have your cake and eat it - you can't be using an iterator when you're concerned about memory usage.
      That is patently false. In fact, the built-in hash iterator (each) is all about efficiency -- in both space and time.

      There is no reason why iterators can't be efficient.
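      For instance, a plain each loop holds only one key/value pair at a time no matter how large the hash is (the hash and the sum below are illustrative, not from the thread):

          use strict;
          use warnings;

          my %h = map { $_ => $_ * $_ } 1 .. 1000;

          my $sum = 0;
          # each advances the hash's internal iterator one pair per call,
          # so memory use stays flat regardless of hash size.
          while ( my ( $k, $v ) = each %h ) {
              $sum += $v;
          }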

        That would be grep ;-) Still, that can be pretty wasteful. Why iterate over all 10,000,000 records when the single one of interest is found at the very beginning? The iterators offer no early-bail mechanism.
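        To be fair, with the built-in each you can bail out the moment the interesting pair turns up; a sketch (the hash contents here are invented for illustration, and the void-context keys %h call just resets the internal iterator so later each calls start fresh):

            use strict;
            use warnings;

            my %h = ( a => 1, b => 2, needle => 42, c => 3 );

            my $found;
            while ( my ( $k, $v ) = each %h ) {
                if ( $k eq 'needle' ) {
                    $found = $v;
                    keys %h;    # reset the iterator before bailing early
                    last;
                }
            }
            # $found is 42, and we visited at most as many pairs as needed

        A map- or grep-style wrapper, by contrast, commits to visiting every pair before returning.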

        You have to look at the larger picture.

        Iterators mainly offer convenience. There are very few situations where iterators are useful under big efficiency concerns (be it memory or time), in the absence of lazy lists. Even so, you can't go wrong with the explicit loop construct.

        This ain't Ruby. :-) Perl 6 will, however, have lazy lists. (Is there anything Perl 6 won't fix? :-))

        Makeshifts last the longest.
