Re^4: reduce like iterators

by Anonymous Monk
on Jan 09, 2024 at 08:28 UTC


in reply to Re^3: reduce like iterators
in thread reduce like iterators

my @b = @{ reduce { push @$a, $b if !@$a || $b ne $a->[-1]; $a } [], @a };
This is really clever and should be included in List::Util: https://rt.cpan.org/Ticket/Display.html?id=150773.
Note that I had to disambiguate reduce by adding a '+'.
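
For reference, a self-contained sketch of the idiom with the List::Util import in place (the sample data is illustrative; as the replies below show, importing reduce is what actually makes the block form parse):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use List::Util qw( reduce );

    # Collapse runs of adjacent duplicates: [] seeds the accumulator $a;
    # each element $b is kept only when the result is still empty or $b
    # differs from the last element kept.
    my @a = qw( 1 1 2 2 3 3 3 1 );
    my @b = @{ reduce { push @$a, $b if !@$a || $b ne $a->[-1]; $a } [], @a };

    print "@b\n";    # prints: 1 2 3 1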

Re^5: reduce like iterators
by jdporter (Paladin) on Jan 09, 2024 at 14:05 UTC
    I had to disambiguate reduce by adding a '+'.

    Did you?

      "Did you?"

      I did.

      Syntax error without "+":

      $ perl -MO=Deparse -e 'my @b = @{ reduce { push @$a, $b if !@$a || $b ne $a->[-1]; $a } [], @a };'
      syntax error at -e line 1, near "$b if"
      -e had compilation errors.

      Syntax OK with "+":

      $ perl -MO=Deparse -e 'my @b = @{ +reduce { push @$a, $b if !@$a || $b ne $a->[-1]; $a } [], @a };'
      my(@b) = @{do { push @$a, $b if not @$a or $b ne $a->[-1]; $a }->reduce([], @a);};
      -e syntax OK

      — Ken

        The issue isn't a missing '+' (which doesn't make the code parse correctly, as your Deparse output shows); the issue is that you didn't import reduce, whose prototype (&@) is what lets it take a bare block. Add

        use List::Util qw( reduce );

        But your note still shows the former (without the '+'), not the latter (with it). I guess I'm confused.
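
        A quick way to confirm this (command sketch, assuming List::Util is available): import reduce before Deparse runs, and the original spelling with no '+' should deparse as a plain function call and report syntax OK:

            $ perl -MList::Util=reduce -MO=Deparse -e 'my @b = @{ reduce { push @$a, $b if !@$a || $b ne $a->[-1]; $a } [], @a };'

        Note that the '+' version only compiles because Perl reads it as an indirect-object method call, do { ... }->reduce([], @a), as the Deparse output above shows; it would then die at runtime, since reduce is called as a method on an unblessed reference.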
