Re^2: Adding the duplicate data using arrays

by changma_ha (Sexton)
on Aug 11, 2010 at 08:58 UTC ( [id://854273] )


in reply to Re: Adding the duplicate data using arrays
in thread Adding the duplicate data using arrays

This node falls below the community's threshold of quality. You may see it by logging in.

Re^3: Adding the duplicate data using arrays
by space_agent (Acolyte) on Aug 11, 2010 at 11:44 UTC

    Just to sum it up: here (.*) captures both columns, because * is a greedy quantifier and matches as much as it can, giving characters back only until the trailing number (\d+) can match.

use strict;
use warnings;

my %hash;    # accumulator: key = fruit name(s), value = running total
while (<DATA>) {
    m/(.*)\s+(\d+)/;
    $hash{$1} += $2;    # $1 is the fruit, $2 the number
}
for my $fruit (keys %hash) {
    print "$fruit $hash{$fruit}\n";
}
__END__
Apple Grape 100
Ginger Fry 200
Apple Grape 80
Ginger Banana 800
Ginger Fry 150
Ginger Banana 45
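    To make the greediness concrete, here is a minimal sketch contrasting greedy (.*) with non-greedy (.*?). The line "Fuji 4 Apple 100" is a made-up example with a digit in the middle, not data from this thread:

my $line = "Fuji 4 Apple 100";

# Greedy: .* grabs as much as possible, backing off only far enough
# for \s+ and \d+ to match, so the *last* number is captured.
my ($greedy, $last_num) = $line =~ /(.*)\s+(\d+)/;
print "$greedy => $last_num\n";    # prints "Fuji 4 Apple => 100"

# Non-greedy: .*? grabs as little as possible, so the *first* number
# that can satisfy \s+(\d+) is captured instead.
my ($lazy, $first_num) = $line =~ /(.*?)\s+(\d+)/;
print "$lazy => $first_num\n";     # prints "Fuji => 4"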
Re^3: Adding the duplicate data using arrays
by JavaFan (Canon) on Aug 11, 2010 at 09:14 UTC
    Ah, but you don't really have three columns. You just have two columns, with the first column being "Apple Banana", etc.

      Ok, JavaFan. If I have the data "Apple banana 100" and I split on "\s+", it will split "Apple" and "banana" apart... so how can I split in such a way that "Apple banana" is the key and "100" is the value? Give me some wisdom.

        my ($fruit, $value) = "Apple banana 100" =~ /^(.*\S)\s+([0-9]+)/;
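        Dropping that regex into the summing loop from the other reply gives something like this sketch (same sample data as above; everything up to the last non-space character before the trailing number becomes the key):

use strict;
use warnings;

my %total;
while (my $line = <DATA>) {
    # Skip any line that doesn't end in whitespace followed by a number.
    my ($fruit, $value) = $line =~ /^(.*\S)\s+([0-9]+)/ or next;
    $total{$fruit} += $value;
}
print "$_ $total{$_}\n" for sort keys %total;

__END__
Apple Grape 100
Ginger Fry 200
Apple Grape 80
Ginger Banana 800
Ginger Fry 150
Ginger Banana 45

        which prints "Apple Grape 180", "Ginger Banana 845", and "Ginger Fry 350".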
