Merge columns from two different files
by amit1mtr (Initiate)
on Dec 15, 2012 at 11:23 UTC ( [id://1008972] )

amit1mtr has asked for the wisdom of the Perl Monks concerning the following question:

Replies are listed 'Best First'.
Re: Merge columns from two different files
by davido (Cardinal) on Dec 15, 2012 at 11:37 UTC

Re: Merge columns from two different files
by CountZero (Bishop) on Dec 15, 2012 at 14:08 UTC
      Please put your data inside <code> ... </code> tags. Otherwise we cannot see where the lines start and end.

      Update: I see that each block in both files has a 6_Fast_HLR_activations_(CP) item. How to deal with that? Take the average within each file? Compare it on a block by block basis (but then you get 3 output lines, one for each block)? Take the smallest from one file and the largest from the other file? Or something else entirely? A rough sketch of one of these options follows below.
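      For what it is worth, here is an untested sketch that guesses at the data layout (whitespace-separated "name value" lines, with blank lines between blocks) and takes the first of the options above: it averages any counter that appears more than once within a file, then prints one merged line per counter name. The file names, the split and the averaging are all assumptions to be adjusted once we can see the real data inside <code> tags.

      <code>
      #!/usr/bin/perl
      use strict;
      use warnings;

      # Read one file into a hash: counter name => list of values seen.
      # Assumes whitespace-separated "name value" lines; blank lines
      # between blocks are skipped.
      sub read_counters {
          my ($path) = @_;
          my %values;
          open my $fh, '<', $path or die "Cannot open $path: $!";
          while (my $line = <$fh>) {
              next unless $line =~ /\S/;              # skip blank block separators
              my ($name, $value) = split ' ', $line;
              next unless defined $value;             # ignore malformed lines
              push @{ $values{$name} }, $value;
          }
          close $fh;
          return \%values;
      }

      # Average repeated occurrences (e.g. 6_Fast_HLR_activations_(CP))
      sub average {
          my @vals = @_;
          my $sum = 0;
          $sum += $_ for @vals;
          return $sum / @vals;
      }

      my ($file1, $file2) = @ARGV;
      die "Usage: $0 file1 file2\n" unless defined $file2;

      my $left  = read_counters($file1);
      my $right = read_counters($file2);

      # Merge: one line per counter name, with the (averaged) value
      # from each file side by side; 'NA' when a file lacks the counter.
      my %names = (%$left, %$right);
      for my $name (sort keys %names) {
          my $v1 = $left->{$name}  ? average(@{ $left->{$name} })  : 'NA';
          my $v2 = $right->{$name} ? average(@{ $right->{$name} }) : 'NA';
          print join("\t", $name, $v1, $v2), "\n";
      }
      </code>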

      CountZero

      "A program should be light and agile, its subroutines connected like a string of pearls. The spirit and intent of the program should be retained throughout. There should be neither too little nor too much, neither needless loops nor useless variables, neither lack of structure nor overwhelming rigidity." - The Tao of Programming, 4.1 - Geoffrey James

      My blog: Imperial Deltronics
