Re^2: selecting columns from a tab-separated-values file

by sundialsvc4 (Abbot)
on Jan 22, 2013 at 16:04 UTC ( [id://1014709] )


in reply to Re: selecting columns from a tab-separated-values file
in thread selecting columns from a tab-separated-values file

This node falls below the community's threshold of quality.

Re^3: selecting columns from a tab-separated-values file
by Lotus1 (Vicar) on Jan 24, 2013 at 16:47 UTC

    It turns out that BrowserUK's approach was an order of magnitude quicker than the other approaches presented here. I took the liberty of coding up Sundialsvc4's suggestion of buffering a few hundred thousand lines worth of data before printing them out to a file. I used array refs since there was no reason to use hash refs.

    I used a PowerShell script to time the different approaches.

    echo 'BrowserUK 1024' > bench.txt
    Measure-Command { .\buk.bat } >> bench.txt
    echo 'BrowserUK 2048' >> bench.txt
    Measure-Command { .\buk2048.bat } >> bench.txt
    echo 'Sundial approach by Lotus1' >> bench.txt
    Measure-Command { perl sundial.pl test.csv > dialout.csv } >> bench.txt
    echo spacebar >> bench.txt
    Measure-Command { .\sed -n "s/^\([^\t]*\t\)[^\t]*\t\([^\t]*\t\)[^\t]*\t[^\t]*\t\([^\t]*\)\t.*/\2\1\3/p" test.csv > spaceout.csv } >> bench.txt
    echo 'kenosis/choroba' >> bench.txt
    Measure-Command { perl kc.pl test.csv > kcout.csv } >> bench.txt
    echo 'mildside -- wrong order but testing anyway' >> bench.txt
    Measure-Command { .\cut '-f1,3,6' test.csv > cutout.csv } >> bench.txt

    Here are the results using a 1 GB test file on an idle server with 16 cores and 8 GB RAM (update: Windows Server 2008 R2):

    Approach                                     Minutes  Seconds  Milliseconds  Ticks
    BrowserUK 1024                               1        54       103           1141033211
    BrowserUK 2048                               1        55       124           1151241665
    Sundial approach by Lotus1                   21       53       28            13130283183
    spacebar                                     21       24       215           12842154788
    kenosis/choroba                              22       4        865           13248658887
    mildside -- wrong order but testing anyway   22       19       295           13392954883

    Here is the sundialsvc4 approach that I put together for the test:

    #! perl -sw
    use strict;

    my $count = 0;
    my @lines;

    while( <> ) {
        $count++;
        my @f = ( split /\t/, $_, 7 )[ 0, 2, 5 ];
        push @lines, \@f;
        if( $count >= 300000 ) {    # size of buffer
            $count = 0;
            print_buffer( \@lines );
        }
    }
    print_buffer( \@lines );

    sub print_buffer {
        my ($aref) = @_;
        foreach (@$aref) {
            print join( "\t", @$_ ) . "\n";
        }
        splice( @$aref );
    }

    Here is the Kenosis/choroba approach.

    #! perl -sw
    use strict;

    # kenosis/choroba approach
    while( <> ) {
        print join( "\t", ( split /\t/, $_, 7 )[ 0, 2, 5 ] ), "\n";
    }

      Great job there, Lotus1.

      I'm curious about the use of splice to clear the array in your code, as below:

          splice( @$aref );

      I would probably have used this instead:

          @$aref = ();

      Is splice faster or better in some other way?

        Thanks. I started by assigning the empty array, but I had a bug somewhere, so I stuck the splice in and got it working. I think I forgot the '@' on the first try but put it in with splice. I don't know which is faster, but it is only called a handful of times in this approach anyway.
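
        For anyone who wants to measure it, a minimal sketch with the core Benchmark module would look something like the following; the 10_000-element array and the -2 (run each sub for at least two CPU seconds) are arbitrary choices, and rebuilding the array inside each sub dominates the timing, so this only gives a rough comparison of the two ways of emptying the array.

            #! perl -w
            use strict;
            use Benchmark qw(cmpthese);

            cmpthese( -2, {
                splice_clear => sub {
                    my $aref = [ 1 .. 10_000 ];
                    splice( @$aref );    # remove every element in place
                },
                assign_empty => sub {
                    my $aref = [ 1 .. 10_000 ];
                    @$aref = ();         # replace contents with an empty list
                },
            } );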

Re^3: selecting columns from a tab-separated-values file
by BrowserUk (Patriarch) on Jan 22, 2013 at 17:15 UTC
    taking care that I/O doesn’t sneak up on you from the backside in the form of virtual-memory paging.

    You and your hobby horses. Virtual memory limits don't enter into it.

    When running, those two programs I posted use 2.7 MB and 3.2 MB respectively when using the 256-times-larger-than-standard read size that I suggest. Even if I increase that 10-fold to 10 MB each -- which slows the processing down -- they use a whole 12 MB & 16 MB. The processor that runs my heating system could handle that.

    I’m personally not sure that threads would help here

    I'm not sure that standing on one leg whilst drinking blood from a freshly decapitated chicken would help; so I don't mention it.

    And your instinctive, prejudiced guessing is wrong!

    Approximately 2/3rds of the throughput gains from my posted solution come exactly because the CPU-intensive part of the processing -- the splitting and joining of the records -- can run flat out (100% utilisation) on one CPU, whilst the first thread, which does the input, is instantly ready to respond to the completion of each read because it isn't having to perform any CPU-intensive processing on each record.
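
    Stripped to its bones, that reader-thread/worker-thread split looks something like the sketch below. This is a sketch only, not the posted solution: it uses the core threads and Thread::Queue modules, keeps the 0, 2, 5 column indices from this thread, and feeds the queue a line at a time, whereas the posted code reads in much larger blocks.

        use strict;
        use warnings;
        use threads;
        use Thread::Queue;

        my $q = Thread::Queue->new();

        # Worker thread: does the CPU-bound split/join and writes the output stream.
        my $worker = threads->create( sub {
            while ( defined( my $line = $q->dequeue() ) ) {
                print join( "\t", ( split /\t/, $line, 7 )[ 0, 2, 5 ] ), "\n";
            }
        } );

        # Main thread: pure input, so it is always ready for the next read.
        while ( defined( my $line = <> ) ) {
            $q->enqueue( $line );
        }
        $q->enqueue( undef );    # end-of-stream sentinel for the worker
        $worker->join();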

    push it onto an array (of hashrefs).

    The input is a stream; the output is a stream. What on earth do you need an array of hashrefs for?

    I suspect that you will be astonished at what just-this does for the program.

    No. I can pretty much call it without trying it. It will run 3 to 5 times slower than a simple:

    perl -F"\t" -anle"print join chr(9), @F[2,0,5]" in.tsv > out.tsv

    That is, instead of taking the OP's 5+ hours to run, it will take closer to 24 hours.

    Why? You'll never know unless you try it. And you won't do that.

    (And even if you get someone else to do it for you, you won't post the results, because it will show your 'advice' to be nothing more than groundless guessing.)
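
    For anyone who does want to try it, the one-liner above is roughly equivalent to the following script; this is just a sketch of what the -F"\t", -a, -n and -l switches expand to, not the exact code perl builds internally:

        use strict;
        use warnings;

        while ( defined( my $line = <> ) ) {
            chomp $line;                                  # -l: strip the input record separator
            my @F = split /\t/, $line;                    # -a with -F"\t": autosplit each record into @F
            print join( chr(9), @F[ 2, 0, 5 ] ), "\n";    # chr(9) is a tab; -l re-adds the newline
        }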


    With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.
    A reply falls below the community's threshold of quality.
