Re^2: Perl always reads in 4K chunks and writes in 1K chunks... Loads of IO!

by ChOas (Curate)
on Jan 01, 2006 at 11:48 UTC ( [id://520261]=note: print w/replies, xml ) Need Help??


in reply to Re: Perl always reads in 4K chunks and writes in 1K chunks... Loads of IO!
in thread Perl always reads in 4K chunks and writes in 1K chunks... Loads of IO!

This:

open DF, "test.txt" || die "Can't read 'test.txt': $!\n"

Does not do what you think it does.

The || binds tighter than the comma, so it ties itself to the string "test.txt", which is always true, and not to the return value of the open. The die can never fire.

This:
open(DF, "test.txt") || die "Can't read 'test.txt': $!\n"

or:

open DF, "test.txt" or die "Can't read 'test.txt': $!\n"

(or binds less tightly than ||, so it applies to the result of the whole open call)

Would accomplish what you want.


GreetZ!,
    ChOas

print "profeth still\n" if /bird|devil/;

Replies are listed 'Best First'.
Re^3: Perl always reads in 4K chunks and writes in 1K chunks... Loads of IO!
by zebedee (Pilgrim) on Jan 03, 2006 at 21:01 UTC
    Why are you measuring under Windows to see what will happen on Unix?
    If you've only got one machine to play with, why not boot off a LiveCD (like Knoppix) and measure your code (or a key subset) under Linux?
    Might not be the same OS your ISP is using, but closer to Unix than Windows?
    Might make absolutely no difference, but at least you might be a bit closer to comparing apples to apples rather than apples (Unix) to oranges (Windows) ...
