Re^4: Splitting a Blocked file in Round Robin into smaller files

by BrowserUk (Patriarch)
on Dec 14, 2015 at 17:39 UTC


in reply to Re^3: Splitting a Blocked file in Round Robin into smaller files
in thread Splitting a Blocked file in Round Robin into smaller files

Would the flush happen regardless?

Wrong question. The one you should be asking yourself is: why do you feel the need to defeat the whole purpose of buffered IO?

Is your hardware so unreliable, or your code so flaky?

Besides which, your efforts are of limited benefit, as every modern OS also buffers file writes in the system cache anyway.

In the very rare circumstances where you have a real reason to avoid buffered IO, why not just set autoflush on the file handle with IO::Handle::autoflush()?
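
A minimal sketch of that approach (the file name and record are illustrative, not from the OP's code):

    use strict;
    use warnings;
    use IO::Handle;                # loads autoflush (implicit on modern Perls)

    open my $fh, '>', 'part1.txt' or die "open part1.txt: $!";
    $fh->autoflush(1);             # flush the PerlIO buffer after every print on this handle

    print {$fh} "a record\n";      # reaches the OS immediately; no manual flush needed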

Or do it manually with IO::Handle::flush() within the loop.
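
Something like this, with the file names standing in for whatever the loop actually reads and writes:

    use IO::Handle;

    open my $in,  '<', 'input.txt'  or die "open input.txt: $!";
    open my $out, '>', 'output.txt' or die "open output.txt: $!";

    while (my $line = <$in>) {
        print {$out} $line;
        $out->flush;               # push this line out of the PerlIO buffer right away
    }
    close $out or die "close: $!";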

Forcing the system to keep opening and closing the files in order to achieve flushing, all for very limited benefit and no good reason, is very silly and hugely expensive.
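
For contrast, a buffered round-robin split opens each output file exactly once and lets close flush whatever remains in the buffers. A sketch (the handle count, file naming, and input file are mine, not the OP's):

    # Open each output file exactly once, write round-robin, close once at the end.
    my $n = 4;                                   # number of output files; illustrative
    my @out = map {
        open my $fh, '>', "chunk$_.txt" or die "open chunk$_.txt: $!";
        $fh;
    } 0 .. $n - 1;

    open my $in, '<', 'blocked.dat' or die "open blocked.dat: $!";
    my $i = 0;
    while (my $line = <$in>) {
        print { $out[ $i++ % $n ] } $line;       # plain buffered writes
    }
    close $_ or die "close: $!" for @out;        # each close flushes that file's buffer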


With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority". I knew I was on the right track :)
In the absence of evidence, opinion is indistinguishable from prejudice.

Replies are listed 'Best First'.
Re^5: Splitting a Blocked file in Round Robin into smaller files
by KurtSchwind (Chaplain) on Dec 15, 2015 at 20:33 UTC

    I'm not sure it's the wrong question, or even a question of flaky code or buggy hardware per se, but I do concede the points that follow: I could call autoflush, or call flush manually.

    While it's true that there is a degree of buffering by the OS and even by the DASD, you can still eliminate one degree of buffering where it's applicable. I'm used to writing system logs for audit and government organizations, where the accepted pattern is to flush often. I get that for this example it's probably overkill, especially given that the input file isn't destroyed or altered as it's processed.
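
    For what it's worth, in that audit-log setting autoflush alone only empties Perl's buffer; getting past the OS cache mentioned above takes an fsync, which IO::Handle exposes as sync. A sketch, with the log path and record entirely illustrative:

        use IO::Handle;

        open my $log, '>>', '/var/log/audit.log' or die "open: $!";
        $log->autoflush(1);              # empty Perl's buffer after every print

        print {$log} "2015-12-15T20:33:00Z action=split status=ok\n";
        $log->sync or die "fsync: $!";   # force the record through the OS cache to disk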

    --
    “For the Present is the point at which time touches eternity.” - CS Lewis
