PerlMonks  

Re: printout of large files. loop vs. flush at once

by BrowserUk (Patriarch)
on Apr 18, 2013 at 17:19 UTC ( [id://1029391] )


in reply to printout of large files. loop vs. flush at once

I was wondering if there is any real difference between printing line by line vs. flushing out 1 big array, or multiline string.

Accumulating the lines in memory will be considerably slower.

Each time you concatenate a new line:

  1. A new chunk of memory, the size of the existing accumulation plus the new line, will need to be allocated.

     As the accumulation gets larger, this will more and more frequently involve going out to the OS to have the process' virtual allocation extended.

  2. Then all the data from the existing accumulation plus the new line will be copied to the new allocation.
  3. Then the existing allocation will be freed.

I.e. by the time you've accumulated a 1000-line file, the first line will have been copied 1000 times, the second 999 times, and so on: the total copying grows quadratically with the number of lines.
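The two approaches can be compared directly with the core Benchmark module. This is a minimal sketch, not from the original post; line counts and iteration counts are illustrative, and note that modern perls over-allocate growing strings, so the measured gap may be smaller than the worst-case analysis above suggests:

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);
use File::Temp qw(tempfile);

my @lines = map { "line $_ of some output\n" } 1 .. 50_000;

cmpthese( 5, {
    accumulate => sub {
        my ( $fh, $name ) = tempfile( UNLINK => 1 );
        my $buf = '';
        $buf .= $_ for @lines;       # grow-and-copy on every append
        print {$fh} $buf;            # one big write at the end
        close $fh;
    },
    line_by_line => sub {
        my ( $fh, $name ) = tempfile( UNLINK => 1 );
        print {$fh} $_ for @lines;   # PerlIO buffers and flushes in pages
        close $fh;
    },
} );
```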

In addition, once the entire thing has been accumulated and you come to write it to the filesystem, the file cache will likely need to ask the OS for enough cache memory to hold it; that in turn might force the OS to swap out (parts of) other running processes to accommodate it.

When writing line by line, the data will first be written to cache memory in pages and only queued for flushing to disk when a page is full; meanwhile, another page from the existing cache pool will be used to buffer subsequent output allowing your process to continue full speed whilst the flush to disk happens in parallel.

We are seeing cases of slowdown when printing out files

The first question you'll need to answer before you'll get sensible answers is: how are you detecting and measuring this slowdown?

With that information it might be possible to work out what is causing it.

It is doubtful whether there is anything you can do in your Perl code to prevent or alleviate this problem; but with more information, there might be some useful advice forthcoming.

E.g. What OS/version? What filesystem? How much data is being written? Are you running one instance of your program at a time, or many concurrently? What else is using the file server? How fast is the network between you and the fileserver?
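One way to start answering the measurement question is to timestamp the writes themselves; Time::HiRes (core since 5.8) is enough to show whether individual prints occasionally stall. A sketch, with an illustrative threshold, not a prescription:

```perl
use strict;
use warnings;
use Time::HiRes qw(time);
use File::Temp qw(tempfile);

my ( $fh, $path ) = tempfile( UNLINK => 1 );

my $worst = 0;
for my $n ( 1 .. 10_000 ) {
    my $t0      = time;
    print {$fh} "record $n\n";
    my $elapsed = time - $t0;
    $worst = $elapsed if $elapsed > $worst;

    # A print taking milliseconds rather than microseconds usually
    # marks a buffer flush hitting a slow filesystem or network share.
    warn sprintf "slow print at record %d: %.3fs\n", $n, $elapsed
        if $elapsed > 0.010;    # illustrative threshold
}
close $fh or die "close failed: $!";
printf "worst single print: %.6fs\n", $worst;
```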


With the rise and rise of 'Social' network sites: 'Computers are making people easier to use everyday'
Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
"Science is about questioning the status quo. Questioning authority".
In the absence of evidence, opinion is indistinguishable from prejudice.
