Re^2: Reading huge file content

by downer (Monk)
on Dec 06, 2007 at 14:02 UTC


in reply to Re: Reading huge file content
in thread Reading huge file content

In my experience, Perl has some problems reading very large amounts of data into a single variable. It may be better to load each line into an entry of an array (if you've got plenty of memory), or better yet, try to just process one line at a time, as in the sketch below. For very large jobs like this, I find it preferable to take one large job and break it into several small jobs that each just scan through the file. It's probably worthwhile to spend some time thinking about your algorithm to see if this is possible.
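A minimal sketch of the line-at-a-time approach; the filename 'data.txt' is hypothetical, and the per-line work is left as a placeholder:

    use strict;
    use warnings;

    my $file = 'data.txt';    # substitute your real data file

    open my $fh, '<', $file or die "Cannot open $file: $!";
    while (my $line = <$fh>) {
        chomp $line;
        # ... do the per-line work here instead of slurping the whole file ...
    }
    close $fh;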

Replies are listed 'Best First'.
Re^3: Reading huge file content
by dsheroh (Monsignor) on Dec 06, 2007 at 15:23 UTC
    or better yet, try to just process a line at a time.

    Indeed. Especially since even reading this much data (never mind processing it) will take considerable time. Reading one line, processing it, then reading the next also lets you keep a progress counter on disk or on the screen with the number of bytes successfully read and processed (obtained using tell), so that if the process dies partway through, it can pick up where it left off when restarted instead of going back to the beginning of the file and repeating what's already been done.
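    A rough sketch of that checkpointing idea; the data file 'data.txt' and checkpoint file 'progress.txt' are hypothetical names, and rewriting the checkpoint on every line is only for illustration (you'd likely do it every N lines):

        use strict;
        use warnings;

        my $data = 'data.txt';      # hypothetical data file
        my $ckpt = 'progress.txt';  # hypothetical checkpoint file

        open my $fh, '<', $data or die "Cannot open $data: $!";

        # On restart, skip everything already processed.
        if (-e $ckpt) {
            open my $cf, '<', $ckpt or die "Cannot read $ckpt: $!";
            my $pos = <$cf>;
            close $cf;
            if (defined $pos) {
                chomp $pos;
                seek $fh, $pos, 0 or die "Cannot seek to $pos: $!";
            }
        }

        while (my $line = <$fh>) {
            # ... process $line here ...

            # Record the byte offset reached so far (tell), so a crash
            # lets us resume from this point rather than the start.
            open my $cf, '>', $ckpt or die "Cannot write $ckpt: $!";
            print {$cf} tell($fh), "\n";
            close $cf;
        }
        close $fh;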
