Re: processing large files

by voyager (Friar)
on Jul 04, 2001 at 01:34 UTC


in reply to processing large files

Your problem appears to be that the data file is too big to hold in memory. That would indicate you are reading the whole file into memory and then processing it. You need to switch to an algorithm that processes each line as it is read (if possible). If this is the case, post the code you have so far.
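
For reference, a minimal sketch of that line-at-a-time approach (the filename and the processing step are placeholders, not from the original post):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $file = 'records.dat';    # placeholder filename
    open my $fh, '<', $file or die "Can't open $file: $!\n";

    # Only the current line is held in memory, so the size of the file doesn't matter.
    while (my $line = <$fh>) {
        chomp $line;
        # ... process $line here ...
    }
    close $fh;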

I don't know about the -Duselargefiles switch, but it appears to refer to the size of the Perl script itself. Can anyone clarify?

Re: Re: processing large files
by filmo (Scribe) on Jul 04, 2001 at 22:44 UTC
    I agree. Since you indicated that it is a file of records (plural), it seems like there would be a way to read each record individually and accomplish your goals.
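
    A sketch of one way to do that in Perl: the input record separator $/ can be set to whatever delimits the records, so each read returns one record rather than one line. The blank-line separator ("paragraph mode") and the filename below are assumptions for illustration:

        #!/usr/bin/perl
        use strict;
        use warnings;

        my $file = 'records.dat';    # placeholder filename
        open my $fh, '<', $file or die "Can't open $file: $!\n";
        {
            local $/ = "";    # assume blank-line-separated records ("paragraph mode")
            while (my $record = <$fh>) {
                chomp $record;
                # ... process one record at a time; memory use stays bounded ...
            }
        }
        close $fh;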

    In fact, can someone give an example of the need to read a >2 GB file and process it as a single chunk -- binary files notwithstanding?
    --
    Filmo the Klown

Re: Re: processing large files
by Anonymous Monk on Jul 07, 2001 at 01:08 UTC
    I think I am reading the file line by line. Here is some code from a test script I am trying to get to work:

        open(IN_FILE, "cat $fileName |") or die "Can't cat input file: $!\n";
        while (<IN_FILE>) {
            chomp;    # strip the trailing newline from $_
            # ... some processing code ...
        }


    any ideas?

    -E
