
Re: processing large files

by voyager (Friar)
on Jul 04, 2001 at 01:34 UTC ( #93695=note )

in reply to processing large files

Your problem appears to be that the data file is too big. That suggests you are reading the whole file into memory and then processing it. You need to switch to an algorithm that processes each line as it is read (if possible). If that is the case, post the code you have so far.
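A minimal sketch of the line-at-a-time approach described above. The file name here is a stand-in (a small temp file is generated so the sketch is self-contained); the point is that the `while (<$fh>)` loop only ever holds one line in memory, no matter how large the file is:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Build a small demo file (stand-in for the real multi-gigabyte input).
my ($demo, $file) = tempfile();
print $demo "record $_\n" for 1 .. 5;
close $demo;

# Stream it: only one line is held in memory at a time.
open(my $in, '<', $file) or die "Can't open $file: $!\n";
my $count = 0;
while (my $line = <$in>) {
    chomp $line;
    $count++;          # ...real per-record processing goes here...
}
close $in;
print "processed $count lines\n";
```

By contrast, `my @lines = <$fh>;` (or reading with `$/` unset) slurps the entire file into memory at once, which is exactly what fails on very large inputs.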

I don't know about the -Duselargefiles switch, but it appears to refer to the Perl script size. Can anyone clarify?

Replies are listed 'Best First'.
Re: Re: processing large files
by filmo (Scribe) on Jul 04, 2001 at 22:44 UTC
    I agree. Since you indicated that it is a file of records (plural), it seems like there would be a way to read each record individually and accomplish your goals.

    In fact, can someone give an example of a genuine need to read a >2 GB file and process it as a single chunk -- binary files notwithstanding.
    Filmo the Klown

Re: Re: processing large files
by Anonymous Monk on Jul 07, 2001 at 01:08 UTC
    I think I am reading the file line by line. Here is some code from a test script I am trying to get to work:
    open(IN_FILE, "cat $fileName |") or die "Can't cat inputfile: $!\n";
    while (<IN_FILE>) {
        chop;
        # ......some processing code...
    }
    close(IN_FILE);

    any ideas?
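That loop is already line-by-line, so memory should not be the issue; the visible bug was a missing `) or` before `die`. A hedged sketch of an alternative: open the file directly rather than piping it through `cat`, which avoids a child process entirely and reads the same way (the file name and processing below are placeholders, with a temp file generated so the sketch runs on its own):

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

my ($out, $fileName) = tempfile();     # stand-in for the real input file
print $out "alpha\nbeta\n";
close $out;

# Open the file directly; no `cat` child process is needed, and
# <$fh> still delivers one line per read.
open(my $fh, '<', $fileName) or die "Can't open $fileName: $!\n";
my @seen;
while (my $line = <$fh>) {
    chomp $line;       # chomp strips only a trailing newline; chop removes any last char
    push @seen, $line; # ...some processing code...
}
close $fh;
print "@seen\n";
```

Note also that `chomp` is usually safer than `chop` here: `chop` blindly removes the last character, which mangles a final line that has no trailing newline.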

