
How can I process large files efficiently?

by Anonymous Monk
on May 06, 2000 at 01:08 UTC ( #10432=perlquestion )

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question: (files)

I'm searching large (500 MB) log files. My CGI script usually takes 3-4 minutes to search the text. Are there any tricks (like splitting the log file, etc.) that could make it more efficient?

Originally posted as a Categorized Question.


Replies are listed 'Best First'.
Re: How can I process large files efficiently?
by devslashneil (Friar) on Jun 25, 2003 at 01:04 UTC

    You could use Tie::File to bind an array to the file, and then search the array without ever loading the whole file into memory.

    "The file is not loaded into memory, so this will work even for gigantic files."

    Hope this helps

    - Neil
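
    A minimal sketch of the Tie::File approach described above. The filename 'access.log' and the /ERROR/ pattern are placeholders; in the tied array each element is one line of the file, fetched from disk on demand rather than held in memory:

    ```perl
    use strict;
    use warnings;
    use Tie::File;

    # Bind @lines to the file; lines are read lazily, so memory use
    # stays small even for very large logs.
    tie my @lines, 'Tie::File', 'access.log'
        or die "Cannot tie access.log: $!";

    # Search the tied array like any ordinary array.
    my @matches = grep { /ERROR/ } @lines;
    print scalar(@matches), " matching lines\n";

    untie @lines;
    ```

    Note that Tie::File's strength is random access and in-place editing; for a single sequential search, a plain read loop is usually at least as fast.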

Re: How can I process large files efficiently?
by btrott (Parson) on May 06, 2000 at 02:05 UTC
    Here are some tips:
    • Are you loading the entire file into a variable, or processing it line by line? Process it line by line: you won't thrash memory, so the search runs faster.
    • Are you interpolating a variable into your search pattern? If the variable is the same each time through the loop, add the /o modifier so the regex is compiled only once instead of on every iteration.
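
    The two tips above can be sketched together. This is an illustrative example, not the poster's actual script; $term and 'access.log' are hypothetical stand-ins for the CGI parameter and log file. It uses qr// (available since Perl 5.005) to precompile the pattern once, which achieves the same effect as /o:

    ```perl
    use strict;
    use warnings;

    # Hypothetical search term, e.g. taken from a CGI parameter.
    my $term    = 'ERROR';
    my $pattern = qr/\Q$term\E/;   # compile once, outside the loop

    open my $fh, '<', 'access.log' or die "Cannot open access.log: $!";
    my $count = 0;
    while (my $line = <$fh>) {     # one line at a time: constant memory
        $count++ if $line =~ $pattern;
    }
    close $fh;
    print "$count matching lines\n";
    ```

    Reading line by line keeps memory use flat regardless of file size, and precompiling the pattern avoids paying the regex-compilation cost 500 MB worth of times.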
