http://www.perlmonks.org?node_id=10432

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question: (files)

I'm searching large (500 MB) log files. My CGI script usually takes 3-4 minutes to search the text. Are there any tricks (like dividing the log file, etc.) that could make it more efficient?

Originally posted as a Categorized Question.


Re: How can I process large files efficiently?
by devslashneil (Friar) on Jun 25, 2003 at 01:04 UTC

    You could use Tie::File to bind an array to a file, and then perform searches on the array without loading the file into memory.

    "The file is not loaded into memory, so this will work even for gigantic files."

    Hope this helps

    - Neil

Re: How can I process large files efficiently?
by btrott (Parson) on May 06, 2000 at 02:05 UTC
    Here are some tips:
    • Are you loading the entire file into a variable, or processing it line by line? Process it line by line: you'll avoid thrashing memory, and the search will be faster. (See the first sketch after this list.)
    • Are you interpolating a variable into your search regex? If the variable holds the same value on every pass through the loop, add the /o modifier so the pattern is compiled only once. (See the second sketch after this list.)
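
    A short sketch of the line-by-line tip, assuming a plain-text match (path and pattern are placeholders):

        use strict;
        use warnings;

        my $logfile = '/var/log/app.log';           # hypothetical path
        open my $fh, '<', $logfile or die "Cannot open $logfile: $!";

        # Only one line is held in memory at a time, so RAM use is
        # constant no matter how large the log grows.
        while (my $line = <$fh>) {
            print $line if $line =~ /ERROR/;        # hypothetical pattern
        }

        close $fh;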
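
    And a sketch of the /o tip, with the search term standing in for whatever the CGI script receives (the names here are illustrative):

        use strict;
        use warnings;

        my $term = shift @ARGV or die "usage: $0 PATTERN\n";
        open my $fh, '<', '/var/log/app.log' or die "Cannot open log: $!";

        # /o compiles the interpolated pattern once instead of on every
        # iteration; \Q...\E treats $term as literal text, not a regex.
        while (my $line = <$fh>) {
            print $line if $line =~ /\Q$term\E/o;
        }

        close $fh;

    Precompiling with qr// (my $re = qr/\Q$term\E/) achieves the same one-time compilation without /o's caveat that the variable must never change.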