in reply to Efficient processing of large directory
A side note first: Some operating systems are really, really bad with directories that large. If your client is interested in performance, they (or you, on their behalf) may want to do a bit of performance prototyping. Recent work on FreeBSD has greatly improved its large directory performance, for whatever that's worth.
For your problem, I see two options. The first is to use File::Find to locate all *.txt files and process them one by one. The second is to use opendir()/readdir()/closedir() to read the directory directly, filename by filename. Either approach spares you from holding the whole file list in a large temporary array.
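To make that concrete, here's a minimal File::Find sketch. The directory name and the process_file() routine are placeholders for whatever your client's code actually does:

    use strict;
    use warnings;
    use File::Find;

    find(
        sub {
            return unless -f $_ && /\.txt\z/;    # plain *.txt files only
            process_file($File::Find::name);     # full path to the current file
        },
        '/some/dir',
    );

And the equivalent with opendir()/readdir()/closedir(). Calling readdir() in scalar context hands you one entry per iteration, so the full listing never sits in memory at once (again, /some/dir and process_file() are stand-ins):

    use strict;
    use warnings;

    my $dir = '/some/dir';
    opendir my $dh, $dir or die "Can't opendir $dir: $!";
    while ( defined( my $name = readdir $dh ) ) {   # 'defined' guards against a file named '0'
        next unless $name =~ /\.txt\z/;
        process_file("$dir/$name");
    }
    closedir $dh;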
You can find plenty of examples of each by using Super Search to look for "File::Find" or "opendir".