http://www.perlmonks.org?node_id=295965

Elliott has asked for the wisdom of the Perl Monks concerning the following question:

Brethren,

I have need to process a directory full of text files. The script worked fine up to its planned limit of around 3,000 files, but it's been too successful and the client now has 17,000 files in there!

The code currently looks like this:

foreach my $file (<directory/*.txt>) { do stuff }

Even if "do stuff" is merely to count them - or even just give the name of the first one and then exit; the process takes forever and often times out. (This is under Apache / Linux.) I assume the problem is that Perl(? or the OS?) has to read the entire directory listing into memory in order to structure the loop.

My thought is to restructure the directory into, say, 100 subdirectories and then process thus:

foreach my $subdir (<directory/*>) { foreach my $file (<$subdir/*.txt>) { do stuff } }

Will that help? My thinking is that only two lists would be in memory at any one time: one of 100 subdirectories and one of roughly 170 files for the current subdirectory.
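
If it helps make the idea concrete, here is a rough one-off sketch of how I might spread the existing flat directory across 100 subdirectories; the bucketing rule (sum of the character codes of the filename, mod 100) is just an arbitrary choice for illustration:

use strict;
use warnings;
use File::Copy qw(move);

# One-off reorganisation: spread directory/*.txt across subdirectories 00..99
opendir(my $dh, 'directory') or die "Cannot open directory: $!";
while (defined(my $name = readdir($dh))) {
    next unless $name =~ /\.txt$/;
    my $sum = 0;
    $sum += ord($_) for split //, $name;
    my $bucket = sprintf '%02d', $sum % 100;   # arbitrary bucket rule
    mkdir "directory/$bucket" unless -d "directory/$bucket";
    move("directory/$name", "directory/$bucket/$name")
        or die "Cannot move $name: $!";
}
closedir($dh);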

Or is there a much better way to do this?

BTW, the client tells me they hope to have 50,000 files in there soon...