
Re: Efficient processing of large directory

by Elliott (Pilgrim)
on Oct 03, 2003 at 13:36 UTC ( #296228 )

in reply to Efficient processing of large directory

Thank you all for your advice. My reading of your answers is that two approaches would help:

  • Reorganise into subdirectories as I originally thought
  • Use while instead of foreach
I have already converted to using while, though I haven't had a chance to measure the improvement yet; a sketch of the loop is below. Can I have opinions on whether using both solutions together is worthwhile (pun not intended!)?
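Here is a minimal sketch of what I mean by the while version; the directory name and the processing body are just placeholders:

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $dir = '/path/to/maildir';    # placeholder directory name

    opendir my $dh, $dir or die "Cannot open $dir: $!";

    # foreach my $file (readdir $dh) { ... } would build the full list
    # of names in memory first; the while form reads one entry per
    # iteration, so memory stays flat even with huge directories.
    while (defined(my $file = readdir $dh)) {
        next if $file eq '.' or $file eq '..';
        # process $file here
    }

    closedir $dh;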

BTW, the file names are email addresses (opt-in list, no spam here I promise!!) with \W characters removed. I was planning to pick the 2nd and 4th characters to name the subdirectories in order to avoid grubbiness. Any thoughts on that?
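Something like this is what I had in mind for the bucketing; the base path, the example name, and the '_' fallback for very short names are just placeholders:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Path qw(mkpath);
    use File::Spec;

    # Map a sanitized address to a two-level subdirectory based on
    # its 2nd and 4th characters (0-based offsets 1 and 3).
    sub bucket_dir {
        my ($name, $base) = @_;
        my $a = length($name) > 1 ? substr($name, 1, 1) : '_';
        my $b = length($name) > 3 ? substr($name, 3, 1) : '_';
        return File::Spec->catdir($base, $a, $b);
    }

    my $base = '/path/to/maildir';        # placeholder base directory
    my $name = 'fredexamplecom';          # address with \W stripped
    my $dir  = bucket_dir($name, $base);

    mkpath($dir) unless -d $dir;
    my $file = File::Spec->catfile($dir, $name);
    print "$file\n";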

Re: Re: Efficient processing of large directory
by tilly (Archbishop) on Oct 05, 2003 at 04:04 UTC
    Switch to a dbm such as DB_File instead of lots of small files. Particularly if you use a BTree, you will get much better-organized use of the disk.

    But do keep text backups in case an upgrade breaks DB_File.
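
    A minimal sketch of the BTree tie, assuming a single dbm file named addresses.db and one value per address (both are illustrative choices, not part of the original suggestion):

        #!/usr/bin/perl
        use strict;
        use warnings;
        use DB_File;
        use Fcntl;

        # One BTree-backed dbm file in place of many small files.
        tie my %addr, 'DB_File', 'addresses.db',
            O_RDWR | O_CREAT, 0644, $DB_BTREE
            or die "Cannot tie addresses.db: $!";

        # Store one record per address, keyed by the sanitized address.
        $addr{'fredexamplecom'} = 'subscribed 2003-10-03';

        # A BTree returns keys in sorted order and keeps related keys
        # close together on disk.
        while (my ($key, $value) = each %addr) {
            print "$key => $value\n";
        }

        untie %addr;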
