in reply to Perl Script Causing high CPU usage
The replies above about reducing how often you invoke "stat" will help some, but I would also worry about having so many files in a single directory. Scanning a directory with that many entries is slow in itself, and it gets worse if you make multiple stat calls on every file instead of reusing all the information you get from a single stat call.
If you're seeing 190,000 files being created in less than a week (over 27,000 a day), you might want to see if you can divide those up among multiple directories to limit the number of files per directory. File age seems to be what matters most, and it's apparently ok to move things around, so you might try creating a directory for each date and moving files into daily directories according to their age. That would make things a lot easier to manage, in addition to reducing the overall load on both the CPU and the disk system.
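A minimal sketch of that approach, assuming the files live in one flat directory and that modification time is the "age" you care about (directory path and naming scheme here are just illustrative):

```perl
use strict;
use warnings;
use File::Copy qw(move);
use File::Path qw(make_path);
use POSIX qw(strftime);

# Move every plain file in $dir into a per-day subdirectory named
# after its modification date (e.g. $dir/2013-10-01/).
sub organize_by_day {
    my ($dir) = @_;
    opendir(my $dh, $dir) or die "Cannot open $dir: $!";
    while (my $name = readdir($dh)) {
        my $path = "$dir/$name";
        next unless -f $path;          # skip ., .., and subdirectories

        # One stat() per file; reuse its fields rather than making
        # separate -M/-s/-A style tests that each stat the file again.
        my @st = stat($path) or next;
        my $day = strftime('%Y-%m-%d', localtime($st[9]));  # $st[9] is mtime

        my $dest = "$dir/$day";
        make_path($dest) unless -d $dest;
        move($path, "$dest/$name") or warn "Could not move $path: $!";
    }
    closedir($dh);
}
```

With daily buckets, each directory stays at a few tens of thousands of entries at most, and any later cleanup (deleting files older than N days) becomes a matter of removing whole directories instead of stat-ing every file.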
Re^2: Perl Script Causing high CPU usage
by Anonymous Monk on Oct 01, 2013 at 01:24 UTC
by marinersk (Priest) on Oct 01, 2013 at 13:55 UTC
by swampyankee (Parson) on Oct 05, 2013 at 19:23 UTC