
Re: Perl Script Causing high CPU usage

by roboticus (Chancellor)
on Sep 29, 2013 at 15:29 UTC ( #1056227=note )

in reply to Perl Script Causing high CPU usage


Doing a stat on every file is too time consuming. Rather than rescanning the entire directory tree and checking the age of each file, I'd suggest keeping a text file containing the filename and last-modified date of every file known. Then you can quickly ignore many of the files without doing a stat. For the files that *are* candidates for removal, do the stat and update your text database. If a file is new (i.e., it wasn't in the last scan), it's obviously not older than the previous scan, so you can simply enter it into your table with the previous run date. Something like:

# Read text file from last run
my %Files;
open my $FH, '<', 'my_database' or die "can't read my_database: $!";
while (<$FH>) {
    my ($YYYYMMDD, $FName) = split;
    $Files{$FName} = $YYYYMMDD;
}
close $FH;

# Retrieve last runtime
my $LASTRUN = $Files{TheLastRunTime};
my $OLDEST_FILE_DATE = function_getting_cutoff_date();

# Scan file tree
for my $FName (function_retrieving_file_list()) {
    if (! exists $Files{$FName}) {
        # New file, just add it to the database
        $Files{$FName} = $LASTRUN;
    }
    elsif ($Files{$FName} lt $OLDEST_FILE_DATE) {
        # Last recorded "modified" time is too old
        ...code to check last date and call archiver as needed...
    }
    else {
        # Nothing to do, we know this file was touched within
        # the cutoff period, so leave it until next time.
    }
}

# Rewrite the database file for next time
$Files{TheLastRunTime} = function_returning_now_as_YYYYMMDD();
rename 'my_database', "my_database.$Files{TheLastRunTime}";
open $FH, '>', 'my_database' or die "can't write my_database: $!";
for my $K (keys %Files) { print $FH "$Files{$K} $K\n"; }
close $FH;

If you typically have 10,000 of 190,000 files needing to be archived, then you're saving many stat calls--on some operating systems, anyway. I don't know the details of yours, but if the stat calls take a significant amount of time, this approach will save you 180,000 stat calls per run.

Hope this helps...


When your only tool is a hammer, all problems look like your thumb.

Replies are listed 'Best First'.
Re^2: Perl Script Causing high CPU usage
by aaron_baugher (Curate) on Sep 30, 2013 at 01:07 UTC

    Another option to reduce the stat() calls would be to call it once at the beginning of the loop, save the results in an array, and then reference the array after that. Alternatively, call it once with the filename as the argument, and then for the rest of the loop replace the filename with the special filehandle "_" (underscore), which reuses the cached results of the last stat call. See stat for more info.
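    A minimal sketch of that caching trick; the script simply stats itself ($0) so the example runs anywhere, but any filename would do:

```perl
use strict;
use warnings;

# One real stat() call; perl saves the result in its internal stat buffer.
my @st = stat $0 or die "can't stat $0: $!";
my $size_from_array  = $st[7];    # size from the saved list
my $mtime_from_array = $st[9];    # mtime from the saved list

# File tests against the special "_" filehandle reuse the cached
# buffer instead of issuing another stat() syscall.
my $size_from_cache = -s _;
my $is_plain_file   = -f _;

print "size=$size_from_cache plain=$is_plain_file\n";
```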

    Personally, on any system with unix-ish tools available, I'd move most of this into a find command, then use xargs to pass the list of only the files to be deleted/archived/whatever to a script which would do that.
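    A sketch of that pipeline (the path and the 30-day cutoff are made up for illustration; swap the real archiver in for `echo`):

```shell
# Find plain files under /data/logs not modified in the last 30 days and
# hand them to the archiver in batches. -print0/-0 keep filenames with
# spaces safe; -r skips running the command when nothing matches (GNU xargs).
find /data/logs -type f -mtime +30 -print0 \
    | xargs -0 -r echo archive-script.pl
```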

    Aaron B.
    Available for small or large Perl jobs; see my home node.
