http://www.perlmonks.org?node_id=1043370

Preceptor has asked for the wisdom of the Perl Monks concerning the following question:

I've got a need to run a virus scan on my server estate. I have some rather large filesystems to deal with - 70 million files, 30TB of data sort of large. I've got rather a lot of filesystems to process, too. So I hit the perennial problem: the virus checkers take so long to run their scheduled scan that they never actually finish.

What I've started doing is pulling together a set of 'scanning servers', with a view to distributing the workload. For smaller filesystems this is good enough - I can do one at a time, with a crude queuing mechanism (sketched below). But when I hit the larger filesystems, I need to break the problem down into manageable slices. I don't need to be particularly precise - if a chunk is between 1,000 and 100,000 files, that's good enough granularity.
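
To illustrate, the crude queue is roughly this - round-robin whole filesystems across the scanning servers (the hostnames and mount points here are made up):

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Round-robin whole filesystems across the scanning servers.
    # Hostnames and mount points are placeholders.
    my @scanners    = qw(scan01 scan02 scan03);
    my @filesystems = qw(/mnt/fs_a /mnt/fs_b /mnt/fs_c /mnt/fs_d);

    my %queue;
    my $i = 0;
    push @{ $queue{ $scanners[ $i++ % @scanners ] } }, $_ for @filesystems;

    for my $host (sort keys %queue) {
        print "$host: @{ $queue{$host} }\n";
    }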

I'm thinking in terms of using 'File::Find' to build up a list, but this seems... well, a bit inefficient to me - traversing a whole directory structure in order to feed a virus scanner a list that will then traverse the file structure again (sketch below). Can anyone offer better suggestions for how to 'divide up' an NFS filesystem without doing a full directory tree traversal?
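
What I have in mind is along these lines - one full walk with File::Find, writing out work units of roughly 10,000 paths each (the root and the chunk size are just placeholders):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use File::Find;

    # Walk the tree once and write work units of roughly $chunk paths each.
    # The root and chunk size are placeholders.
    my $root  = shift // '/mnt/bigfs';
    my $chunk = 10_000;    # anywhere in 1_000 .. 100_000 is fine for my purposes

    my @batch;
    my $unit = 0;

    find(sub {
        return unless -f $_;               # files only - skip directories etc.
        push @batch, $File::Find::name;
        flush_batch() if @batch >= $chunk;
    }, $root);
    flush_batch() if @batch;               # final partial chunk

    sub flush_batch {
        my $file = sprintf 'chunk_%04d.lst', $unit++;
        open my $fh, '>', $file or die "open $file: $!";
        print {$fh} "$_\n" for @batch;
        close $fh or die "close $file: $!";
        @batch = ();
    }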

Update: OS is Linux, but I can probably wangle a rebuild to a Windows platform if it's useful. The virus scanner I'm using is Sophos, but it's triggered in much the same way as 'find' - hand it a path to traverse, and it 'does the business'. The dispatch end, for illustration, is sketched below.
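
Roughly, the dispatch step reads one chunk file and hands its paths to the scanner. The scanner location is a placeholder (substitute wherever the local Sophos command-line scanner lives), and whether many paths per invocation beats one at a time is something I'd have to test:

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Read one chunk file and hand its paths to the scanner in batches.
    # The scanner path is a placeholder - point it at the local install.
    my $scanner = '/usr/local/bin/savscan';
    my $list    = shift or die "usage: $0 chunk_NNNN.lst\n";

    open my $fh, '<', $list or die "open $list: $!";
    chomp(my @paths = <$fh>);
    close $fh;

    # Batch the arguments so a 100,000-file chunk doesn't blow past ARG_MAX.
    while (my @batch = splice @paths, 0, 500) {
        system($scanner, @batch) == 0
            or warn "scanner exited nonzero on a batch from $list: $?\n";
    }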