If there is a process adding new files to the directory while your "tar up" script is running, then you face two hazards: deleting files that never made it into the tar file, and putting empty or half-written files into the tarball. If possible, stop the process from adding new files while the script is running; but if you can't, then the following should be safe.
Use the script I gave you above to, for example, move all files starting with 'b' into a b/ subdirectory. Then wait a few minutes, or however long it could reasonably take for the process to finish writing the current file, then from the command line, simply:
$ tar -czf .../some-path/b.tar.gz b/
$ tar -tzf .../some-path/b.tar.gz > /tmp/foo

View /tmp/foo in a text editor to check that the listing looks reasonable, then

$ rm -rf b/

If the rm fails due to too many files, then write another perl script similar to the one above, but using 'unlink' to remove each file one by one.