http://www.perlmonks.org?node_id=1014339


in reply to Re: Finding and sorting files in massive file directory
in thread Finding and sorting files in massive file directory

I seem to recall that when I tried something like this, it failed due to too many arguments being passed to the command. I.e., you can't run a million or so files through the gzip command?
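
The limit in question is the kernel's per-exec cap on total argument (and environment) size. A minimal way to check it, assuming a POSIX system with getconf:

    getconf ARG_MAX    # maximum bytes of exec() arguments plus environment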

Re^3: Finding and sorting files in massive file directory
by frozenwithjoy (Priest) on Jan 21, 2013 at 01:04 UTC
    If you put something like this in a while (there are files that start with a) loop, you should get around the issue of too many arguments. (Be sure to set increment=1 before the loop; a fuller sketch follows below.)
        mkdir dir_a.$increment
        mv `ls a* | head -n1000` dir_a.$increment/
        let increment++
      The problem is that you will get "too many arguments" from ls a* already: the shell expands the glob into the argument list before ls even runs, so the same limit applies.
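    A fuller sketch of the batching idea that also sidesteps the glob problem just noted, listing files with find instead of ls a* (this variant is an assumption, not from the posts above; GNU find/xargs/mv and the dir_a.N, 1000-files-per-batch naming are illustrative):

        increment=1
        # Loop while at least one file beginning with "a" remains
        # (GNU find's -quit stops after the first match).
        while [ -n "$(find . -maxdepth 1 -type f -name 'a*' -print -quit)" ]; do
            mkdir "dir_a.$increment"
            # Move up to 1000 files per pass; xargs -d '\n' avoids word-splitting
            # names that contain spaces (names with newlines would still break).
            find . -maxdepth 1 -type f -name 'a*' | head -n 1000 \
                | xargs -d '\n' mv -t "dir_a.$increment"
            increment=$((increment + 1))
        done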
Re^3: Finding and sorting files in massive file directory
by MidLifeXis (Monsignor) on Jan 21, 2013 at 13:12 UTC

    See the xargs command. It can help with breaking apart large command lines.

    find . -type f -name b\* -depth -2 | xargs $command_that_can_be_run_multiple_times

    Update: added example.
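
    As an aside (my addition, not from the post above): the plain find | xargs pairing breaks on filenames containing whitespace. GNU find and xargs offer a null-delimited form; here with gzip standing in for the placeholder command, and -maxdepth as the GNU spelling of the depth restriction:

        # NUL-delimited names survive spaces and newlines in filenames.
        find . -maxdepth 1 -type f -name 'b*' -print0 | xargs -0 gzip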

    --MidLifeXis