http://www.perlmonks.org?node_id=962535

wrinkles has asked for the wisdom of the Perl Monks concerning the following question:

My script loops through a list of directory paths. In each loop the current directory is rsync'd to a common destination and then committed to a git repository.

My problem is that upon encountering a larger directory (e.g. 220 files, 1.5 MB), the script freezes after the rsync, apparently during the git add.

Is there a way around this? I could commit the whole destination directory manually, but I'd like to commit in smaller chunks, and I prefer to let the script do the work.

The script uses Git::Wrapper and File::Rsync. The git repo already exists before the script is run.

TIA!
chdir "$web_root"; my $repo = Git::Wrapper->new('.'); foreach my $plugin ( @plugins) { my $dir = File::Rsync->new({ exclude => $rsync_exclude, archive => 1 + }); my $src = $plugin . '/'; $dir->exec({src => $src, dest => $cgi_path}) or warn "$plugin : rsyn +c failed.\n"; say "$plugin installation was successful."; # final message before t +he script freezes on larger directories my $msg = "Plugin: $plugin"; chdir $web_root; $repo->add( '.') or warn "Successful git-add of $plugin \n."; $repo->commit({ all => 1, message => $msg }) or warn "Successful git + committed $plugin\n"; say "commit complete"; my $output = $repo->log(); say "$output"; }