http://www.perlmonks.org?node_id=485097

jimbus has asked for the wisdom of the Perl Monks concerning the following question:

Monksters,

I've been trying to write something to archive the files I'm processing and came up with this:

    foreach $year (@years) {
        foreach $month (@months) {
            foreach $day (@days) {
                @filelist  = glob("/home/reports/ftp/WSB/*$year$month$day*");
                $filecount = @filelist;
                print "$month-$day $hour:$filecount\n";
                $gzname = $month . $day . ".gz";
                foreach $filename (@filelist) {
                    `cat $filename | gzip -9 >> $gzname`;
                    `rm $filename`;
                }
            }
            @days = @alldays;
        }
    }
I don't like this for a few reasons: it's an after-the-fact cleanup, and I'd rather archive each file as I process it; it creates one monolithic file; and I've had some issues with gzip later claiming that these files are corrupt.

I tend to use gzip because that's what the people who taught me Unix used. Are any of the others more reliable? I'd like the result to behave as if the files were tarred and then zipped. I looked at some of the modules on CPAN, and none of them seemed to work without going through the whole process of unzipping the archive, adding the new file to the tar, and then rezipping it, which seemed very expensive to me. I suppose tarring isn't important as long as the files remain atomic when I uncompress the archive.
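For context, the incremental approach I'm imagining would look roughly like this sketch using IO::Compress::Gzip (archive_file is a hypothetical helper, not code from my actual script, and I haven't verified it avoids the corruption issue):

```perl
use strict;
use warnings;
use IO::Compress::Gzip qw(gzip $GzipError);

# Hypothetical helper: compress $filename onto the end of $gzname as
# its own gzip member, then delete the original.  Append => 1 makes
# gzip() add a new gzip stream after any existing data instead of
# truncating the archive, so each processed file becomes one member.
sub archive_file {
    my ($filename, $gzname) = @_;
    gzip($filename => $gzname, Append => 1)
        or die "gzip of $filename failed: $GzipError\n";
    unlink $filename
        or die "could not remove $filename: $!\n";
}
```

One catch I'm aware of: gunzip on a multi-member file decompresses everything into one concatenated stream, so the files wouldn't stay atomic on extraction; keeping them separate would still seem to need tar or per-file archives.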

Anyhow, sorry to be asking so many newbie questions...

A hopeful Jimbus asks, "Never moon a werewolf?"