I wrote some code today to solve a co-worker's problem. Apparently a client was concatenating a series of XML files into one big file to send via email (or some other numbskull reason) & then needed them split up afterwards for processing.
Here's what I came up with. Given that TIMTOWTDI, what would be the Better/Stronger/Faster/SHORTEST way to accomplish the same thing?
use strict;
use warnings;

my @input_file = <>;
my $filecount       = 0;
my $filename_prefix = "outfile_";
my $file_is_open    = 0;
my $out_name;

foreach my $in (@input_file) {
    # Each concatenated document starts with its own XML declaration.
    if ($in =~ m/^<\?xml version/) {
        if ($file_is_open) {
            close(OF) or die "Couldn't close $out_name!!\n";
            $file_is_open = 0;
        }
        ++$filecount;
        $out_name = $filename_prefix . $filecount . ".xml";
        open(OF, ">", $out_name)
            or die "Couldn't open $out_name for writing!!\n";
        print STDOUT "Writing to $out_name.\n";
        $file_is_open = 1;
    }
    print OF $in if $file_is_open;
}
# Guard the final close: the input may not have contained any XML
# declaration, in which case OF was never opened.
close(OF) if $file_is_open;
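For comparison, the same split logic can be pulled into a small function that works on a string instead of the filesystem, which makes it easy to test. This is just a sketch; `split_xml_docs` is a name I made up, and it assumes each document's `<?xml` declaration starts at the beginning of a line:

```perl
use strict;
use warnings;

# Hypothetical helper: split a string holding several concatenated XML
# documents into a list, one element per document. Anything before the
# first XML declaration is discarded.
sub split_xml_docs {
    my ($text) = @_;
    my @docs;
    for my $line (split /^/, $text) {   # split into lines, keeping "\n"
        push @docs, '' if $line =~ /^<\?xml version/;
        $docs[-1] .= $line if @docs;    # skip junk before first decl
    }
    return @docs;
}
```

Each returned element could then be written out to its own numbered file, the way the script above does.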
My first thought was to use the -n command-line flag to put perl into looping mode, but I haven't gone there yet... Any other brilliant ideas? I find I learn more quickly when other people tackle the same problem and we can share insights.
Wait! This isn't a Parachute, this is a Backpack!