Without seeing your code I can't be sure, but I suspect your problem is I/O, not Perl. I created this test program to write a file of 554,152 100-character records, then reopen it and split each line into 10-character comma-delimited fields using a regexp, which I expected to be slow.
use strict;
use Time::HiRes qw/ gettimeofday /;

my $starttime = gettimeofday;
open OUTFILE, ">file.txt" or die $!;
for (1 .. 554152) {
    print OUTFILE "X" x 100, "\n";
}
close OUTFILE;
print "Creating file took ", gettimeofday - $starttime, " seconds\n";

$starttime = gettimeofday;
open INFILE, "<file.txt" or die $!;
open OUTFILE, ">file1.txt" or die $!;
while (<INFILE>) {
    chomp;
    # Grab up to ten characters at a time. Note the quantifier must be
    # {1,10}, not {,10} -- with the comma form missing its lower bound,
    # older perls treat the braces as literal characters.
    print OUTFILE join(",", /(.{1,10})/g), "\n";
}
close INFILE;
close OUTFILE;
print "Splitting file using regexp took ", gettimeofday - $starttime, " seconds\n";
__OUTPUT__
Creating file took 8.515625 seconds
Splitting file using regexp took 1.5 seconds
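As an aside, the same fixed-width split can be done without the regexp engine at all, using unpack with a repeating template. This is just a sketch of the idea (the variable names are my own, not from the program above):

```perl
use strict;
use warnings;

# "(a10)*" means "as many raw 10-byte fields as will fit";
# unlike "A10", lowercase "a" does not strip trailing whitespace.
my $line   = "X" x 100;
my @fields = unpack "(a10)*", $line;
my $csv    = join ",", @fields;
```

Inside the while loop you would simply print the joined result instead of the regexp capture; for strictly fixed-width records unpack is often faster because there is no backtracking machinery involved.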
This is on Win2k/ActivePerl build 806 on a fast P4 with a 10k rpm hard drive and 1 GB of RAM, so the read probably comes entirely from the disk cache. Check your code and make sure you aren't opening the output file for every line; I've seen people do that and slow things to a crawl.
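To make that last point concrete, here is a sketch of the anti-pattern next to the right way (file name and data are made up for illustration):

```perl
use strict;
use warnings;

# Slow: reopening in append mode once per line pays an open/close
# syscall pair and a buffer flush on every single iteration.
#
#   for my $line (@lines) {
#       open my $out, ">>", "out.txt" or die $!;
#       print $out "$line\n";
#       close $out;
#   }

# Fast: open once, write everything, close once.
open my $out, ">", "out.txt" or die $!;
print $out "$_\n" for 1 .. 3;
close $out;
```

With half a million records, the difference between those two shapes is easily the difference between seconds and minutes.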
--
flounder