usertest has asked for the wisdom of the Perl Monks concerning the following question:

We have a requirement to read a source file and, for every row, apply some transformations and write the result to an output file. The code below, however, takes a long time: approximately 25 seconds to process a 500MB file. Please suggest any performance improvements we could apply.
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(strftime);

my $infile  = $ARGV[0];
my $outfile = $ARGV[1];
open(DATAIN,  "<", $infile)  or die "Cannot open $infile: $!";
open(DATAOUT, ">", $outfile) or die "Cannot open $outfile: $!";
while (<DATAIN>) {
    my ($line) = $_;
    chomp($line);
    my @Fields = split(',', $line, 9);
    my $X = $Fields[8];
    my $Y = substr $X, 0, 10;    # 10-digit epoch timestamp
    my $A = strftime "%M,%Y,%m,%d,%H,%j,%W,%u,%A", gmtime $Y;
    my $B = substr($A, 0, index($A, ','));    # minute of the hour
    my $C = int($B / 5);                      # 5-minute bucket
    my $D = int($B / 15);                     # 15-minute bucket
    print DATAOUT $line, ",", $Y, ",", $A, ",", $C, ",", $D, "\n";
}
close(DATAIN);
close(DATAOUT);
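The dominant per-row cost is likely the strftime/gmtime call. Since the 10-digit epoch value has one-second resolution and typically repeats across many rows of a large file, one common speedup is to memoize the derived fields per distinct epoch second. A minimal sketch of that idea (the helper name derive_fields and the %cache hash are my own, not part of the original code; it assumes Perl 5.10+ for the //= operator):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(strftime);

# Cache of derived date fields keyed by the 10-digit epoch value.
# strftime/gmtime then run once per distinct second rather than
# once per row, which helps when timestamps repeat.
my %cache;

# Hypothetical helper: given an epoch value, return the "$A,$C,$D"
# suffix from the original loop (date fields, 5-minute bucket,
# 15-minute bucket), computing it only on a cache miss.
sub derive_fields {
    my ($epoch) = @_;
    return $cache{$epoch} //= do {
        my $a      = strftime "%M,%Y,%m,%d,%H,%j,%W,%u,%A", gmtime $epoch;
        my $minute = substr($a, 0, index($a, ','));
        join ',', $a, int($minute / 5), int($minute / 15);
    };
}

# Inside the read loop, the print would then become:
#   my $Y = substr $Fields[8], 0, 10;
#   print DATAOUT $line, ",", $Y, ",", derive_fields($Y), "\n";
print derive_fields(0), "\n";
```

Whether this wins depends on how clustered the timestamps are: with mostly-unique timestamps the hash lookups add overhead, while with log-style data (many rows per second) it removes almost all of the strftime calls.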