Well, if the file is that huge and cannot be handled, maybe using "divide and conquer" works:
use strict;
use warnings;

my $file      = "filename";
my $file_size = 30720;    # file size in MB
my $chunks    = 1226;     # how many pieces
my $size      = int($file_size / $chunks) + 1;    # MB per chunk; parens matter, since int binds tighter than /

for my $counter (0 .. $chunks - 1) {
    my $skip = $size * $counter;    # offset, in 1M blocks
    system('dd', "if=$file", "of=$file.$counter",
           "bs=1M", "count=$size", "skip=$skip") == 0
        or die "dd failed on chunk $counter: $?";
}
This will output 1226 files called filename.*. Note that you will need free disk space equal to the file size, and since dd stops at end of file, the last chunks may come out short or empty. When reading the parts back, watch the chunk boundaries: a record separator (i.e. a newline character) can end up split between two adjacent chunks, so a line that starts at the end of one chunk may only finish in the next. Also, be sure to close processed files! :D
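To illustrate the boundary problem, here is a small sketch of how to reassemble complete lines when a line may be split across two pieces. The helper name lines_from_chunks is made up for this example, and it takes the chunk contents as strings; with the real filename.* files you would read each one in turn and feed its contents through the same carry-over logic:

```perl
use strict;
use warnings;

# Reassemble complete lines from chunks whose boundaries may split a line:
# the unterminated tail of one chunk is carried over and prepended to the
# start of the next chunk.
sub lines_from_chunks {
    my @chunks = @_;
    my @lines;
    my $carry = '';    # partial line carried across a chunk boundary
    for my $chunk (@chunks) {
        # Split after each newline, keeping the newline on each piece.
        for my $part (split /(?<=\n)/, $chunk) {
            if ($part =~ /\n\z/) {
                push @lines, $carry . $part;    # line is now complete
                $carry = '';
            } else {
                $carry = $part;    # chunk ended mid-line; finish it later
            }
        }
    }
    push @lines, $carry if length $carry;    # final unterminated line, if any
    return @lines;
}

# "hello" is whole, but "world\n" is split across the two chunks:
my @lines = lines_from_chunks("hello\nwor", "ld\n");
print @lines;    # prints "hello" and "world" as two complete lines
```

The same carry variable also handles a line spread over three or more chunks, since it just keeps accumulating until a newline finally shows up.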