ThingFish has asked for the wisdom of the Perl Monks concerning the following question:

I have a simple script that reads a small file (115K, 759 lines, 40 tab-delimited columns per line) into an array and uses the Schwartzian Transform to sort on any given column. When I use this transform my process peaks at approximately 5160K, 40K over my 5MB of accessible address space (the default on a chrooted ZEUS server, so I'm told). The address space can only be bumped up another 2MB, so if my data source gets much larger, which it will, I'll be up the creek. I'm surprised that I need 5MB to sort this little file, but at any rate I need another solution. Any ideas?
    my $sort_col = 0;   # column to sort on
    open FILE, "<$data_file" or die "Can't open $data_file: $!";
    @file_data = <FILE>;
    close FILE;

    # Schwartzian Transform: pre-split the key column once per line,
    # sort on it, then discard the keys
    @file_data = map  { $_->[1] }
                 sort { $b->[0] cmp $a->[0] }
                 map  { [ (split /\t/)[$sort_col], $_ ] }
                 @file_data;

    print qq(<TABLE BORDER="1">);
    for (@file_data) {
        print qq(<TR>);
        print qq(<TD>$_</TD>) for split /\t/;
        print qq(</TR>);
    }
    print qq(</TABLE>);
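One lower-memory alternative to consider: the transform above builds an anonymous array per line, which for 759 lines is 759 extra array references plus their contents. A sketch of an index sort, which keeps only one flat array of keys and one of indices alongside the raw lines (`$data_file` and `$sort_col` are assumed names, not from the original post):

```perl
use strict;
use warnings;

my $data_file = 'data.tsv';   # hypothetical path
my $sort_col  = 0;            # hypothetical column index

open my $fh, '<', $data_file or die "Can't open $data_file: $!";
my @lines = <$fh>;
close $fh;

# Extract just the sort key from each line -- one scalar per line,
# instead of one 41-element anonymous array per line.
my @keys = map { (split /\t/)[$sort_col] } @lines;

# Sort line indices by their keys (descending, to match the cmp
# direction in the original), then slice the lines into that order.
my @order  = sort { $keys[$b] cmp $keys[$a] } 0 .. $#lines;
my @sorted = @lines[@order];

print for @sorted;
```

Peak memory is still dominated by holding the whole file in `@lines`; if the file outgrows the address space entirely, an external sort (e.g. piping through the system `sort -t"\t" -k...`) may be the only option.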