Here's what I might do in this situation to avoid ever-deeper nested data: store each file's contents as a single scalar (works best if you know the files won't get *too* large). Use a plain hash where the keys are the filenames and the values are the file contents. When you print the data back out, you can pull apart whatever info you want from the key. Here's some code:
#!/usr/bin/perl -w
use strict;
# ...
# I assume @files holds the list of filenames
my %data;
foreach my $file (@files) {
    # This block lets us slurp the whole file in as one string:
    {
        local $/ = undef;   # undef the input record separator => slurp mode
        open my $fh, '<', $file or die "Couldn't open $file: $!\n";
        $data{$file} = <$fh>;
        close $fh;
    }
}
# To print the data:
foreach (sort keys %data) {
    my ($city, $id) = split /_/, $_;   # assumes filenames look like "city_id"
    print "City: $city, ID: $id\n";
    print "=====\n\n";
    print $data{$_}, "=====\n";
}
This representation may not be maximally efficient for searching, but if the data set isn't *too* huge it won't be a worry. If your data set were to get *huge*, then look into an RDBMS; MySQL is available under the GPL =)
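For small data sets, ad-hoc searching over the hash is usually enough: a `grep` over the keys, matching each stored scalar against a pattern. A minimal sketch, assuming `%data` maps `city_id` filenames to file contents as above (the sample keys and contents here are made up for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Assumed shape: filenames like "city_id" mapping to slurped file contents.
my %data = (
    'boston_17'  => "first line\nneedle here\n",
    'chicago_42' => "nothing of interest\n",
);

# Find every file whose contents match a pattern.
my $pattern = qr/needle/;
my @hits = grep { $data{$_} =~ $pattern } sort keys %data;

print "Matched: @hits\n";   # prints "Matched: boston_17"
```

This scans every value on every search, so it's linear in the total data size; that's exactly the point where an RDBMS with proper indexing starts to pay off.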
Philosophy can be made out of anything. Or less -- Jerry A. Fodor