Thanks for your reply. I am reading a CSV file (sample below) and building a temporary nested hash structure as described, so I can perform some per-second operations. As you can see, my $key is the timestamp, which should be unique per second, and the row data gets plugged in as hash/array values.
NAME,06/01/2011,09:30:00,16.76,756,Q,00
NAME,06/01/2011,09:30:00,16.76,300,Q,00
NAME,06/01/2011,09:30:00,16.76,100,Q,00
NAME,06/01/2011,09:30:01,16.76,100,Q,00
NAME,06/01/2011,09:30:01,16.76,200,Z,00
NAME,06/01/2011,09:30:02,16.77,200,X,00
NAME,06/01/2011,09:30:02,16.77,200,X,00
I open the 100k-line CSV file and read it with Text::CSV_XS. I have to undef the hashes before jumping to the next second.
use Text::CSV_XS;
use Date::Parse;    # provides str2time
use Data::Dumper;

my $csv = Text::CSV_XS->new();
open(FILE, "<", "$tickdir/$file") or die "Can't open CSV file: $!\n";
while (<FILE>) {
    $csv->parse($_) or next;
    my @columns = $csv->fields();
    my $key = str2time($columns[1] . ' ' . $columns[2]);
    my %HOH = ();    # note: a fresh hash on every row
    $HOH{$key}{"name"} = $columns[0];
    push(@{$HOH{$key}{"price"}},   $columns[3]);
    push(@{$HOH{$key}{"volumes"}}, $columns[4]);
    print Dumper(\%HOH);
}
close FILE;
OUTPUT: As you can see, the key values are the same across dumps, but each dump holds only a single row's data instead of accumulating the whole second.
$VAR1 = {
          '1306935005' => {
                            'volumes' => [
                                           '200'
                                         ],
                            'name' => 'NAME',
                            'price' => [
                                         '16.76'
                                       ]
                          }
        };
$VAR1 = {
          '1306935005' => {
                            'volumes' => [
                                           '400'
                                         ],
                            'name' => 'NAME',
                            'price' => [
                                         '16.76'
                                       ]
                          }
        };
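For reference, here is a minimal sketch of the per-second accumulate-and-flush behaviour I am trying to get (assuming the rows arrive sorted by time). The process_second() sub is hypothetical and stands in for my real operations; I use a plain split instead of Text::CSV_XS only because these sample rows have no quoted fields.

use strict;
use warnings;
use Data::Dumper;

# Sample rows with the same shape as my CSV.
my @rows = (
    'NAME,06/01/2011,09:30:00,16.76,756,Q,00',
    'NAME,06/01/2011,09:30:00,16.76,300,Q,00',
    'NAME,06/01/2011,09:30:01,16.76,100,Q,00',
    'NAME,06/01/2011,09:30:01,16.76,200,Z,00',
    'NAME,06/01/2011,09:30:02,16.77,200,X,00',
);

my @processed;   # keys flushed so far, just for demonstration

# Hypothetical per-second handler -- stands in for my real operations.
sub process_second {
    my ($key, $data) = @_;
    push @processed, $key;
    print Dumper({ $key => $data });
}

my %HOH;         # declared OUTSIDE the loop so rows within a second accumulate
my $prev_key;

for my $line (@rows) {
    my @columns = split /,/, $line;
    my $key = "$columns[1] $columns[2]";   # date + time as the per-second key

    # The second changed: process and drop the finished second first.
    if (defined $prev_key && $key ne $prev_key) {
        process_second($prev_key, delete $HOH{$prev_key});
    }

    $HOH{$key}{name} = $columns[0];
    push @{ $HOH{$key}{price} },   $columns[3];
    push @{ $HOH{$key}{volumes} }, $columns[4];
    $prev_key = $key;
}
# Flush the final second after the loop ends.
process_second($prev_key, delete $HOH{$prev_key}) if defined $prev_key;

With the sample rows above, process_second() fires once per distinct second (three times), each call seeing all prices/volumes gathered for that second, and deleting the entry keeps memory flat over a 100k-line file.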