in reply to Speed up hash initialization loop
If it's the same 300 files each time, you might see a big difference if you restructure the looping so that you read each file exactly once, populating all the profiles in that single pass over each file. But I'm only guessing, because you haven't provided enough information about the problem (number of profiles, total amount of data in the files, what manner of "system_command" you are running for each file).
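To illustrate the "one pass per file" idea: a minimal sketch, assuming a `%profiles` hash and a `process_files` routine that are purely illustrative (the OP's actual data structures and parsing are not shown in the thread). The outer loop is over files, so each file is opened and read exactly once, and every profile is updated from that single read:

```perl
use strict;
use warnings;

# Hypothetical names: %profiles maps a profile name to its accumulated
# lines; the real code would store whatever per-profile data it needs.
my %profiles;

sub process_files {
    my @files = @_;
    for my $file (@files) {                      # each file opened once
        open( my $fh, '<', $file ) or die "$file: $!\n";
        while ( my $line = <$fh> ) {
            chomp $line;
            next unless $line =~ /\S/;           # skip blank lines
            # update every profile from this one read of the file
            for my $prof ( keys %profiles ) {
                push @{ $profiles{$prof} }, $line;
            }
        }
        close $fh;
    }
}
```

The point is the loop nesting: files on the outside, profiles on the inside, rather than re-reading all 300 files once per profile.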
Apart from that, anything you do to simplify the "getvars" code will help some; e.g.:
- don't use references to hashes and arrays when you don't need to ("prof_var_names" and "evnt_nums_ref" should just be plain hashes; you can return them as refs the same way you do "sorted_vars", and "vars" should just be @vars).
- use a "pipe open" to run your system command, read from the pipe until you see /^Events$/, then read the data of interest - i.e.:
    sub getvars {
        my $profile = shift;
        my ( @vars, %evnt_nums, %prof_var_names, $last_evnt_name );
        open( my $ptk_info, '-|', "system command here" )
            or die "$profile: $!\n";
        while (<$ptk_info>) {
            last if /^Events$/;    # skip lines until this line is found
        }
        while (<$ptk_info>) {
            my @tkns = split;
            if ( $tkns[0] =~ /^(\d*\.\d*)0/ ) {
                $last_evnt_name = $tkns[2];
                $evnt_nums{$last_evnt_name} = $1;
            }
            push @vars, @tkns;
        }
        ...    # (do other for loop, sort @vars)
        return \%evnt_nums, \%prof_var_names, \@sorted_vars, $last_evnt_name;
    }
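On the first point above (plain hashes and arrays instead of references), a minimal sketch of what that looks like; `collect` and its variable names are illustrative, not the OP's code. You declare plain lexicals, work with them directly, and take references only in the return statement:

```perl
use strict;
use warnings;

# Illustrative sub: build plain lexical structures, return refs at the end.
sub collect {
    my %prof_var_names;    # plain hash, not a hash ref
    my @vars;              # plain array, not an array ref

    $prof_var_names{cpu} = 1;      # direct access, no dereferencing
    push @vars, 'cpu', 'mem';

    my @sorted_vars = sort @vars;
    return \%prof_var_names, \@sorted_vars;    # refs only here
}

my ( $names_ref, $sorted_ref ) = collect();
```

Inside the sub you avoid the `->{...}` / `@{...}` dereferencing noise entirely, which is both easier to read and slightly cheaper.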
Replies are listed 'Best First'.
Re^2: Speed up hash initialization loop
by austinj (Acolyte) on Jan 30, 2013 at 15:09 UTC
by Anonymous Monk on Jan 31, 2013 at 20:56 UTC