OK, I have searched "Super Search" for an answer to my issue; this is what I have so far.
I have a process that reads through an extracted pipe-delimited flat file via a while (<FILEHANDLE>) loop, looking for a user ID. When I find an ID, I split the line on the pipe, examine the fields, and, based on their values, move the record into a specific keyed hash to be processed in another part of the program.
My issue is that the flat files are enormous (well, for me) at 250+ MB. Is there a way I can tell Perl not to rescan the file each time the while loop goes through?
# $userid already populated
while (my $line = <FILEHANDLE>) {
    chomp $line;
    next unless index($line, $userid) >= 0;   # only lines for this user
    # below is an example; there are actually 15
    # variables coming out of this split
    my ($inv, $date, $amt) = split /\|/, $line;
    # code to build hash
}
# processing of hashes
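For what it is worth, the "code to build hash" part is roughly along these lines (the %records name and the field names here are just examples, not my real structure):

# assumes "my %records;" is declared before the while loop;
# each matching line becomes an anonymous hash, keyed on the user ID
push @{ $records{$userid} }, {
    inv  => $inv,
    date => $date,
    amt  => $amt,
};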
Is there any way to tell Perl, "Hey, you have already gone through x lines; don't start over at the beginning of the file when you loop back through"?
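Something like the following is what I am imagining, using tell to remember the byte offset where one pass stops and seek to jump back there instead of rewinding (a rough, untested sketch, not code I have working):

use Fcntl qw(SEEK_SET);

# after finishing one pass through the data for a user ID,
# remember the current byte position in the file
my $pos = tell(FILEHANDLE);

# ... later, instead of reopening or rewinding the file ...
seek(FILEHANDLE, $pos, SEEK_SET);   # resume where the last pass stopped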