in reply to Re: Assining data to an array is slow..
in thread Assining data to an array is slow..
I understand that not assigning the data back to an array isn't useful. But unpack is still parsing the row, correct? My point was that parsing 2.1 million rows is fast; storing the results slows it down considerably.
I think what's taking the time is the deletion and re-creation of the memory structure on each iteration of the loop. Unnecessary, in my mind, since successive iterations (in this case) are always going to be of identical size.
Given the above, I was hoping for..
a) That the memory used by unpack itself could be reused, rather than having to copy it into a Perl structure.
or
b) That I could predefine the size of @data and not have it destroyed on each iteration.
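The preallocation idea in (b) can be approximated by declaring @data once outside the loop and assigning into it, so the array's storage gets reused rather than rebuilt on each pass. A minimal sketch, with a made-up fixed-width template and record (the original post doesn't show its layout):

```perl
use strict;
use warnings;

my $template = 'A10 A10 A10';                   # hypothetical fixed-width layout
my $record   = 'aaaaaaaaaabbbbbbbbbbcccccccccc';

my @data;        # declared once, outside the loop
$#data = 2;      # pre-extend to the known field count

for my $row (1 .. 1_000) {
    # List assignment to an existing array empties and refills it,
    # but the array's underlying buffer can be reused across iterations.
    @data = unpack $template, $record;
}

print "last row: @data\n";
```

Whether this actually wins anything depends on the perl version; the pre-extension mostly matters for the first assignment, after which the buffer is already big enough.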
Of course.. I'm not a seasoned programmer, and this entire thread may just be a waste of everyone's time, in which case I apologize. 8)
I just think that if unpack has to put the results into its own array (it has to, or how else would it know what to send back?), then assigning them to a Perl data type shouldn't take at least 6 times as long. Of course, I really don't know what Perl actually does behind the scenes to store data in memory.
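That "6 times as long" figure is easy to check with the core Benchmark module. This sketch (again with a made-up template and record) times unpack with its result discarded against unpack plus a list assignment; actual ratios will vary by perl version and record size:

```perl
use strict;
use warnings;
use Benchmark qw(cmpthese);

my $template = 'A10 A10 A10';   # hypothetical layout
my $record   = 'x' x 30;

cmpthese(-1, {
    parse_only   => sub { unpack $template, $record },             # result discarded
    parse_assign => sub { my @data = unpack $template, $record },  # result stored
});
```

cmpthese prints a rate table comparing the two, which makes it straightforward to see how much of the time is parsing versus storing.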
Regards,
Shawn M Ferris
Oracle DBA
Replies are listed 'Best First'.
Re: Re: Re: Assining data to an array is slow..
by chromatic (Archbishop) on Mar 01, 2001 at 05:18 UTC
by smferris (Beadle) on Mar 02, 2001 at 19:00 UTC