in reply to Re^3: Out of Memory
in thread Out of Memory

I should switch to reading it in as a stream for the reason you stated (although I never expected 70 million NULs on a line). I haven't done that in Perl before, though I have used the while(&lt;FILE&gt;) syntax many times to read one line at a time. The idea was a quick-and-dirty script, which worked fine until last week.
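For what it's worth, here is a minimal sketch of the chunked-read approach, since a pathological "line" defeats while(&lt;FILE&gt;). The filename and 64 KB buffer size are arbitrary choices for illustration; the demo file it writes just stands in for the real input.

```perl
use strict;
use warnings;

# Build a small demo file (stands in for the real input).
my $file = 'demo_input.dat';
open my $out, '>', $file or die "Cannot create $file: $!";
print $out "\0" x 100_000;          # a "line" that is all NULs, no newline
close $out;

# Read in fixed-size chunks so one huge line cannot
# pull the whole file into memory at once.
open my $fh, '<:raw', $file or die "Cannot open $file: $!";
my $total = 0;
while (my $n = read($fh, my $buf, 64 * 1024)) {
    $total += $n;                   # process the chunk here instead
}
close $fh;
unlink $file;

print "read $total bytes in fixed-size chunks\n";
```

Peak memory stays at the buffer size no matter how long the longest "line" is, which is the property the line-at-a-time loop lacks.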
Still, my real question and reason for posting was to learn what was happening internally that caused the 2nd statement to use more memory than the first... and a lot more memory than I expected. Per the second response, running a 5-million-byte string through the 2nd statement consumed 320 MB of memory. That seems like a lot to me. 5 million bytes is what, 5 MB?
I think the answer (as mentioned elsewhere in this thread) is that it's creating 5 million scalars with 1 char each. If there were 20 bytes of overhead per scalar, I could see how 5 MB becomes 320 MB when you chain several statements together in a single line. Of course, this assumes scalars have lots of overhead (again, something I don't know much about).
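The blow-up is easy to reproduce: splitting a string on the empty pattern really does make one scalar per character, each carrying its own bookkeeping on top of the 1-byte payload. A minimal sketch (the optional measurement uses Devel::Size, a CPAN module, so it only runs if that happens to be installed):

```perl
use strict;
use warnings;

my $str = 'x' x 1_000_000;          # ~1 MB payload
my @chars = split //, $str;         # one scalar per character

printf "payload: %d bytes, scalars created: %d\n",
    length($str), scalar @chars;

# If the CPAN module Devel::Size is available, measure real usage,
# including per-scalar and array overhead.
if (eval { require Devel::Size; 1 }) {
    printf "total_size of array: %d bytes\n",
        Devel::Size::total_size(\@chars);
}
```

The measured total is many times the 1 MB payload, which is consistent with a few MB of string turning into hundreds of MB once several such intermediates exist at once in a chained statement.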
BTW, thank you to everyone who has responded so far. I appreciate the knowledge share.