http://www.perlmonks.org?node_id=641946


in reply to Re: Slurping a large (>65 gb) into buffer
in thread Slurping a large (>65 gb) into buffer

Not sure if this would be an acceptable solution, or rather the start of one, but if you have some process that reads all that HTML into a file (say, off the web), you could dump the pages using Storable and save some disk space, and perhaps some processing time when reading it back.
#!/usr/bin/perl
use strict;
use warnings;
use Storable;
use Data::Dumper;

# Read one "page" per iteration by setting the input record separator.
$/ = '---this line is the separator---';

open my $html, '<', 'test.html' or die "unable to open test.html: $!\n";

my %PAGE;
my $pagecount = 0;
while (<$html>) {
    chomp;                  # strip the separator line ($/)
    $pagecount++;
    $PAGE{$pagecount} = $_;
    #print "Data to process: \n$_\n";
}
close $html;

# Store once, after the loop -- storing inside the loop would rewrite
# the whole Test.file on every page.
store \%PAGE, 'Test.file';

my $pages = retrieve('Test.file');
print Dumper $pages;
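For the read-back side, retrieve() hands you the whole hash ref at once, so pulling out any single page is just a hash lookup rather than re-parsing the raw HTML. A minimal sketch of that (assuming the Test.file produced above; the page number is only for illustration):

#!/usr/bin/perl
use strict;
use warnings;
use Storable;

# Load the hash ref written by store() above.
my $pages = retrieve('Test.file');

# Page counts are the keys, so a single page is a plain lookup.
my $wanted = 3;    # hypothetical page number
if (exists $pages->{$wanted}) {
    print "Page $wanted:\n$pages->{$wanted}\n";
}
else {
    print "No page $wanted in Test.file\n";
}

Bear in mind retrieve() still loads the entire structure into memory, so this only helps if the stored file is one manageable chunk of the full data set.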
Ted
--
"That which we persist in doing becomes easier, not that the task itself has become easier, but that our ability to perform it has improved."
  --Ralph Waldo Emerson