PerlMonks
How to read 1 td file

by bhaskar_219 (Initiate)
on Sep 24, 2010 at 09:14 UTC ( #861758=perlquestion )
bhaskar_219 has asked for the wisdom of the Perl Monks concerning the following question:

Hi,

I need to read a 1-terabyte file with Perl, but my hard disk is smaller than that. How can I read the file?

Re: How to read 1 td file
by moritz (Cardinal) on Sep 24, 2010 at 09:20 UTC
    I don't understand your problem. If the file does not fit on the hard disc, where is it stored? Or how is it made available to you? How does Perl come into play?
    Perl 6 - links to (nearly) everything that is Perl 6.

      The question was asked by a company. I think the file may be on a remote host.

        Well if you're not sure of the details of the problem what chance have we got advising you?

        I suggest you actually investigate the problem, finding out exactly what is involved. If you have any Perl specific issues, let us know.

Re: How to read 1 td file
by zentara (Archbishop) on Sep 24, 2010 at 10:14 UTC
    One idea off the top of my head....

    You could try the receive callback feature of LWP::UserAgent. It lets you access the incoming $data chunk, so you could work on each chunk ( or a sliding buffer of multiple chunks), then send the $data chunks off to /dev/null.

    #!/usr/bin/perl -w
    use strict;
    use LWP::UserAgent;

    # don't buffer the prints to make the status update
    $| = 1;

    my $ua = LWP::UserAgent->new();
    my $received_size = 0;

    # simulate a big file
    my $url = 'http://www.cpan.org/authors/id/J/JG/JGOFF/parrot-0_0_7.tgz';
    print "Fetching $url\n";

    my $request_time = time;
    my $last_update  = 0;

    my $response = $ua->get($url,
        ':content_cb'     => \&callback,
        ':read_size_hint' => 8192,
    );
    print "\n";

    # note this callback doesn't save the file, because nothing is done
    # with the data
    sub callback {
        my ($data, $response, $protocol) = @_;
        my $total_size = $response->header('Content-Length') || 0;
        $received_size += length $data;
        my $time_now = time;

        # this makes the status only update once per second
        return unless $time_now > $last_update
                   or $received_size == $total_size;
        $last_update = $time_now;

        print "\rReceived $received_size bytes";
        printf " (%i%%)", (100/$total_size)*$received_size if $total_size;
        printf " %6.1f/bps", $received_size/(($time_now-$request_time)||1)
            if $received_size;
    }

    I'm not really a human, but I play one on earth.
    Old Perl Programmer Haiku ................... flash japh
Re: How to read 1 td file
by roboticus (Canon) on Sep 24, 2010 at 10:47 UTC

    bhaskar_219:

    That's really not a perl question. But you can use dd or some other tool to split the file into chunks that you can copy to your hard disk.
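    The same chunking idea can also be sketched in Perl itself. The helper below (process_in_chunks is a name invented here, not from this thread) hands fixed-size pieces of a file to a callback via sysread, so only one chunk is ever held in memory at a time:

    use strict;
    use warnings;

    # Read $path in $chunk_size-byte pieces and pass each piece to
    # $handler. Only one chunk lives in memory at any moment, so the
    # file can be far larger than RAM or local disk.
    sub process_in_chunks {
        my ($path, $chunk_size, $handler) = @_;
        open(my $fh, '<:raw', $path) or die "Can't open $path: $!\n";
        my $total = 0;
        while (1) {
            my $n = sysread($fh, my $chunk, $chunk_size);
            die "read error: $!\n" unless defined $n;
            last if $n == 0;          # EOF
            $handler->($chunk);       # work on this piece, then let it go
            $total += $n;
        }
        close($fh);
        return $total;                # bytes processed
    }

    You'd point it at the mounted remote file and do your per-chunk work (or dd-style copying to local disk) inside the callback.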

    ...roboticus

Re: How to read 1 td file
by DrHyde (Prior) on Sep 24, 2010 at 10:59 UTC

    You'd do it exactly the same as if the file *did* fit on your disk:

    local $/ = "\n"; # just in case it's been undeffed - don't wanna slurp a TB file!
    open(my $fh, '<', '/mnt/remotehost/bigfile') || die("Can't open: $!\n");
    while(my $line = <$fh>) {
        ... do stuff ...
    }
    close($fh);

    Hope that helps!

Re: How to read 1 td file
by BrowserUk (Pope) on Sep 24, 2010 at 11:39 UTC

    Most sensible creators of files that size make them binary data with fixed-size records. That way, individual records can be read without having to process the whole file in one serial progression, or in one go.

    Then all you need is a) the ability to see the remote file system from the local machine; b) the ability to seek or sysseek to positions > 2GB.

    • Achieving the former will be platform dependent and nothing to do with Perl.

      Under Windows it is usually done with something like  NET USE x: \\remotemachine\remotedir

      For *nix, it will be some variation on the mount command.

    • For the latter, you need to check that the local perl was compiled with large file support.

      It is probably quite rare that anyone would build perl without this these days, but to verify it, you can do:

      perl -V:uselargefiles
      uselargefiles='define';

      If it says ='define', you're good to go. If it says ='undef', you need to reinstall or recompile perl on your system.

    So yes. Given the right infrastructural access to the remote file, and a suitable build of Perl, you can access that remote, terabyte-sized file using Perl.
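    Record-at-a-time access to such a file can be sketched like this (read_record and the record layout are assumptions for illustration, not from the thread); with a large-file build of perl, the computed offset may exceed 2GB without trouble:

    use strict;
    use warnings;

    # Fetch record number $n from a file of fixed-size records without
    # reading anything that comes before it: seek straight to the
    # record's byte offset, then read exactly one record.
    sub read_record {
        my ($path, $record_size, $n) = @_;
        open(my $fh, '<:raw', $path) or die "Can't open $path: $!\n";
        seek($fh, $n * $record_size, 0) or die "seek failed: $!\n";
        my $got = read($fh, my $record, $record_size);
        die "short read\n" unless defined $got and $got == $record_size;
        close($fh);
        return $record;
    }

    With the remote file system mounted locally, $path would simply point at the mounted file.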

    Access will be relatively slow compared to local disc, but that's inevitable. Depending upon the nature of the processing involved, it may make sense to pull chunks of the file across to local disc for processing, before writing in-place back to the master, if required.

    If accessing such large remote files is likely to become a regular requirement, and your machine is physically close (<100m) to the remote server, then setting up a dedicated 10 gigabit network connection is a possibility; though relatively expensive. A couple of 10GBASE-T cards ($1000), and a few meters of cable, and you can have a dedicated network between you and the remote file that runs fast enough to rival a locally attached disc. Whether the local and remote hardware is capable of exploiting that bandwidth is another question.


    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.
Re: How to read 1 td file
by JavaFan (Canon) on Sep 24, 2010 at 16:19 UTC
    i need to read 1 terabyte file by perl but harddisk size is less than that one then how to read that file
    Use 1TB of RAM as a ramdisk? Costly, and perhaps not possible on toy OSes, but not impossible.

      Care to identify a non-toy OS that can handle 1TB of ram in a single box? And while you're at it, a non-toy box that can handle it also?

        Care to identify a non-toy OS that can handle 1TB of ram in a single box?
        Solaris.
        And while you're at it, a non-toy box that can handle it also?
        Sun Fire E25K (End of life, so you may buy it at a bargain on eBay ;-)).
        Sun SPARC Enterprise M9000 Server (takes up to 4TB).

Node Type: perlquestion [id://861758]
Approved by marto