File copy size limit on Linux

by Anonymous Monk
on Dec 18, 2003 at 23:06 UTC

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks,

I'm trying to perform a very simple file copy on Linux and have run into what appears to be a file size limitation. Whenever my script copies small files of less than 1 GB, it works fine. However, I need to copy some very large files of more than 4 GB. When I run this simple script, it appears to work, as I get no errors, but the file never copies. Does anyone have any insights or meditations to offer? Thanks. Here is the simple script I am running:

use File::Copy;
use strict;

copy("ers-as1.dsk", "ers-as2.dsk");

The ers-as1.dsk file is 5.2 GB, and when I check on this script after 45 minutes, no file has been copied. When I perform a "cp" at the command line, I can get the file to copy. I'm running Linux Redhat 2.1 (actually ESX Server 1.5.2).

Replies are listed 'Best First'.
Re: File copy size limit on Linux
by TVSET (Chaplain) on Dec 18, 2003 at 23:35 UTC
    I'm trying to perform a very simple file copy on Linux and have run into what appears to be a file size limitation.

    There is no such limitation in Linux, since you say that you can copy the file with the system "cp".

    When I run this simple script, it appears to work, as I get no errors, but the file never copies.

    You don't check for any errors; that's why you don't get any. Try doing something like copy($file1, $file2) || die "Error: $!";
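    A minimal sketch of what that looks like in a complete script (the file names are taken from the original post; the error-message wording is just one choice):

        use strict;
        use File::Copy;

        # copy() returns false on failure and sets $!, so the actual
        # reason for the failure becomes visible instead of being lost.
        copy("ers-as1.dsk", "ers-as2.dsk")
            or die "Error: $!";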

    I'm running Linux Redhat 2.1 (actually ESX Server 1.5.2)

    Are you sure? ;)

      Hi Leonid,

      Very good suggestion. I placed the error check in my script and got the following error:
      Error: Value too large for defined data type at (eval 13)[/usr/lib/perl5/5.6.0/perl5db.pl:1510] line 2, <IN> line 2.

      That is what makes me think the File::Copy module might have some type of size limitation. Not sure where to go from here, but thanks for the help. Also, I believe I'm running Redhat 8.1 (or whatever comes shipped with ESX Server); I was referring to kernel 2.4 and not 2.1.
        Ok, better now. It seems to me that your version of perl was compiled without support for large files. You can verify that by executing "perl -V" from the command line and inspecting the output. You should see something like "-Duselargefiles" and/or "uselargefiles=define". If your perl was compiled without large file support, you can rebuild it either from source or from the .src.rpm (the latter might be easier, since you are using RedHat).
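        One quick way to check just the relevant settings from within perl itself (a small sketch using the standard Config module; "perl -V:uselargefiles" on the command line prints the same variable):

            use Config;

            # 'define' means this perl was built with large file support.
            print "uselargefiles = ",
                (defined $Config{uselargefiles} ? $Config{uselargefiles} : 'undef'),
                "\n";

            # lseeksize is 8 when file offsets are 64-bit, 4 otherwise.
            print "lseeksize     = $Config{lseeksize}\n";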

        Also, I would like to draw your attention to this paragraph in the perldoc File::Copy: "An optional third parameter can be used to specify the buffer size used for copying. This is the number of bytes from the first file, that will be held in memory at any given time, before being written to the second file. The default buffer size depends upon the file, but will generally be the whole file (up to 2Mb), or 1k for file-handles that do not reference files (eg. sockets).". I doubt it will help you now, but it's good to have in mind. :)
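        A sketch of that third parameter in use; the 8 MB buffer size here is an arbitrary illustration, not a tuned recommendation:

            use strict;
            use File::Copy;

            # Copy in 8 MB chunks instead of letting File::Copy pick
            # a default buffer size based on the file size.
            copy("ers-as1.dsk", "ers-as2.dsk", 8 * 1024 * 1024)
                or die "Copy failed: $!";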

Re: File copy size limit on Linux
by maverick (Curate) on Dec 19, 2003 at 15:38 UTC
    From (insert name here)'s post, it looks like Perl's copy function opens one file, reads some bytes, and then spits them out to another file... For lots of small files, this is probably the most efficient way to go about it.

    But, I'd be willing to bet that for very large files, it's probably faster to use 'system("cp file1 file2")'.

    What you're trading here is the time it takes to do the system call vs. the speed of a native executable that might be pulling tricks at the file system level.
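    A sketch of that approach with the exit status actually checked (the multi-argument form of system bypasses the shell; file names again from the original post):

        use strict;

        # system() returns 0 on success; $? holds the raw exit status.
        system("cp", "ers-as1.dsk", "ers-as2.dsk") == 0
            or die "cp failed with status $?";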

    P.S. You're not running RedHat 2.1... it doesn't even have support for files > 2GB. :)

    /\/\averick

Re: File copy size limit on Linux
by Steve_p (Priest) on Dec 19, 2003 at 14:46 UTC

    Despite years of work, many UNIX and Linux applications are still stuck with a 2GB file limit. This is because file sizes and offsets were traditionally stored as a signed 32-bit integer, whose maximum value is just under 2GB. My best suggestion is to try to find a place to logically split the files.
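    The arithmetic behind that limit, as a quick sketch:

        # A signed 32-bit file offset tops out at 2**31 - 1 bytes.
        my $max = 2**31 - 1;
        printf "%d bytes = %.3f GB\n", $max, $max / 2**30;
        # prints: 2147483647 bytes = 2.000 GB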
