http://www.perlmonks.org?node_id=1047417

dibre has asked for the wisdom of the Perl Monks concerning the following question:

I have a code snippet that performs a sequence of actions based on a file test operation. However, the perl I use is old and was built with uselargefiles='UNKNOWN'. The code does not recognise the size of $file (> 2GB); instead it falls through to the else branch as if the file size were zero. Is there a way to make perl recognise a $file larger than 2GB without recompiling or upgrading to a newer version? Failing that, how can the code exit cleanly when it encounters a $file > 2GB that it cannot size correctly?

if ( -s $file ) { ... } else { ... }

Replies are listed 'Best First'.
Re: file test operation on files greater than 2gigs
by daxim (Curate) on Aug 01, 2013 at 10:48 UTC
    Inline::C and do the stat(2) call yourself?

    If I had the choice between that and brewing a new Perl, I'd do the latter - it's quicker and less effort.
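
    A minimal sketch of the Inline::C idea, assuming Inline::C is installed, its CCFLAGSEX option is available, and the libc honours _FILE_OFFSET_BITS=64; file_size() is a made-up name, and the size is returned as a double so it survives a perl without 64-bit integers:

    use strict;
    use warnings;
    use Inline C => Config => CCFLAGSEX => '-D_FILE_OFFSET_BITS=64';
    use Inline C => q{
        #include <sys/stat.h>

        /* Return the file size in bytes as a double (exact up to 2**53,
           comfortably past 2 GB), or -1 if stat(2) fails. */
        double file_size(char *path) {
            struct stat st;
            if (stat(path, &st) != 0)
                return -1;
            return (double) st.st_size;
        }
    };

    my $file = shift @ARGV;
    if (file_size($file) > 0) {
        print "$file is non-empty\n";    # files beyond 2 GB land here too
    }
    else {
        print "$file is empty, missing, or unreadable\n";
    }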

      If I had the choice between that and brewing a new Perl, I'd do the latter - it's quicker and less effort.

      And you don't even need to brew one or upgrade: you can download a portable/relocatable binary distribution and use it without messing up any existing ones :)

Re: file test operation on files greater than 2gigs
by Corion (Patriarch) on Aug 01, 2013 at 10:49 UTC

    Besides recompiling Perl, the only way is to call the OS tools, which hopefully know how to deal with file sizes that need more than 32 bits:

    my $output = `ls -l "$filename"`;    # Parse the output of ls
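
    A slightly fuller sketch of that route, assuming a POSIX-style ls whose fifth whitespace-separated field is the size in bytes; the size arrives as a string, so a numeric comparison works even on a perl without 64-bit integers:

    my $output = `ls -l "$filename"`;        # e.g. "-rw-r--r-- 1 user group 3221225472 Aug  1 10:48 big.dat"
    my $size   = (split ' ', $output)[4];    # fifth field is the byte count

    if (defined $size && $size =~ /^\d+$/ && $size > 0) {
        # non-empty file, including sizes beyond 2 GB
    }
    else {
        # empty, missing, or unparsable output
    }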
      Probably better to use the output of stat(1). Trying to parse the output of ls(1) is well known to be a bad idea.
      my $size = `stat -f %z yourfile.txt`;
        ...except that the GNU version uses -c %s for the format string argument. Ah well. Portability woes everywhere.
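
        A hedged sketch of a wrapper that copes with both flavours: it tries the BSD form first, then the GNU coreutils form, and returns whichever produced a plain number (external_size() is just an illustrative name):

        sub external_size {
            my ($path) = @_;
            for my $cmd ('stat -f %z', 'stat -c %s') {    # BSD form, then GNU form
                my $out = `$cmd "$path" 2>/dev/null`;
                next unless defined $out;
                chomp $out;
                return $out if $out =~ /^\d+$/;           # plain byte count
            }
            return;                                       # neither variant worked
        }

        my $size = external_size($file);
        if (defined $size && $size > 0) {
            # handle the non-empty (possibly > 2 GB) file
        }
        else {
            # handle the empty or unreadable case
        }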