
Re: File Compression

by targetsmart (Curate)
on Jun 12, 2009 at 12:57 UTC (#770946)

in reply to File Compression

If you are using the ext3 file system (common on Linux), the maximum file size is between 16 GiB and 2 TiB, depending on the block size.
First, check whether those compression tools have options for splitting the output;
otherwise, compress a hundred files at a time, then try to split the result into 2 GB pieces.
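The compress-then-split approach can be sketched with standard tools. This is a minimal illustration, assuming GNU coreutils `split` and `gzip`; the filename `large_dump.sql` and the sample data are stand-ins, not from the thread:

```shell
# Create a small stand-in for the real data (illustrative only).
head -c 1000000 /dev/urandom > large_dump.sql

# Compress first, so the pieces are as small as possible.
gzip -c large_dump.sql > large_dump.sql.gz

# Split into at most 2 GB chunks: large_dump.sql.gz.aa, .ab, ...
split -b 2G large_dump.sql.gz large_dump.sql.gz.

# To restore, concatenate the pieces in lexical order and decompress.
cat large_dump.sql.gz.* | gunzip > restored.sql

# Verify the round trip.
cmp large_dump.sql restored.sql
```

The pieces must be concatenated in the order `split` produced them (`.aa`, `.ab`, ...), which the shell glob already guarantees.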

-- In accordance with the prarabdha of each, the One whose function it is to ordain makes each to act. What will not happen will never happen, whatever effort one may put forth. And what will happen will not fail to happen, however much one may seek to prevent it. This is certain. The part of wisdom therefore is to stay quiet.

Replies are listed 'Best First'.
Re^2: File Compression
by marto (Archbishop) on Jun 12, 2009 at 13:01 UTC

    "otherwise, compress hundred files, then try to split into 2 GB pieces."

    I'm not sure I follow your advice here. If the OP is complaining that, for whatever reason, they can't write files > 2GB, how are they going to do this?


      If I understood targetsmart right, he didn't mean "create an archive of all the files and then split into 2GB chunks", but "compress the files, then partition them into sufficiently small sets, then create the archives".

      Ronald Fischer <>
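The "partition first, then archive" reading can be sketched as follows: group files into batches whose combined size stays under a budget, then create one compressed archive per batch. This is an illustrative sketch only; the `data/` directory, the batch naming, and the assumption that filenames contain no spaces are mine, not from the thread:

```shell
# Stand-in input files (illustrative only).
mkdir -p data
for i in 1 2 3; do head -c 500000 /dev/urandom > "data/file$i"; done

limit=$((2 * 1024 * 1024 * 1024))   # 2 GB budget per archive
batch=1
total=0
files=""

for f in ./data/*; do
    size=$(wc -c < "$f")
    # Close out the current batch if adding this file would exceed the budget.
    if [ -n "$files" ] && [ $((total + size)) -gt "$limit" ]; then
        tar -czf "batch$batch.tar.gz" $files
        batch=$((batch + 1))
        total=0
        files=""
    fi
    files="$files $f"                # relies on names without spaces
    total=$((total + size))
done
# Archive whatever remains in the last batch.
[ -n "$files" ] && tar -czf "batch$batch.tar.gz" $files
```

Since the budget is applied before compression, each resulting `.tar.gz` is guaranteed to stay under the limit (compression only shrinks it), at the cost of slightly underfilled archives.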
