
Re: Combining Ultra-Dynamic Files to Avoid Clustering (A better way?)

by BrowserUk (Pope)
on Jul 25, 2004 at 11:35 UTC


in reply to Combining Ultra-Dynamic Files to Avoid Clustering (Ideas?)

Rather than writing 1,000,000 files x 4,096 bytes each, turn the problem around.

Write 1,024 files x 4,000,000 bytes each.

The 'file' number x 4 becomes the offset into the file; the 'file position' becomes the file number.

This addresses both the > 2 GB problem and the 'maximum filesize assumption' problem.

Many of the 1,024 files would be sparsely populated, but from what I read, XFS and ReiserFS support sparse files on Linux, and placing the files in a compressed directory would deal with that on Win32.
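As a rough sketch of how the transposed layout might look in Perl (the 4-byte record size comes from the mapping above; the file names, sub names and the choice of pack 'N' are mine, purely for illustration):

    use strict;
    use warnings;
    use Fcntl qw( :DEFAULT :seek );

    # One big file per original record position; the original 'file'
    # number times 4 is the byte offset within that big file.
    # Writing at a large offset in a fresh file leaves the untouched
    # ranges as holes on filesystems that support sparse files.
    sub write_value {
        my( $dir, $fileNo, $pos, $value ) = @_;
        my $path = "$dir/pos_$pos.dat";
        sysopen my $fh, $path, O_RDWR | O_CREAT
            or die "sysopen '$path': $!";
        binmode $fh;
        sysseek  $fh, $fileNo * 4, SEEK_SET or die "sysseek: $!";
        syswrite $fh, pack( 'N', $value ), 4 or die "syswrite: $!";
        close $fh;
    }

    sub read_value {
        my( $dir, $fileNo, $pos ) = @_;
        my $path = "$dir/pos_$pos.dat";
        sysopen my $fh, $path, O_RDONLY or die "sysopen '$path': $!";
        binmode $fh;
        sysseek $fh, $fileNo * 4, SEEK_SET or die "sysseek: $!";
        my $buf;
        sysread( $fh, $buf, 4 ) == 4 or die "sysread: $!";
        close $fh;
        return unpack 'N', $buf;
    }

Each read or write then touches exactly one 4-byte record in one of the 1,024 files, and a pos_*.dat file only consumes real disk blocks for the file numbers actually written.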


Examine what is said, not who speaks.
"Efficiency is intelligent laziness." -David Dunham
"Think for yourself!" - Abigail
"Memory, processor, disk in that order on the hardware side. Algorithm, algoritm, algorithm on the code side." - tachyon
