PerlMonks  

Re^3: Combining Ultra-Dynamic Files to Avoid Clustering (Ideas?)( A DB won't help)

by bgreenlee (Friar)
on Jul 24, 2004 at 17:36 UTC (#377146=note)


in reply to Re^2: Combining Ultra-Dynamic Files to Avoid Clustering (Ideas?)( A DB won't help)
in thread Combining Ultra-Dynamic Files to Avoid Clustering (Ideas?)

A 2GB filesize limit is definitely a problem with the big-file approach. Two possible ways around it if you still want to go this route:
- the obvious one: split the big file up into n files. This would also make the "growing" operation less expensive.
- if some subfiles aren't growing much at all, you could shrink the space allocated to them at the same time you do the grow operation.
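For illustration, here's one way the n-file split could work: give every subfile a fixed-size slot and pack as many slots as will fit under the 2GB line into each chunk file. All the names and sizes below are made up for the sketch; a real layout would need per-subfile slot sizes and a resize path.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical layout: each subfile gets a fixed-size slot, and slots
# are packed into chunk files that stay under the 2GB limit. Given a
# subfile index and an offset within that subfile, work out which
# chunk file to open and where to seek.
my $SLOT_SIZE       = 64 * 1024 * 1024;    # 64MB preallocated per subfile
my $CHUNK_SIZE      = 2_000_000_000;       # safely under 2GB
my $SLOTS_PER_CHUNK = int($CHUNK_SIZE / $SLOT_SIZE);

sub locate {
    my ($subfile_idx, $offset) = @_;
    die "offset beyond slot" if $offset >= $SLOT_SIZE;
    my $chunk = int($subfile_idx / $SLOTS_PER_CHUNK);
    my $seek  = ($subfile_idx % $SLOTS_PER_CHUNK) * $SLOT_SIZE + $offset;
    return ("bigfile.$chunk", $seek);    # chunk filename is hypothetical
}

my ($file, $pos) = locate(40, 1234);
print "$file at $pos\n";
```

Growing a subfile past its slot would mean moving it to a bigger slot (or a chunk of its own), which is where the shrink-the-quiet-ones trick above would pay for itself.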

Actually, if you wanted to get really spiffy, you could have it automatically split the big file in half when it hits some threshold, then split any of the resulting sub-files as they hit the threshold, and so on.
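A minimal sketch of that recursive split, just the bookkeeping side: each chunk covers a range of subfile slots, and when its byte size crosses the threshold it's replaced by two chunks covering half the range each. The structure and field names are hypothetical, and the actual on-disk copy of the data is left out.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical chunk records: lo/hi are the slot range a chunk holds,
# size is its current byte size. When a chunk outgrows the threshold,
# splice in two half-range chunks where it was; either half can be
# split again later when it, too, hits the limit.
sub split_chunk {
    my ($chunks, $i, $threshold) = @_;
    my $c = $chunks->[$i];
    return unless $c->{size} > $threshold;
    my $mid  = int(($c->{lo} + $c->{hi}) / 2);
    my $half = int($c->{size} / 2);
    splice @$chunks, $i, 1,
        { lo => $c->{lo},  hi => $mid,     size => $half },
        { lo => $mid + 1,  hi => $c->{hi}, size => $c->{size} - $half };
}
```

Calling this from whatever code notices a chunk crossing the threshold keeps every file bounded without deciding n up front.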

BerkeleyDB is definitely sounding easier...but I still think this would be a lot of fun to write! (Might be a good Meditation topic...there are times when you might want to just DIY because it would be fun and/or a good learning experience.)

Brad
