in reply to Combining Ultra-Dynamic Files to Avoid Clustering (Ideas?)

I would use a database like SQLite instead of a multi-megabyte flat file hammered with random access.

You cannot insert data into the middle of a flat file. You could allocate a gigantic file and pre-subdivide it into fixed-length records, each large enough that you'll never fill one completely, but that's fragile and doesn't scale. You could also use Tie::File to treat the file as an array, but massive numbers of mid-array inserts are very slow with a tied array, because behind the scenes it's still just rewriting a flat file.
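To see why, here's what a mid-array insert with Tie::File looks like (a sketch; the filename and insert position are made up):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Tie::File;

# Tie the file to an array; each array element is one line of the file.
tie my @lines, 'Tie::File', 'data.txt' or die "Cannot tie data.txt: $!";

# A mid-array insert: every line after the insertion point has to be
# rewritten on disk, which is why doing thousands of these is so slow.
splice @lines, 500, 0, 'new record';

untie @lines;
```

One splice is fine; a million of them means rewriting the tail of the file a million times.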

This really is a problem best dealt with via a database. I'm not positive SQLite is the best one for the job, but it is easy to install, self-contained, and stores all of its data in one file.
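For comparison, here's a sketch of the same "insert in the middle" operation with DBD::SQLite (the database filename, table, and column names are all made up):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# One self-contained database file; DBD::SQLite creates it on first connect.
my $dbh = DBI->connect( 'dbi:SQLite:dbname=records.db', '', '',
    { RaiseError => 1, AutoCommit => 1 } );

$dbh->do('CREATE TABLE IF NOT EXISTS records (pos INTEGER, payload TEXT)');

# "Inserting into the middle" is just an INSERT with the right key value;
# nothing else on disk gets shuffled around.
my $sth = $dbh->prepare('INSERT INTO records (pos, payload) VALUES (?, ?)');
$sth->execute( 500, 'new record' );

# Read everything back in order, as if it were one big sorted file.
for my $row (
    @{ $dbh->selectall_arrayref('SELECT payload FROM records ORDER BY pos') } )
{
    print "$row->[0]\n";
}

$dbh->disconnect;
```

With an index on `pos`, each insert is cheap no matter where it lands, which is exactly the property the flat-file approaches lack.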