PerlMonks  

Re: Storing/Retrieving images as blobs

by chrestomanci (Priest)
on Sep 26, 2011 at 09:43 UTC ( #927833 )


in reply to Storing/Retrieving images as blobs

I don't think you will see performance problems as your table fills up with images, because the number of images stored will be relatively low, and your database will not be trying to index the binary data in the blobs.

If you had a normal 3GB table where the cells contained text, numbers or foreign keys, then it might easily hold 100 million rows, and the database would be maintaining indexes and foreign-key relationships on all those rows, which would be a lot of work. In your case, your 3GB table probably contains only about 150 thousand rows, which is fairly small, so there is not much work to index.

I think the total size of the data will only become a problem when it becomes an issue for the underlying file system that your database uses to store its data. If it fills the disk to the point that the disk becomes heavily fragmented, or suchlike, then there will be a problem, but you would have had those problems if you were storing the files on disk anyway.

Having said that, I don't think you should be storing the image files in your database, because doing so makes the database a lot larger, which in turn makes it harder to back up, restore, or run in a cluster. The problem is that it is generally hard to incrementally back up a database without deep knowledge of its schema, so most backups will be full backups. On the other hand, it is easy to incrementally back up a directory full of files. So I think your best long-term strategy is to keep the binary data out of the database (and so keep the database small), store your images separately on the file system, and develop a backup procedure that covers both the database and the image file system. Seeing as everyone needs to back up file systems, it should be very easy to find a suitable tool for that part.
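To illustrate why the file-system half of that backup is easy: the core of an incremental backup is just "find everything modified since last time", which needs no schema knowledge at all. A minimal sketch (the name changed_since and the timestamp convention are my own invention, not from the OP's setup):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

# Collect every regular file under $dir whose modification time is newer
# than the previous backup's timestamp. This is the heart of any
# incremental file-system backup, and exactly the step that is hard to
# do inside a database without deep knowledge of its schema.
sub changed_since {
    my ($dir, $last_backup_epoch) = @_;
    my @changed;
    find(sub {
        # -f populates the stat cache, so (stat _) reuses it cheaply.
        push @changed, $File::Find::name
            if -f $_ && (stat _)[9] > $last_backup_epoch;
    }, $dir);
    return @changed;
}
```

In practice you would hand the resulting list to tar, rsync, or whatever backup tool you already use, alongside a periodic full dump of the (now small) database.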

Contrary to what dHarry said, I don't think a huge number of files in one directory will hurt file-system performance: modern Linux file systems use efficient data structures to store the list of files in a directory, and can easily cope with thousands of files per directory. However, you should probably split things up anyway for your own sanity, because while the file system will cope fine with 100_000 files in a directory, ls, or worse a GUI file browser, will not cope so well.
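A common way to do that splitting is to shard files into subdirectories keyed on a hash of the file name. This is a sketch of the general pattern, not anything from the OP's code; the base directory name and the two-level, two-hex-character layout are arbitrary illustrative choices:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Digest::MD5 qw(md5_hex);

# Map a file name to a sharded relative path such as images/ab/cd/photo.jpg,
# using the first four hex characters of its MD5 digest. Two levels of up to
# 256 subdirectories each spread the files out, so no single directory grows
# to the sizes that make ls or a GUI file browser painful.
sub sharded_path {
    my ($base_dir, $filename) = @_;
    my $digest = md5_hex($filename);
    return join '/', $base_dir,
                     substr($digest, 0, 2),
                     substr($digest, 2, 2),
                     $filename;
}

print sharded_path('images', 'photo.jpg'), "\n";
```

Because the path is derived deterministically from the file name, you can recompute it at lookup time and need not store anything extra in the database beyond the name itself.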


Re^2: Storing/Retrieving images as blobs
by dHarry (Abbot) on Sep 26, 2011 at 11:38 UTC

    Maybe I should have clarified what I meant by "huge"; obviously a few thousand is not huge. I am thinking more of millions of small files, and OSes normally don't like directories with that many entries. (The OP doesn't have millions of files, though.) It's nice that some OSes have better support for this nowadays, but the OP didn't specify his OS/version; maybe he is using an old Unix/Windows OS? Chopping things up into subdirectories will also improve search/access times.
