 
PerlMonks  

OT - database recovery - Re: Handling huge BLOBs

by gmax (Abbot)
on Mar 08, 2002 at 15:56 UTC ( #150347=note )


in reply to Re: Re: Handling huge BLOB fields with DBI and MySQL
in thread Handling huge BLOB fields with DBI and MySQL

Recovering a database can be as easy as restoring your latest backup and restarting business, if you are well organized.

If you are using binary logs, the system can recover fairly easily. BLOBs are not a problem here; they are just more data in your database.
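A point-in-time restore from binary logs is, roughly, a two-step affair: reload the last full dump, then replay the logs written since it. A minimal sketch, assuming the server was started with binary logging enabled (the database name and file names below are placeholders, not anything from the thread):

```shell
# 1. Restore the most recent full backup (placeholder file name).
mysql mydb < weekly_full_backup.sql

# 2. Replay every binary log written after that backup was taken,
#    in order, to re-apply the lost changes -- BLOB inserts included.
mysqlbinlog mydb-bin.001 mydb-bin.002 | mysql mydb
```

BLOB rows need no special handling in this scheme: the binary log records the same INSERT/UPDATE statements for them as for any other column.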

As for organizing yourself, you might have noticed that I added a TIMESTAMP field to my table. This way, I can take an incremental backup of the rows that were modified in a given timeframe, to integrate with a full weekly backup.
The subject deserves more space than we can dedicate here. The matter is explained much better than this in Paul DuBois' book, MySQL.
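The timestamp-based incremental pass described above can be done with mysqldump alone. A sketch, assuming the TIMESTAMP column is named ts and the table is the pictures table from the parent thread (both names are my guesses, not confirmed by the post):

```shell
# Dump only the rows touched since the last full backup.
# --no-create-info keeps the file data-only, so replaying it
# will not drop and recreate the table.
mysqldump --no-create-info \
    --where="ts >= '2002-03-01 00:00:00'" \
    mydb pictures > incremental.sql
```

Restoring then means loading the weekly full dump first and applying incremental.sql on top of it with `mysql mydb < incremental.sql`.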

Personally, I would say that storing BLOBs in sparse files makes your task more difficult, but TMTOWTDI, after all, and I might be wrong. Let's say that I am just more comfortable with my current architecture.

 _  _ _  _  
(_|| | |(_|><
 _|   

