Re: Handling huge BLOB fields with DBI and MySQL
by mpeppler (Vicar) on Mar 08, 2002 at 19:48 UTC
Interesting way to handle this sort of thing within the limits of your database server.
Personally I'm with lachoy here - I avoid storing BLOBs in the database because the interface to fetch/store them is often convoluted.
On the other hand storing everything in the database potentially simplifies the administration of your data, so you probably need to balance one against the other.
As usual I'll now give you the Sybase POV :-)
Sybase stores BLOBs in IMAGE or TEXT columns. These are variable-size datatypes that can hold up to 4GB per value (assuming the database itself is large enough); they are not constrained by the size of any single database device or by the filesystem limits of the underlying OS.
IMAGE/TEXT columns can be written to and read from with normal SQL statements. However, as with MySQL, the server places a default limit on the amount of data that can be transferred in a single operation (see the TEXTSIZE option).
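As a quick sketch of raising that limit from DBI (the DSN, credentials, and the 10MB figure are placeholders; I believe DBD::Sybase also honors the standard LongReadLen attribute by issuing an equivalent set textsize, but check the driver docs for your version):

```perl
use strict;
use warnings;
use DBI;

# Connect to the Sybase server (DSN details are placeholders).
my $dbh = DBI->connect('dbi:Sybase:server=SYBASE', 'user', 'password',
                       { RaiseError => 1 });

# Raise the per-operation transfer limit with plain T-SQL...
$dbh->do("set textsize 10485760");    # allow up to 10MB per TEXT/IMAGE value

# ...or via the portable DBI attribute, which DBD::Sybase translates
# into a "set textsize" on the server.
$dbh->{LongReadLen} = 10_485_760;
```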
To circumvent this limit (or to insert really large BLOBs) you can use special API calls to store/fetch the data in chunks. This API is available in DBD::Sybase, Sybase::CTlib and Sybase::DBlib, but it's a little convoluted (especially for writes, where you first have to fetch a "text pointer" and then call the write function with this pointer).
For example, with DBD::Sybase you'd do something like this (taken from the DBD::Sybase man page):
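The chunked-write example from the DBD::Sybase documentation goes roughly like this (the imgtable table, img column, and $image variable are illustrative; see the man page for the authoritative version):

```perl
# Select the row whose TEXT/IMAGE column we want to update; fetching
# the row populates the CS_IODESC structure (the "text pointer") that
# the chunked-write API needs.
my $sth = $dbh->prepare("select img from imgtable where id = 1");
$sth->execute;
while ($sth->fetch) {    # we don't care about the data itself
    $sth->func('CS_GET', 1, 'ct_data_info');   # grab the text pointer
}

# Now do the update: start the chunked send for column 1.
$sth->func('ct_prepare_send');

# Declare the total size of the new value, and make the write unlogged.
$sth->func('CS_SET', 1,
           { total_txtlen => length($image), log_on_update => 0 },
           'ct_data_info');

# Transfer the data - in a single chunk here, but this call can be
# repeated to send the value piecemeal.
$sth->func($image, length($image), 'ct_send_data');

# Commit the operation.
$sth->func('ct_finish_send');
```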
Like I said - the API is pretty convoluted...