http://www.perlmonks.org?node_id=150408


in reply to Handling huge BLOB fields with DBI and MySQL

Interesting way to handle this sort of thing within the limits of your database server.

Personally I'm with lachoy here - I avoid storing BLOBs in the database because the interface to fetch/store them is often convoluted.
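
The usual alternative is the filesystem approach: keep the bytes in a file and store only the path in a table. A minimal sketch, with made-up table, column and path names ($dbh, $id and $tmpfile are assumed to exist already):

use File::Copy;

# copy the incoming file somewhere permanent, then record only its path
my $path = "/data/blobs/$id";
copy($tmpfile, $path) or die "Can't copy $tmpfile: $!";
$dbh->do("insert into blobs (id, path) values (?, ?)",
         undef, $id, $path);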

On the other hand, storing everything in the database potentially simplifies the administration of your data (one system to back up and keep consistent), so you need to balance one concern against the other.

As usual, I'll now give you the Sybase POV :-)

Sybase stores BLOBs in IMAGE or TEXT columns. These are variable-size datatypes that can store up to 4GB per value, assuming the database is large enough - they are not limited by the size of any single database device or by the filesystem limits of the underlying OS.
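
Declaring them is nothing special - a sketch via plain DBI, reusing the imgtable layout from the example further down:

# create a table with an IMAGE column
$dbh->do(q{
    create table imgtable (
        id  int   not null,
        img image null
    )
});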

IMAGE/TEXT columns can be written to and fetched from with normal SQL statements - however, as with MySQL, the server sets a default limit on the amount of data that can be transferred in a single operation (see the TEXTSIZE option).
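
The limit is per connection, so you can inspect and raise it yourself with plain SQL - a sketch (the 10MB figure is arbitrary):

# @@textsize holds the current per-connection limit
my ($cur) = $dbh->selectrow_array('select @@textsize');

# raise it for this session so larger values come back whole
$dbh->do("set textsize 10000000");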

To circumvent this limit (or to insert really large BLOBs) you can use special API calls to store/fetch the data in chunks. This API is available in DBD::Sybase, Sybase::CTlib and Sybase::DBlib, but it's a little convoluted (especially for writes, where you first have to fetch a "text pointer" and then call the write function with this pointer).

For example, with DBD::Sybase you'd do something like this (taken from the DBD::Sybase man page):

# update a database entry with a new version of a file:
my $size = -s $file;

# first we need to find the CS_IODESC data for the data
$sth = $dbh->prepare("select img from imgtable where id = 1");
$sth->execute;
while($sth->fetch) {    # don't care about the data!
    $sth->func('CS_GET', 1, 'ct_data_info');
}

# OK - we have the CS_IODESC values, so do the update:
$sth->func('ct_prepare_send');

# Set the size of the new data item (that we are inserting), and make
# the operation unlogged
$sth->func('CS_SET', 1, {total_txtlen => $size, log_on_update => 0},
           'ct_data_info');

# open the file, and store it in the db in 1024 byte chunks.
open(IN, $file) || die "Can't open $file: $!";
while($size) {
    $to_read = $size > 1024 ? 1024 : $size;
    $bytesread = read(IN, $buff, $to_read);
    $size -= $bytesread;
    $sth->func($buff, $bytesread, 'ct_send_data');
}
close(IN);

# commit the operation
$sth->func('ct_finish_send');

Like I said - the API is pretty convoluted...
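
At least reads can dodge most of that: once TEXTSIZE is high enough, an ordinary SELECT brings the whole value back. A sketch, with made-up file and table names:

# raise the limit so it covers the stored value's size
$dbh->do("set textsize 100000000");

my ($img) = $dbh->selectrow_array(
    "select img from imgtable where id = 1");

# write the raw bytes back out to a file
open(OUT, "> $outfile") || die "Can't open $outfile: $!";
binmode(OUT);    # binary data - no newline translation
print OUT $img;
close(OUT);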

Michael