PerlMonks

Difficulty inserting a large object using DBD::Pg

by kudra (Vicar)
on Jun 29, 2000 at 15:46 UTC ( #20355=perlquestion )
kudra has asked for the wisdom of the Perl Monks concerning the following question:

I am trying to insert a large object in a PostgreSQL database, but am having difficulties finding a graceful way to do this. Note that the included code has been simplified to avoid distractions--I am well aware that I should be addressing possible errors.

System information:

  • perl, version 5.005_03
  • Linux 2.2.14 i686
  • Postgres: 6.5.3
  • DBI: 1.1.3
  • DBD::Pg: 0.93

I have created a table with one field, data, which is of type OID. Next I attempt to insert by more or less copying the code on page 153 of 'Programming the Perl DBI':

my $sth = $dbh->prepare("insert into foo(data) values (?);");
$sth->bind_param(1, $value, SQL_LONGVARBINARY);
$sth->execute();
I receive a warning that binding the value isn't permitted:
SQL type 1075548612 for 'rc' is not fully supported, bound as VARCHAR instead at ./1.0 line 122.
Whereupon it fails with a pg_atoi error.

I can insert by writing $value to a file and using the (presumably Postgres-specific) function lo_import:

$dbh->prepare("insert into foo(data) values (lo_import('/tmp/lame'));")
I'm not pleased with this solution because it requires the creation of a temporary file and it seems to undermine the use of DBI to use a database-specific insert.
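Spelled out, that workaround looks something like this (a sketch only; File::Temp is my stand-in for creating the temporary file, and error handling is omitted):

```perl
use File::Temp qw(tempfile);

# Sketch of the lo_import workaround: dump the scalar to a temporary
# file, then let the server-side lo_import() read it back in.
my ($fh, $filename) = tempfile();
print $fh $value;
close $fh;

# lo_import() runs in the backend, so the file must be readable by the
# server process -- another drawback of this approach.
$dbh->do("insert into foo(data) values (lo_import('$filename'))");
unlink $filename;
```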

I imagine some of the problem has to do with the unusual way Postgres deals with large objects. I expected that DBD::Pg would compensate for that, perhaps by using lo_import internally while still accepting the placeholder form shown earlier.

The fact that I can't seem to find any documentation mentioning this problem (which I imagine must have occurred before) makes me feel as if I've completely failed to grasp something simple (or that Pg's documentation-dearth is contagious, affecting anything that touches Postgres). Is there a better way to insert a large object in Postgres using DBI? Thanks in advance.

Re: Difficulty inserting a large object using DBD::Pg
by httptech (Chaplain) on Jun 29, 2000 at 16:13 UTC
    Do you get an error when just using something like:
    $dbh->do("insert into foo(data) values (?)", undef, $value);
    By the way, the semicolon after (?) would cause an SQL error in DBD::mysql; does that actually work under DBD::Pg?
Re: Difficulty inserting a large object using DBD::Pg (kudra: do doesn't fix)
by kudra (Vicar) on Jun 29, 2000 at 17:50 UTC
    I've never had a problem with the semicolon, but leaving it out seems to make no difference.

    Yes, I do get an error with what you suggested. Here's what I tried:

    $value = "foo"; # test value
    $dbh->do("insert into foo(id, test) values (6,?)", undef, $value);
    It also gives a pg_atoi error; $dbh->errstr reports:
    ERROR: pg_atoi: error in "foo": can't parse "foo"
      Why don't you try this? Make sure the database column is the type you wish to use and then just use
      $sth->bind_param(1, "$value")
      This should place the value of $value into the database field, as long as the field is of a type that can handle that input.
        This does not work because it attempts to resolve the type itself, and defaults to varchar, as it did when I explicitly selected the type. It's another pg_atoi error.
Re: Difficulty inserting a large object using DBD::Pg
by httptech (Chaplain) on Jun 29, 2000 at 18:36 UTC
    I don't know PostgreSQL quite as well as MySQL, but shouldn't an OID be an integer?
      OID isn't exactly an integer (if I recall correctly); it's more of a pointer to the data, which is automatically generated (for each tuple, as well as for an OID field).

      But I gave it a try, using my original code but specifying SQL_INTEGER instead, and got the following error:

      ERROR: Attribute blah not found
      'blah' is the value of $value, btw.
Re: Difficulty inserting a large object using DBD::Pg
by httptech (Chaplain) on Jun 29, 2000 at 19:11 UTC
    I found this when reading the documentation on the PostgreSQL site:
    A column of type OID is similar to an INTEGER column, but defining it as type OID documents that the column holds OID values.

    The second snippet you showed using "do" works when $value is an integer. Also, since an "atoi error" points to a problem while converting ASCII to an integer, I think the evidence points to OIDs being integer-only.

      I'm not sure what you're suggesting. Is it:
      my $value = "111";
      $dbh->do("insert into foo(id, test) values (8, ?)", undef, $value);
      Certainly that will allow an insert, but then 111 becomes the value of the column, not an OID. Exporting it therefore doesn't work:
      foobar=> select lo_export('test', '/tmp/x') from foo where id=8;
      ERROR: pg_atoi: error in "test": can't parse "test"
      If the OID is a valid number, I imagine it will work, but then I still have to insert the data and get an OID, which puts me back to using lo_import. Or did I misunderstand your suggestion?
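For what it's worth, the export step itself does work once lo_export is handed the OID actually stored in the column rather than a quoted column name; roughly (a sketch, untested against this exact schema):

```perl
# Fetch the OID stored in the data column, then hand it to lo_export(),
# which writes the large object's contents to a server-side file.
my ($oid) = $dbh->selectrow_array("select data from foo where id=8");
$dbh->do("select lo_export($oid, '/tmp/x')");
```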
Re: Difficulty inserting a large object using DBD::Pg
by httptech (Chaplain) on Jun 29, 2000 at 19:44 UTC
    I think I am lost on what you are trying to do. As I understand it, there is the OID which PostgreSQL automatically assigns to each row, and the OID type which you can assign to a column. But the OID column you create yourself does not act like an OID; it's just an integer that is supposed to hold OID numbers from another table.
      I think I'm getting a bit lost, too--that last post reads like a children's song: "OID kudra had a db, OI-OI-D!"

      Yes, there is an OID which Postgres assigns to each row.
      Yes, there is also an OID type which you can assign to a column.

      If you look in a table containing an OID column, you will see a number which was created by pg, which lets it know how to find the large object data. You don't create the number, you just give it the data and it uses the number for itself. My problem is with inserting the data.

      Update: Just a few details from various chatterbox discussions:
      I'm using OID because that's the Postgres implementation of a BLOB. I can't use TEXT, which is limited by page size (as far as I know, and I have looked).

Re: Difficulty inserting a large object using DBD::Pg
by httptech (Chaplain) on Jun 29, 2000 at 20:38 UTC
    Ok, reading the PostgreSQL documentation on blobs helped :) From what I can tell, there is no way to write a blob directly from a scalar; instead you have to use the lo_import and lo_export functions with disk-based files, the way you are doing it now. As much as MySQL gets picked on for not being a "real" RDBMS, it seems a lot better suited for what you want to do here than PostgreSQL.
      We're back where we started! Thanks for giving it a whirl, httptech.

      I'm not especially satisfied with pg's handling of large objects, but I want transactions for this, so MySQL isn't the way to go. I guess I'll just go with the pg-specific route and hope that DBD::Pg 1.0 will offer a way to fake normal lob handling.

(kudra: update) Difficulty inserting a large object using DBD::Pg
by kudra (Vicar) on Aug 25, 2000 at 13:48 UTC
    In case anyone else has experienced this problem as well, here's an update of what I've learned since posting the question.
    I spoke with H. Merijn Brand (author of DBD::Unify) at the July Amsterdam.pm meeting to see if he had any insights. According to him (and if this is wrong I probably am remembering it incorrectly), large object handling probably isn't implemented in the Postgres DBD because DBI currently lacks the specifications for doing so, which is the reason he did not implement large objects in his own driver. According to the DBI FAQ section 5.2, the answer to 'How do I handle BLOB data with DBI' is:
    If the BLOB can fit into your memory then setting the LongReadLen attribute to a large enough value should be sufficient.

    If not, ... To be written.

    Postgres of course doesn't implement LongReadLen because of how it handles large objects. According to the DBI book ('Programming the Perl DBI' by Alligator Descartes and Tim Bunce, O'Reilly, ISBN 1-56592-699-4), the only method implemented by DBD::Pg is blob_read(), which is 'undocumented' (DBD::Pg has the following to say about it: 'Supported by this driver as proposed by DBI. Implemented by DBI but not documented, so this method might change.').

    Based upon all this, I decided to use DBD::Pg's driver-specific functions (lo_creat, lo_open, lo_import, etc.) until something else is available.
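    A sketch of that route, using the lo_* calls DBD::Pg exposes through the func() method (the pg_INV_WRITE database-handle attribute supplies the open mode; error checking is omitted, and the large-object calls must run inside a transaction):

```perl
use DBI;

# Sketch: store a scalar as a Postgres large object via DBD::Pg's
# driver-specific func() interface, then record the resulting OID.
$dbh->{AutoCommit} = 0;            # lo_* calls require a transaction

my $mode    = $dbh->{pg_INV_WRITE};                    # open for writing
my $lobj_id = $dbh->func($mode, 'lo_creat');           # create large object
my $lobj_fd = $dbh->func($lobj_id, $mode, 'lo_open');  # open it
$dbh->func($lobj_fd, $value, length($value), 'lo_write');
$dbh->func($lobj_fd, 'lo_close');

# Store the OID in the table so lo_export() can find the data later.
$dbh->do("insert into foo(data) values ($lobj_id)");
$dbh->commit;
```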
