
Re^3: Using DBIx::Class::Schema::Loader to determine constraints

by pokki (Monk)
on Oct 31, 2012 at 22:43 UTC ( #1001755=note )

in reply to Re^2: Using DBIx::Class::Schema::Loader to determine constraints
in thread Using DBIx::Class::Schema::Loader to determine constraints

(snip) It has been suggested that I try DBIx::Class::Schema::Loader. As far as I can see, SQL::Translator supports all of the DBMSs I want to support, so I am fine with SQL::Translator, too.

That would be because DBICSL uses SQL::Translator behind the scenes, as far as I can tell. Using SQL::Translator directly is just cutting out the middle man.

What I could not get from the documentation is the expected format (for each supported DBMS) of the database-structure dump passed as the second parameter to parse(). Do you know the expected format?

The parse() function expects the raw DDL as a string. In my example, data.sql is just your DDL as is:

CREATE TABLE a (u integer, PRIMARY KEY (u));

CREATE TABLE t (i integer, j integer, u integer,
    PRIMARY KEY (i, j),
    CONSTRAINT fkey1 FOREIGN KEY (u) REFERENCES a (u));

CREATE TABLE s (j integer, r1 integer, r2 integer,
    CONSTRAINT fkey FOREIGN KEY (r1, r2) REFERENCES t (i, j));
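For what it's worth, here is a minimal sketch of feeding such a string to SQL::Translator (assuming the module is installed; the SQLite parser and MySQL producer are just illustrative choices):

```perl
use strict;
use warnings;
use SQL::Translator;

# The DDL above, as one raw string -- exactly what the parser receives.
my $ddl = <<'SQL';
CREATE TABLE a (u integer, PRIMARY KEY (u));
CREATE TABLE t (i integer, j integer, u integer,
    PRIMARY KEY (i, j),
    CONSTRAINT fkey1 FOREIGN KEY (u) REFERENCES a (u));
SQL

my $translator = SQL::Translator->new(
    parser   => 'SQLite',
    producer => 'MySQL',
);

# translate() also accepts filename => 'data.sql' instead of a string ref.
my $mysql_ddl = $translator->translate( data => \$ddl )
    or die $translator->error;
print $mysql_ddl;
```

The same schema object sits behind whichever producer you pick, so swapping 'MySQL' for 'PostgreSQL' or 'Oracle' is a one-word change.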

The various SQL::Translator::Parser::$FOOPARSER classes document individually what grammar they support. Sometimes a few bits are missing, but the author and the community are rather responsive and it's not too hard to patch it yourself if you know the basics of writing a grammar. "Normal" MySQL syntax is a problem, though. The ANSI compatibility mode of mysqldump seems to produce parser-friendlier output, but even then I usually have to tweak it until the parser accepts it.

At $work I maintain an SQLite schema as the reference DDL for a database ("if SQLite supports it, everything supports it -- and if they don't they have no excuse"). DBICSL (through SQL::Translator) is pretty good at turning it into a DBIC schema, which then produces MySQL-compatible DDL for deployment (and diff files for upgrading). It even knows about a few things not encoded in the grammar itself; for example, old versions of MySQL don't support VARCHAR columns larger than 255, so VARCHAR(500) in the DDL would eventually turn into a TEXT column. So far everything I've thrown at it (constraints, keys, cascade actions) has been supported, though cascade action support is recent. I've only ever had trouble with triggers, and had to work around that by providing custom code for each supported DBMS.
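The diff part of that workflow can be sketched roughly like this, using SQL::Translator::Diff (which ships with SQL::Translator); the two DDL versions here are hypothetical stand-ins for the reference schema:

```perl
use strict;
use warnings;
use SQL::Translator;
use SQL::Translator::Diff;

# Hypothetical old and new versions of the SQLite reference DDL.
my $v1 = 'CREATE TABLE a (u integer, PRIMARY KEY (u));';
my $v2 = 'CREATE TABLE a (u integer, name varchar(500), PRIMARY KEY (u));';

# Parse each version into a schema object.
my @schemas = map {
    my $t = SQL::Translator->new( parser => 'SQLite' );
    $t->translate( data => \$_ ) or die $t->error;
    $t->schema;
} ( $v1, $v2 );

# Emit MySQL ALTER statements taking v1 to v2.
my $alter = SQL::Translator::Diff::schema_diff(
    $schemas[0], 'MySQL',
    $schemas[1], 'MySQL',
);
print $alter;
```

DBIC's deployment helpers wrap essentially this, but calling it directly is handy for eyeballing what an upgrade script will contain.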

Replies are listed 'Best First'.
Re^4: Using DBIx::Class::Schema::Loader to determine constraints
by jds17 (Pilgrim) on Oct 31, 2012 at 23:53 UTC

    Thank you for the clarification! SQL::Translator looks really impressive, but I fear that creating the dumps for each DBMS is nontrivial (db user rights?) and that applying SQL::Translator to those dumps may not be stable enough for my purpose.

    Recently, I have put a module on CPAN (DBIx::Table::TestDataGenerator). Currently, the methods that query constraint-related database metadata have been abstracted into a role, and the plan is to write, for each DBMS I want to support, a class consuming that role. This gives me good control over how the metadata is queried, and I can even handle relevant DBMS version differences.

    I would not want the code to crash because a dump cannot be created or read. On the other hand, I don't want to reinvent the wheel, and I acknowledge that handling each DBMS separately does not look too elegant (although one learns a lot doing it); maybe you can still convince me that using SQL::Translator is better.

      From what I can see, you're currently using the same kind of tricks (table_info) as the SQL::Translator::Parser::DBI::$Driver modules. These have also worked for me in the past. Basically instead of passing them DDL in a big string, you give them a DSN, username and password (an open database handle works too). Then SQL::Translator::Parser::DBI selects the appropriate $Driver, which goes and fetches the metadata directly from the database.

      If you have enough permissions to use table_info, you have enough permissions to use the DBI parsers.
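      A quick way to see what those parsers get to work with, using an in-memory SQLite handle as a stand-in (DBD::SQLite assumed; the same calls work on a Pg or MySQL handle):

```perl
use strict;
use warnings;
use DBI;

# In-memory SQLite stands in for a real Pg/MySQL connection here.
my $dbh = DBI->connect( 'dbi:SQLite:dbname=:memory:', '', '',
    { RaiseError => 1 } );
$dbh->do('CREATE TABLE t (i integer, j integer, PRIMARY KEY (i, j))');

# table_info() is the metadata call the DBI parsers rely on.
my $sth = $dbh->table_info( undef, undef, '%', 'TABLE' );
my @tables;
while ( my $row = $sth->fetchrow_hashref ) {
    push @tables, $row->{TABLE_NAME};
}
print "tables: @tables\n";

# primary_key() needs no special grants either.
my @pk = $dbh->primary_key( undef, undef, 't' );
print "pk: @pk\n";
```

If that runs against your real database as the user your module will connect as, the DBI parsers will work too.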

      The doc for the DBI parser says that Oracle is not supported and Pg support is experimental, but if you look at the module list for the distribution there *is* a DBI::Oracle class, so the doc is probably out of date. I'd say, install SQL::Translator, play around on real databases, see if it fits your needs.

        I have just tried out what you wrote in your last post and I like the idea of doing it this way. To be more specific, this is what I tried:

        use strict;
        use warnings;
        use DBI;
        use SQL::Translator;

        my $dbh = DBI->connect( 'dbi:Pg:dbname=playground', 'postgres', '*******' );

        my $translator = SQL::Translator->new(
            parser      => 'DBI',
            parser_args => { dbh => $dbh, },
        );
        $translator->translate();

        my $schema = $translator->schema;
        for my $rsrc_name ( $schema->get_tables() ) {
            my $rsrc = $schema->get_table($rsrc_name);
            ...
        }
        ...

        The Pg support indeed is not complete: I tested with a table having a two-column primary key, and the parse() method of SQL::Translator::Parser::DBI::PostgreSQL failed (currently it only works with one-column primary keys).

        I have posted a bug report and a proposed patch (see here). The bug was easy to fix. So one possibility would be to continue with my per-DBMS handling; the other would be to help find, and maybe help fix, the remaining bugs in the SQL::Translator::Parser::DBI::XXXXX modules.
