http://www.perlmonks.org?node_id=1180363

aplonis has asked for the wisdom of the Perl Monks concerning the following question:

So in SQLite you can do a .read to run all the SQL commands inside of a *.sql file. So in effect, you can drop and build temporary tables, populate them and finish with a SELECT at the end.

So... after a .read foo.sql via the SQLite CLI, I get back rows which are the result of only that final SELECT.

It would vastly simplify my Perl if I could issue such a .read foo.sql via DBI and receive back only that final SELECT, just as fetchrow_array() would get it, rather than having to do all the table building and populating separately via DBI.

This being my wish: is there any way, via DBI, to accomplish the equivalent of a fetchrow_array() but with .read foo.sql as the SQL command?

Replies are listed 'Best First'.
Re: DBI and SQLite's .read feature
by NetWallah (Canon) on Jan 26, 2017 at 05:48 UTC
    I do not see a way to ".read" a file via the DBI or DBD::SQLite.

    That said, it is probably not a good idea to try to process commands intended for the sqlite3 command-line tool, because those files can include several "." commands, including ".pause", ".print", and various formatting directives that would make no sense to DBI.

    You should consider separating your query from database creation.

    If you still want to pursue this angle, there are a few (not recommended) options:

    1) Use system calls or backticks to run sqlite3 externally to process the .sql file.

    2) Parse and extract the SELECT statement from the .sql file, and pass it to DBI.
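
    Option 2 might look like the following minimal sketch. It assumes the final SELECT is the last statement in the file and that no earlier SELECT precedes it at the top level (the naive regex would otherwise grab too much); the file names foo.sql and foo.db are placeholders:

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Slurp the whole .sql file.
    my $sql = do {
        open my $fh, '<', 'foo.sql' or die "foo.sql: $!";
        local $/;    # slurp mode
        <$fh>;
    };

    # Naive extraction: everything from the first SELECT keyword up to
    # the semicolon that ends the file. Real SQL may need a proper parser.
    my ($select) = $sql =~ /(\bSELECT\b.*?;)\s*\z/si
        or die "no trailing SELECT found in foo.sql";

    my $dbh = DBI->connect('dbi:SQLite:dbname=foo.db', '', '',
                           { RaiseError => 1 });
    my $sth = $dbh->prepare($select);
    $sth->execute;
    while ( my @row = $sth->fetchrow_array ) {
        print join("\t", @row), "\n";
    }
    ```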

            ...it is unhealthy to remain near things that are in the process of blowing up.     man page for WARP, by Larry Wall

Re: DBI and SQLite's .read feature
by Corion (Patriarch) on Jan 26, 2017 at 07:50 UTC

    I wrote DBIx::RunSQL for running SQL statements from a file, especially to populate databases. Maybe it helps in your case as well.

    Maybe you want to look at Querylet as well.
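
    Following the DBIx::RunSQL synopsis, the usage would be roughly this sketch (file name foo.sql, database foo.db, and the summary table are placeholders for your own names):

    ```perl
    use strict;
    use warnings;
    use DBIx::RunSQL;

    # Run every statement in foo.sql against the database and get
    # back a connected handle.
    my $dbh = DBIx::RunSQL->create(
        dsn     => 'dbi:SQLite:dbname=foo.db',
        sql     => 'foo.sql',
        verbose => 0,
    );

    # The final SELECT is then issued separately through plain DBI.
    my @row = $dbh->selectrow_array('SELECT * FROM summary LIMIT 1');
    ```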

Re: DBI and SQLite's .read feature (updated)
by haukex (Archbishop) on Jan 26, 2017 at 08:53 UTC

    Hi aplonis,

    It's possible to call sqlite3 via system. However, I would strongly recommend that you do all of your SELECTing via DBI, because otherwise you're left parsing the output of the CLI tool. That said, I have in the past done something similar to the following, which you could change to .read:

    system('sqlite3', $DB_FILE, '.dump')==0 or die "sqlite3 failed, \$?=$?";

    Update: Added the first sentence above; also I found this in the DBD::SQLite docs:

    Processing Multiple Statements At A Time

    DBI's statement handle is not supposed to process multiple statements at a time. So if you pass a string that contains multiple statements (a dump) to a statement handle (via prepare or do), DBD::SQLite only processes the first statement, and discards the rest.

    If you need to process multiple statements at a time, set a sqlite_allow_multiple_statements attribute of a database handle to true when you connect to a database, and do method takes care of the rest (since 1.30_01, and without creating DBI's statement handles internally since 1.47_01). If you do need to use prepare or prepare_cached (which I don't recommend in this case, because typically there's no placeholder nor reusable part in a dump), you can look at $sth->{sqlite_unprepared_statements} to retrieve what's left, though it usually contains nothing but white spaces.
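
    Putting the quoted docs into practice might look like this sketch: run the whole multi-statement script with do, then issue the final SELECT yourself so the rows come back through the normal fetchrow_array interface (foo.sql, foo.db, and the summary table are assumed names; note that any SELECT inside the script itself has its results discarded by do):

    ```perl
    use strict;
    use warnings;
    use DBI;

    # Let $dbh->do process a whole multi-statement script,
    # per the DBD::SQLite docs quoted above.
    my $dbh = DBI->connect('dbi:SQLite:dbname=foo.db', '', '', {
        RaiseError                       => 1,
        sqlite_allow_multiple_statements => 1,
    });

    # Slurp foo.sql and run everything in it: the DROPs, CREATEs
    # and INSERTs that build the temporary tables.
    my $script = do {
        open my $fh, '<', 'foo.sql' or die "foo.sql: $!";
        local $/;
        <$fh>;
    };
    $dbh->do($script);

    # Re-issue the final SELECT to actually fetch the rows.
    my $sth = $dbh->prepare('SELECT * FROM summary');
    $sth->execute;
    while ( my @row = $sth->fetchrow_array ) {
        print join("\t", @row), "\n";
    }
    ```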

    Hope this helps,
    -- Hauke D

      Yes, excellent! This worked quite well indeed.

      system('sqlite3', $DB_FILE, '.read filename.sql')==0 or die "sqlite3 failed, \$?=$?";

      The reason I wanted to do it this way is that the *.sql in question updates one table by summarizing several columns from a number of other tables, building several temporary tables in the process. There are four such *.sql files for similar summaries, each a whole page of SQL code which never changes and which I wanted to keep from complicating my Perl.

      The final query, to be issued from Perl, pulls already-summarized rows from the now easily updatable table. Thank you so much!

Re: DBI and SQLite's .read feature
by Marshall (Canon) on Jan 26, 2017 at 10:19 UTC
    I would like to give you an SQLite hint that I have found very useful.

    For Perl, SQLite comes pre-installed as part of my ActiveState distribution. A simple use DBI; is all I need in Perl code. I have installed and used the MySQL command-line tool because I had a MySQL server running on a now-defunct machine.

    For SQLite, I use a Firefox plug-in, SQLite Manager. The thing is not 'perfect', but it is impressive and I recommend it. I haven't seen the need yet to install the SQLite command-line tool.
    Edit: I just tried the command "sqlite3" and it worked, much to my surprise. I guess previously I had only tried "sqlite". In any event, try the Firefox SQLite GUI.

Re: DBI and SQLite's .read feature
by Marshall (Canon) on Jan 26, 2017 at 09:46 UTC
    You wrote:
    So in SQLite you can do a .read to run all the SQL commands inside of a *.sql file. So in effect, you can drop and build temporary tables, populate them and finish with a SELECT at the end.
    From the SQLite command line tool:
    sqlite> .read create.sql

    In Perl, it is entirely possible to issue this '.read create.sql' command to the SQLite command-line tool.

    I don't know where this 'create.sql' file comes from, nor where this "final SQL SELECT" statement comes from.

    I would run the .sql file (either manually or via Perl), then use Perl to scan that .sql file for CREATE TABLE statements, extract the table names, and run SELECT * FROM statements on each via DBI. What am I missing here?

      You don't even need to scan the incoming SQL file for CREATE TABLE statements. The DBI $dbh->table_info() method will give you all tables in a database, and for an SQLite database, most of those tables will have been created by one of the SQL files.
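
      A minimal sketch of that approach, assuming a database file foo.db:

      ```perl
      use strict;
      use warnings;
      use DBI;

      my $dbh = DBI->connect('dbi:SQLite:dbname=foo.db', '', '',
                             { RaiseError => 1 });

      # Ask DBI for all plain tables; no need to parse the .sql file.
      my $sth = $dbh->table_info(undef, undef, '%', 'TABLE');
      while ( my $t = $sth->fetchrow_hashref ) {
          print "table: $t->{TABLE_NAME}\n";
      }
      ```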

        Your post is completely correct.

        I am not sure what the OP's "real problem" is. A better explanation of the "work flow" would help a lot.