PerlMonks
Re: How do I read from a compressed SQLite FTS4 database with DBD::SQLite?

by Corion (Pope)
on Nov 29, 2013 at 14:47 UTC


in reply to How do I read from a compressed SQLite FTS4 database with DBD::SQLite?

When you pass your de/compression callbacks to SQLite, did you mean to pass references?

...
$dbh->sqlite_create_function( 'compressme', 1, sub { compressor } );
$dbh->sqlite_create_function( 'uncompressme', 1, sub { uncompressor } );
...

As written, the anonymous subs call compressor and uncompressor with an empty argument list, so the data SQLite hands them never reaches your functions.

Did you mean the following?

$dbh->sqlite_create_function( 'compressme', 1, \&compressor );
$dbh->sqlite_create_function( 'uncompressme', 1, \&uncompressor );
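To illustrate the difference outside of SQLite (a standalone sketch with a made-up function, not the poster's actual code): a bareword call inside an anonymous sub discards `@_`, while a code reference forwards whatever the caller passes.

```perl
use strict;
use warnings;

sub compressor { return "got: @_" }

# Broken: the bareword call invokes compressor() with an empty
# argument list; the wrapper's own @_ is never passed along.
my $broken = sub { compressor };

# Working: a reference to compressor itself, so the argument
# arrives in @_ (sub { compressor(@_) } would also work).
my $fixed = \&compressor;

print $broken->("data"), "\n";   # prints "got: "
print $fixed->("data"), "\n";    # prints "got: data"
```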


Re^2: How do I read from a compressed SQLite FTS4 database with DBD::SQLite?
by elef (Friar) on Nov 29, 2013 at 15:36 UTC
    Thank you, that fixes it. It runs correctly with ASCII text now.
    I ran into another problem, though: a bunch of SQL logic errors on INSERT, which seem to be caused by the fact that IO::Compress::Gzip can't handle non-ASCII text, at least not in the way I've been using it.
    So, is there some flag I can set to make it handle UTF-8 text? Or should I use some other compression method? All the compression-related perl modules I found seemed to be designed to work with files; IO::Compress::Gzip was the only one that I could find that offers a simple way to compress strings.
    Also, my test db created with compression is much larger and slower than the same test db without compression, which is not exactly what I was hoping for...

      I would assume that you should explicitly encode your text to UTF-8 on compressing and explicitly decode from UTF-8 on decompressing. IO::Compress::Gzip likely only works on octets and expects octets. I wonder why it doesn't scream bloody murder...

      Also, maybe you need to explicitly set sqlite_unicode if you are reading/storing UTF-8 data in SQLite.
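For reference, a minimal connect sketch with that attribute set (the filename and other options here are placeholders, assuming a stock DBD::SQLite):

```perl
use strict;
use warnings;
use DBI;

# sqlite_unicode => 1 makes DBD::SQLite decode TEXT values from
# UTF-8 into Perl character strings on fetch, and encode them
# back to UTF-8 octets on store.
my $dbh = DBI->connect(
    'dbi:SQLite:dbname=test.db', '', '',
    { RaiseError => 1, sqlite_unicode => 1 },
);
```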

        Thanks. I did have sqlite_unicode => 1, and the db worked with non-ASCII text if I didn't try to compress it.

        This seems to fix the problem:
        sub compressor {
            my $in = shift;
            $in = encode('utf8', $in);
            my $out;
            gzip \$in => \$out;
            return $out;
        }

        sub uncompressor {
            my $in = shift;
            my $out;
            gunzip \$in => \$out;
            return decode('utf8', $out);
        }


        I tested it with some real-life sample data and the compression isn't doing too well: the source data is a 9.4MB text file that compresses down to a 2.4MB zip file. When I import it without compression, I get a 19.7MB db file. With compression, the db file is 17.0MB. That's a little smaller than the original but not enough to make it worth it. I was hoping for something in the 10MB range (~50% compression). I imagine that's because each string is compressed separately, so repeated strings or parts of strings can't be exploited during compression. Is this a lost battle? If not, I would be grateful for suggestions on a better algorithm.
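A toy demonstration of that suspicion (invented sample data, not the real corpus): compressing many similar strings one at a time, as the FTS4 compress hook must, pays a per-call header cost and starts each call with an empty dictionary, whereas one stream over the same data exploits the cross-row redundancy.

```perl
use strict;
use warnings;
use IO::Compress::Gzip qw(gzip $GzipError);

my @rows = ("the quick brown fox jumps over the lazy dog") x 100;

# Per-row compression: every gzip call carries its own header
# and cannot reference text from earlier rows.
my $separate = 0;
for my $row (@rows) {
    gzip \$row => \(my $out) or die $GzipError;
    $separate += length $out;
}

# One stream over the same rows: the repeats nearly vanish.
my $joined = join '', @rows;
gzip \$joined => \(my $whole) or die $GzipError;

printf "separate: %d bytes, one stream: %d bytes\n",
    $separate, length $whole;
```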
