PerlMonks  

Re^3: speeding up row by row lookup in a large db

by perrin (Chancellor)
on Mar 21, 2009 at 19:19 UTC ( [id://752281] )


in reply to Re^2: speeding up row by row lookup in a large db
in thread speeding up row by row lookup in a large db

Well, in your question you said the whole database was 430MB, so you can see why I would suggest loading it into RAM. Perl should be able to access more than 2GB of RAM on a 64-bit machine, and to some extent on a 32-bit one if you have the right Linux kernel.
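One way to do that without leaving SQLite is to copy the on-disk table into a `:memory:` database with ATTACH. This is a hedged sketch, not the poster's code: the file name (`lookup.db`), table name (`lookup`), and columns (`k`, `v`) are invented for illustration, and the first block only builds a throwaway disk file so the example is self-contained.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;                          # requires DBD::SQLite
use File::Temp qw(tempfile);

# Build a small on-disk database first, just so this sketch runs
# standalone; in practice your 430MB file would already exist.
my (undef, $file) = tempfile(SUFFIX => '.db', UNLINK => 1);
my $disk = DBI->connect("dbi:SQLite:dbname=$file", "", "",
                        { RaiseError => 1 });
$disk->do("CREATE TABLE lookup (k INTEGER PRIMARY KEY, v TEXT)");
$disk->do("INSERT INTO lookup VALUES (1, 'one'), (2, 'two')");
$disk->disconnect;

# The actual idea: copy the on-disk table into an in-memory
# database, so row-by-row lookups hit RAM instead of the disk file.
my $mem = DBI->connect("dbi:SQLite:dbname=:memory:", "", "",
                       { RaiseError => 1 });
$mem->do("ATTACH DATABASE '$file' AS disk");
$mem->do("CREATE TABLE lookup AS SELECT * FROM disk.lookup");
$mem->do("CREATE INDEX lookup_k ON lookup (k)");   # keep lookups indexed
$mem->do("DETACH DATABASE disk");

my ($v) = $mem->selectrow_array(
    "SELECT v FROM lookup WHERE k = ?", undef, 2);
print "$v\n";   # two
```

The in-memory copy lives only as long as the connection, so it has to be rebuilt at startup, but every lookup afterwards avoids disk I/O.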

INSERTs will definitely run faster if you only commit every 1,000 rows instead of after each one. There may be other SQLite tuning tricks as well, which you'd probably find on a mailing list or wiki devoted to SQLite. But if none of those work for you, I think MySQL is your best bet.
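The batched-commit pattern looks roughly like this with DBI. Again a sketch under assumptions: the table, columns, and the 1-to-5000 loop standing in for your real data source are all made up for illustration; the point is turning AutoCommit off and committing once per 1,000 rows.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;                          # requires DBD::SQLite

# AutoCommit off: each commit ends one transaction covering a batch
# of rows, instead of SQLite syncing to disk on every single INSERT.
my $dbh = DBI->connect("dbi:SQLite:dbname=:memory:", "", "",
                       { RaiseError => 1, AutoCommit => 0 });
$dbh->do("CREATE TABLE lookup (k INTEGER PRIMARY KEY, v TEXT)");

my $sth   = $dbh->prepare("INSERT INTO lookup (k, v) VALUES (?, ?)");
my $count = 0;
for my $k (1 .. 5000) {           # stand-in for reading your real rows
    $sth->execute($k, "value-$k");
    $dbh->commit if ++$count % 1000 == 0;   # one commit per 1,000 rows
}
$dbh->commit;                     # flush the final partial batch

my ($n) = $dbh->selectrow_array("SELECT COUNT(*) FROM lookup");
print "$n rows inserted\n";       # 5000 rows inserted
```

The final commit outside the loop matters: without it, any rows after the last multiple of 1,000 would be rolled back when the handle is closed.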

