PerlMonks  

Re: Reducing memory footprint when doing a lookup of millions of coordinates

by mellon85 (Monk)
on Feb 27, 2011 at 10:24 UTC ( #890394=note )


in reply to Reducing memory footprint when doing a lookup of millions of coordinates

Have you considered using a database instead of loading it all into memory? It may not be as fast as your current approach (except for the loading time), but IMHO the memory footprint can be reduced substantially.
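As a minimal sketch of that idea, here is a lookup against an on-disk SQLite file via DBI. The table name `features` and its columns (`chr`, `start_pos`, `end_pos`, `name`) are assumptions for illustration, not from the original post:

```perl
#!/usr/bin/perl
# Sketch: query an on-disk SQLite file instead of holding everything in
# a Perl hash. Schema and column names below are hypothetical.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect( "dbi:SQLite:dbname=features.db", "", "",
    { RaiseError => 1, AutoCommit => 1 } );

# Demo setup so the sketch is self-contained.
$dbh->do(q{
    CREATE TABLE IF NOT EXISTS features (
        chr TEXT, start_pos INTEGER, end_pos INTEGER, name TEXT
    )
});
$dbh->do( "INSERT INTO features VALUES (?, ?, ?, ?)",
    undef, 'chr1', 100, 200, 'demo_feature' );

# Look up features overlapping one coordinate; only the matching rows
# are pulled into memory, not the whole data set.
my $sth = $dbh->prepare(q{
    SELECT name FROM features
    WHERE chr = ? AND start_pos <= ? AND end_pos >= ?
});
$sth->execute( 'chr1', 150, 150 );
while ( my ($name) = $sth->fetchrow_array ) {
    print "$name\n";    # prints "demo_feature"
}
$dbh->disconnect;
```

SQLite pages data in from disk as needed, so memory use stays roughly proportional to the working set rather than to the full data file.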


Re^2: Reducing memory footprint when doing a lookup of millions of coordinates
by richardwfrancis (Beadle) on Feb 27, 2011 at 12:15 UTC
    Hi mellon85 and moritz,

    Thank you for your replies.

    I definitely like the idea of the database. I think this will work very well in the context of the rest of the program I'm writing.

    I'm thinking I could distribute the SQLite database file, preloaded with the feature data and indexed, with the software and query it when needed!

    Many thanks,

    Rich
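A hypothetical build script for such a preloaded, indexed database file might look like the following (table and column names are assumptions, not from the thread):

```perl
#!/usr/bin/perl
# Sketch: build the SQLite file once, index it, then ship it with the
# software. Schema below is hypothetical.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect( "dbi:SQLite:dbname=features.db", "", "",
    { RaiseError => 1, AutoCommit => 0 } );

$dbh->do(q{
    CREATE TABLE IF NOT EXISTS features (
        chr       TEXT    NOT NULL,
        start_pos INTEGER NOT NULL,
        end_pos   INTEGER NOT NULL,
        name      TEXT
    )
});

# Load the feature data here (one INSERT per row, inside a transaction
# for speed), then index the lookup columns before shipping the file so
# users never pay the indexing cost at run time.
$dbh->do(q{
    CREATE INDEX IF NOT EXISTS idx_chr_start ON features (chr, start_pos)
});
$dbh->commit;
$dbh->disconnect;
```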

      The database is a really good idea. I doubt that SQLite keeps the whole database file in memory itself. You could even put the file on a RAM disk or similar for a speed bonus.

      You can still use the same code for the lookups, though, with minor modifications:

      1. Get a unique list of all chr.
      2. For each chr, retrieve all entries (using retrieve_hashref).
      3. Use the existing code (but work on hashrefs).
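The three steps above can be sketched with DBI; since `retrieve_hashref` isn't shown in the thread, DBI's `{ Slice => {} }` fetch stands in for it here, as it likewise returns one hashref per row (the `features` table below is hypothetical):

```perl
#!/usr/bin/perl
# Sketch of the three-step lookup loop against a hypothetical
# "features" table, using an in-memory database for the demo.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect( "dbi:SQLite:dbname=:memory:", "", "",
    { RaiseError => 1 } );
$dbh->do(
    "CREATE TABLE features (chr TEXT, start_pos INTEGER, end_pos INTEGER)");
$dbh->do("INSERT INTO features VALUES ('chr1', 100, 200)");
$dbh->do("INSERT INTO features VALUES ('chr1', 300, 400)");
$dbh->do("INSERT INTO features VALUES ('chr2', 500, 600)");

# 1. Get a unique list of all chr.
my @chrs = map { $_->[0] }
    @{ $dbh->selectall_arrayref('SELECT DISTINCT chr FROM features') };

for my $chr (@chrs) {
    # 2. For each chr, retrieve all entries as hashrefs.
    my $rows = $dbh->selectall_arrayref(
        'SELECT * FROM features WHERE chr = ?',
        { Slice => {} }, $chr );

    # 3. The existing per-chr code goes here, working on hashrefs.
    printf "%s: %d entries\n", $chr, scalar @$rows;
}
$dbh->disconnect;
```

Processing one chr at a time keeps only that chromosome's entries in memory, which is the main footprint win over loading the whole hash up front.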
