http://www.perlmonks.org?node_id=890394


in reply to Reducing memory footprint when doing a lookup of millions of coordinates

Have you considered using a database instead of loading it all into memory? It may not be as fast as it is now (except for the loading time), but the memory footprint IMHO can be heavily reduced.
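As a minimal sketch of the idea: with DBD::SQLite you keep the features on disk and let an index answer the coordinate lookups, so only the matching rows are pulled into memory. The table layout (chr/start/end/name columns) and the file name `features.db` are assumptions, not the OP's actual schema.

```perl
use strict;
use warnings;
use DBI;

# Assumed schema: one row per feature, keyed by chromosome and position.
my $dbh = DBI->connect( "dbi:SQLite:dbname=features.db", "", "",
    { RaiseError => 1, AutoCommit => 1 } );

$dbh->do(q{
    CREATE TABLE IF NOT EXISTS features (
        chr   TEXT    NOT NULL,
        start INTEGER NOT NULL,
        end   INTEGER NOT NULL,
        name  TEXT
    )
});

# The index is what keeps lookups fast without loading the whole file.
$dbh->do(q{CREATE INDEX IF NOT EXISTS idx_chr_start ON features (chr, start)});

# Find features overlapping a single coordinate; only the matching
# rows ever come into memory, not the whole table.
my $sth = $dbh->prepare(
    q{SELECT name FROM features WHERE chr = ? AND start <= ? AND end >= ?});
$sth->execute( 'chr1', 12345, 12345 );
while ( my ($name) = $sth->fetchrow_array ) {
    print "$name\n";
}
```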


Replies are listed 'Best First'.
Re^2: Reducing memory footprint when doing a lookup of millions of coordinates
by richardwfrancis (Beadle) on Feb 27, 2011 at 12:15 UTC
    Hi mellon85 and moritz,

    Thank you for your replies.

    I definitely like the idea of the database. I think this will work very well in the context of the rest of the program I'm writing.

    I'm thinking I could distribute the SQLite database file, preloaded with the feature data and indexed, with the software and query it when needed!

    Many thanks,

    Rich

The database is a really good idea. I doubt that SQLite keeps the whole database file in memory itself. You could even put the file on a RAM disk or similar for a speed bonus.

You can still use the same lookup code, with minor modifications:

1. Get a unique list of all chr.
2. For each chr, retrieve all entries (using retrieve_hashref).
3. Use your existing code (but have it work on hashrefs).
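The three steps above might look like this with plain DBI; `retrieve_hashref` in the original post presumably refers to something like DBI's `fetchrow_hashref`, which is what this sketch uses, and `process_feature` stands in for the existing lookup code:

```perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect( "dbi:SQLite:dbname=features.db", "", "",
    { RaiseError => 1 } );

# 1. Get a unique list of all chr.
my $chrs = $dbh->selectcol_arrayref(q{SELECT DISTINCT chr FROM features});

# 2. For each chr, retrieve its entries one hashref at a time.
my $sth = $dbh->prepare(q{SELECT * FROM features WHERE chr = ?});
for my $chr (@$chrs) {
    $sth->execute($chr);
    while ( my $row = $sth->fetchrow_hashref ) {
        # 3. Hand each row to the existing code, now working on
        #    hashrefs like { chr => ..., start => ..., end => ..., name => ... }.
        # process_feature($row);   # hypothetical hook into the OP's code
    }
}
```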