http://www.perlmonks.org?node_id=651120


in reply to DBD::CSV eats memory

A third option would be DBM::Deep, which is designed to handle a million rows. There's no SQL interface to it ... you're more than welcome to build one. I'll even help. :-)
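
For a sense of the interface, basic DBM::Deep usage looks roughly like this (a minimal sketch; the file name and data are made up):

  use strict;
  use warnings;
  use DBM::Deep;

  # Open (or create) the on-disk database file; nested structures live on
  # disk, so only the pieces you touch are pulled into memory.
  my $db = DBM::Deep->new( "rows.db" );

  $db->{rows} ||= [];
  push @{ $db->{rows} }, { id => $_, name => "row$_" } for 1 .. 5;

  print scalar @{ $db->{rows} }, " rows stored\n";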

My criteria for good software:
  1. Does it work?
  2. Can someone else come in, make a change, and be reasonably certain no bugs were introduced?

Replies are listed 'Best First'.
Re^2: DBD::CSV eats memory
by jZed (Prior) on Nov 16, 2007 at 04:08 UTC
    I had DBM::Deep working with my DBD::DBM at one point but I dropped the ball on it. I think it would be fairly trivial to make them work together. I don't have the tuits to do more than offer suggestions, but if you or someone else wants to patch DBD::DBM, I'd be glad to kibitz and apply the patches.
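
    For comparison, SQL access through stock DBD::DBM looks roughly like this (a sketch along the lines of the module's synopsis; the table and data are made up, and pointing the DBM back end at DBM::Deep is exactly the part that would need the patch):

        use strict;
        use warnings;
        use DBI;

        # SQL over a plain DBM file; SDBM_File is one of the stock back ends.
        my $dbh = DBI->connect( 'dbi:DBM:', undef, undef,
            { dbm_type => 'SDBM_File', RaiseError => 1 } );

        $dbh->do("CREATE TABLE user ( user_name TEXT, phone TEXT )");
        $dbh->do("INSERT INTO user VALUES ('Fred Bloggs', '233-7777')");

        my $rows = $dbh->selectall_arrayref('SELECT * FROM user');
        print "@$_\n" for @$rows;
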
      If you want to go ahead and describe what needs to be done, I'd be willing to take a look at it.

        I think if you can make connect() work (i.e. get the right tie and lock), everything else should pretty much just work. Though it will probably not make use of many of your optimizations, it should at least support SQL access to DBM::Deep data.
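
        A very rough sketch of that "right tie and lock" piece a patched DBD::DBM might arrange (the file name and details are illustrative, not a working patch):

            use strict;
            use warnings;
            use DBM::Deep;

            # Tie the table hash to a DBM::Deep file and use the module's own
            # locking; ordinary hash operations then go straight to disk.
            my %table;
            my $db = tie %table, 'DBM::Deep', 'users.db';

            $db->lock;                  # DBM::Deep's flock-based lock
            $table{1} = 'foo';
            $db->unlock;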

      Double thanks :-)

Re^2: DBD::CSV eats memory
by perlmonkdr (Beadle) on Nov 16, 2007 at 14:26 UTC

    Right now, I need to use only a DBI driver, for compatibility reasons.

    Thanks a lot