http://www.perlmonks.org?node_id=1055120


in reply to Re^2: DBI::SQLite slowness
in thread DBI::SQLite slowness

Well, 200 million records at a rate of 2000 per second is still 100,000 seconds, or almost 28 hours. That's still pretty long, isn't it? Having said that, you may be able to live with that; a full day of processing is still manageable in a number of cases. Beware, though, that the rate might slow down as your database grows larger.

If you are really only looking to filter out duplicates, the ideas discussed by BrowserUk are probably much better than using a database.
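For dedup without a database, the usual Perl idiom is to track keys already seen in a hash, which works as long as the set of distinct keys fits in memory. A minimal sketch (the record values here are made up for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Filter duplicates with an in-memory hash: %seen maps each key to a
# count, and a record passes through only the first time it appears.
my @records = qw(apple banana apple cherry banana);

my %seen;
my @unique = grep { !$seen{$_}++ } @records;

print join(",", @unique), "\n";   # apple,banana,cherry
```

In a real run you would read the 200 million records line by line from a file handle instead of holding them in an array, keeping only %seen in memory.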