elef has asked for the wisdom of the Perl Monks concerning the following question:
Dear fellow monks, I'm looking for some general guidance on how feasible a project idea of mine is.
What the software would need to do is: do lookups in a hell of a lot of text data and display the hits in a GUI. Fuzzy matching is not essential, but I would need plain string (word) search, exact term search, "all of these words in any order or position" search, "includes this string but doesn't include this string" search, and similar things. There are ~15 million records, each containing a couple of hundred UTF-8 characters spread across 4 or 5 fields; the data takes up about 8GB in a tab-separated text file. This wouldn't be a web service; the software would need to run on any random Windows computer.
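To make the requirements concrete, here is a minimal core-Perl sketch of just the query semantics (the function names and the plain-regex approach are mine, for illustration). A linear scan like this would be far too slow over 15 million records, but it pins down what "all of these words in any order" and "includes X but not Y" actually mean before picking a database or index to do the real work:

```perl
use strict;
use warnings;

# True if every word in @$words appears somewhere in $text, in any
# order or position (case-insensitive, whole-word match).
sub matches_all_words {
    my ($text, $words) = @_;
    for my $w (@$words) {
        return 0 unless $text =~ /\b\Q$w\E\b/i;
    }
    return 1;
}

# True if $text contains the substring $include but not $exclude.
sub includes_but_excludes {
    my ($text, $include, $exclude) = @_;
    return index($text, $include) >= 0 && index($text, $exclude) < 0;
}
```

For example, `matches_all_words("the quick brown fox", ["fox", "quick"])` is true regardless of word order.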
So, how difficult would this be for a relative novice to write? I have a reasonable handle on Perl itself, and I could write the GUI in Tk without too much trouble, but I don't know how involved the database side would be; I have exactly zero experience with databases. What sort of performance can I expect from whatever database engine I end up using? How long would it take to import 8GB of text into a database format, and how much space would it take up? Most importantly, how long would a lookup take on a run-of-the-mill laptop? And could the whole app (excluding the actual data, of course) be packaged up into a reasonably sized .exe with PAR::Packer?
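For a sense of scale, the import side can be sketched in a few lines, assuming SQLite via DBI/DBD::SQLite (the table and column names here are made up). Wrapping the inserts in one transaction instead of committing per row is the main trick for getting bulk loads into SQLite to run at a reasonable speed:

```perl
use strict;
use warnings;
use DBI;

# Read tab-separated lines from a filehandle and insert them into a
# hypothetical 5-column "records" table, all inside one transaction.
sub load_tsv {
    my ($dbh, $fh) = @_;
    $dbh->do("CREATE TABLE IF NOT EXISTS records (f1 TEXT, f2 TEXT, f3 TEXT, f4 TEXT, f5 TEXT)");
    my $sth = $dbh->prepare("INSERT INTO records VALUES (?, ?, ?, ?, ?)");
    $dbh->begin_work;
    my $n = 0;
    while (my $line = <$fh>) {
        chomp $line;
        my @fields = split /\t/, $line, 5;   # split into at most 5 fields
        push @fields, '' while @fields < 5;  # pad short records
        $sth->execute(@fields);
        $n++;
    }
    $dbh->commit;
    return $n;  # number of records loaded
}
```

Usage would be something like `my $dbh = DBI->connect("dbi:SQLite:dbname=records.db", "", "", { RaiseError => 1 });` followed by `load_tsv($dbh, $fh)` on the opened data file. For a real 8GB load you would also commit in batches of, say, 100k rows rather than holding one giant transaction.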
Thanks for any help. Also, if there already is an open-source lookup tool out there that I could adapt for the purpose, please let me know. Perhaps even LibreOffice Base could be used?