halfcountplus has asked for the
wisdom of the Perl Monks concerning the following question:
I am working on a site-search for someone that allows perl regexp searches. The site is not that big (10 GB), but it is still too much to actually be parsing all the pages on every search, so I have to extract all the content and put it in some kind of database.
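For the extraction step, a one-time walk of the tree with File::Find that slurps each page into memory is roughly what I have in mind. This is only a sketch -- the `$root` path, the file-extension filter, and the `slurp_tree` name are all placeholders, not the real site layout:

```perl
#!/usr/bin/perl
# Sketch: walk the document tree once and slurp each page into a hash,
# so a search never has to reopen hundreds of files.
use strict;
use warnings;
use File::Find;

sub slurp_tree {
    my ($root) = @_;
    my %content;                        # path => full page text
    find(sub {
        return unless -f && /\.(html?|txt)$/;   # assumed page types
        open my $fh, '<', $_ or return;         # skip unreadable files
        local $/;                               # slurp whole file at once
        $content{$File::Find::name} = <$fh>;
        close $fh;
    }, $root);
    return %content;
}
```

The hash keys keep the full paths, so whatever storage format I settle on can carry the path along with each page's text.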
My first thought was to use SQL, which I presume will work fine. I am positive that the "expense" of the operation is not the volume of data searched -- it's having to parse through the directory tree, opening and closing hundreds of files. One giant SQL table would probably be much faster.
HOWEVER, there would not be much further purpose to using an SQL db, since I want to use the perl regexp engine on the data, not SQL-type queries (i.e., I would just be going straight through the table pulling all the data anyway).
So, I could just put ALL the text into one flat file, with tag lines to indicate paths. This would be just as quick as -- or quicker than -- using SQL (I think). Another option would be to use Storable or YAML.
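To make the flat-file idea concrete, here is the kind of thing I mean -- each page written under a tag line carrying its path, and a search that streams the file once, remembering which page it is in. The `%%PAGE%%` marker is made up; any string guaranteed not to occur in page content would do:

```perl
#!/usr/bin/perl
# Sketch of the flat-file layout: tag lines mark page boundaries,
# and a single sequential scan applies the user's regex to every line.
use strict;
use warnings;

# Build the flat file from a path => text hash.
sub write_index {
    my ($file, %pages) = @_;
    open my $out, '>', $file or die "can't write $file: $!";
    for my $path (sort keys %pages) {
        print $out "%%PAGE%% $path\n$pages{$path}\n";
    }
    close $out;
}

# Return the paths of pages whose text matches the given regex.
sub search_index {
    my ($file, $re) = @_;
    my (%hits, $current);
    open my $in, '<', $file or die "can't read $file: $!";
    while (<$in>) {
        if (/^%%PAGE%% (.+)$/) { $current = $1; next }
        $hits{$current} = 1 if defined $current && /$re/;
    }
    close $in;
    return sort keys %hits;
}
```

Storable or YAML would replace `write_index`/`search_index` with a serialized hash, at the cost of loading the whole structure before matching rather than streaming line by line.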
Surely some of you around here have written search engines like this. Any opinions? My intuition tells me that the flat file will actually be fastest, but I really don't know.