http://www.perlmonks.org?node_id=11157419

Bod has asked for the wisdom of the Perl Monks concerning the following question:

I'm writing an XML Sitemap generator based around WWW::Crawl.

I want to record the priority to set for each entry in the sitemap. My first thought was to use a CSV or similar text file, but it could become huge and cumbersome. So what are the alternatives?

I could write this server-side, where I have a MariaDB instance running, so storage is no problem. But I'm thinking I want to run it client-side, although I don't really know why. So my choices seem to be: hold the data in a Storable object; run MariaDB, MySQL or similar locally; or use DBD::SQLite from within Perl. No doubt there are other choices...
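For reference, the Storable option is only a few lines. A minimal sketch of what I have in mind (the file name and the example URL/priority are just placeholders):

```perl
use strict;
use warnings;
use Storable qw(nstore retrieve);

my $file = 'priorities.stor';    # placeholder file name

# Load any previously saved hash of URL => priority
my $priority = -e $file ? retrieve($file) : {};

# Record a priority for a crawled page
$priority->{'https://example.com/'} = 0.8;

# nstore writes in network byte order, so the file is
# portable between machines of different endianness
nstore($priority, $file);
```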

Which would you do and why?

What would you definitely avoid doing and why?

Replies are listed 'Best First'.
Re: Persistent data
by GrandFather (Saint) on Feb 01, 2024 at 02:49 UTC

    I'd probably go the SQLite route. There is almost no setup or maintenance required.
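    Something along these lines would do it (table and column names are my own invention, not anything WWW::Crawl prescribes):

    ```perl
    use strict;
    use warnings;
    use DBI;

    # The database file is created on first connect - no server setup
    my $dbh = DBI->connect('dbi:SQLite:dbname=sitemap.db', '', '',
        { RaiseError => 1, AutoCommit => 1 });

    $dbh->do(q{
        CREATE TABLE IF NOT EXISTS priority (
            url      TEXT PRIMARY KEY,
            priority REAL NOT NULL DEFAULT 0.5
        )
    });

    # Insert or update the priority for a crawled URL
    my $set = $dbh->prepare(
        'INSERT OR REPLACE INTO priority (url, priority) VALUES (?, ?)');
    $set->execute('https://example.com/', 0.8);

    # Look a priority back up when writing the sitemap
    my ($p) = $dbh->selectrow_array(
        'SELECT priority FROM priority WHERE url = ?', undef,
        'https://example.com/');
    print "priority: $p\n";
    ```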

    Optimising for fewest keystrokes only makes sense when transmitting to Pluto or beyond
Re: Persistent data
by stevieb (Canon) on Feb 01, 2024 at 06:02 UTC

    If the application/library requires Internet access to work properly, I'd store it server-side. I mean, if net access is needed, will the data be of any use sitting on the client when there is no access? A DB sounds like the best bet, and server-side means no software needs to be installed on the client.

    If you want to keep it client side and are going to serialize it instead of using a DB, I'd go with JSON over Storable. It's cross-platform, you don't need to worry about version conflicts with Storable, and it's human-readable and editable with a text editor. It's also a lot easier to fix a corrupt JSON file than it is a Storable one.
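    The JSON version is barely more code than the Storable one. A rough sketch using core JSON::PP (the file name is just an example; JSON::XS is a drop-in speedup if it's installed):

    ```perl
    use strict;
    use warnings;
    use JSON::PP qw(decode_json);    # core module since Perl 5.14

    my $file = 'priorities.json';    # placeholder file name

    # Load existing data, or start fresh
    my %priority;
    if (-e $file) {
        open my $in, '<', $file or die "read $file: $!";
        local $/;                    # slurp mode
        %priority = %{ decode_json(<$in>) };
        close $in;
    }

    $priority{'https://example.com/'} = 0.8;

    # pretty + canonical keeps the file hand-editable and diffable
    open my $out, '>', $file or die "write $file: $!";
    print {$out} JSON::PP->new->pretty->canonical->encode(\%priority);
    close $out;
    ```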