PerlMonks
My current hobby project is an AIM bot that can be used with our service request system to track tickets and request queues. It uses a forking dispatcher that responds to user questions, scheduler events, and socket events. All actions are plugins that can be added, unloaded, and reloaded by the administrators. Fun.

I'm currently tackling the configuration management. There's user data (what timezone, what level of access, which queues, which tickets, etc.), queue data, ticket data, and so on. Initially I just used a YAML file that I loaded at startup and wrote to whenever something changed. Then, when the dispatcher became forking, I added file locking. Then, because the tickets and queues change more frequently than user data, I added more files. Hmmmm. That's beginning to look like a job for a database.

So I switched to DBD::SQLite2 and Class::DBI::Cacheable. Easy locking, still instant access to data, and very scalable. But now, instead of simple hash lookups, I need to think in RDB terms. There is the overhead of maintaining a database schema. Class::DBI has its quirks and is not really very small. The same goes for DBI and DBD. By the way, did I mention I only expect a handful of users, a dozen or so queues, and at most a couple of dozen tickets to be tracked at the same time?

So, I sat back and thought it over. Am I going completely over the top with my design? Definitely. Is that a bad thing? Hmmmm. Speed is not an absolute requirement. SQLite is not a huge resource hog. And who knows, if the bot works as I designed, it may become more popular than I expect; that has happened at times with other tools I made. At least I now have the flexibility to change the underlying storage to suit my needs. So I decided to continue with Class::DBI.

In reply to Over-designing....or not? by redlemon
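For what it's worth, a minimal sketch of the kind of setup described above, using plain Class::DBI over DBD::SQLite2 (Class::DBI::Cacheable would be layered on in the same way). The database filename, table, and column names here are assumptions for illustration, not the actual schema:

```perl
# Base class holding the shared connection. SQLite keeps everything
# in one file and handles locking itself, so no more flock() on YAML.
package Bot::DBI;
use base 'Class::DBI';
Bot::DBI->connection('dbi:SQLite2:dbname=bot.db', '', '');

# One subclass per table; Class::DBI generates accessors from columns().
package Bot::Ticket;
use base 'Bot::DBI';
Bot::Ticket->table('tickets');
Bot::Ticket->columns(All => qw/id queue owner status/);

package main;
# The hash lookups become ORM calls:
my @open = Bot::Ticket->search(status => 'open');
printf "%s owned by %s\n", $_->id, $_->owner for @open;
```

The upside the post describes is exactly this: swapping DBD::SQLite2 for another DBD later only changes the `connection()` line, not the lookup code.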