Testing database updates
by chrestomanci (Priest) on Apr 18, 2012 at 08:49 UTC

chrestomanci has asked for the wisdom of the Perl Monks concerning the following question:
Greetings wise brothers, I seek your wisdom in the matter of preserving order and correctness, and preventing mistakes.
I am working on re-factoring and tidying a log parsing script. The script reads a custom log file, extracts relevant information, and then writes summary information into a database and a number of text and XML report files.
I need to make sure that the new version makes the same overall database changes as the old. I don't care about the order of inserts, nor about any reads; just that the overall state at the end is what it should be.
The idea I had for my test script was to dump the database to text after running the script under test and compare it with the expected output, or better, to somehow ingest the whole database into a big Perl data structure and then use is_deeply() from Test::More to compare the actual database state with the expected one.
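A minimal sketch of that is_deeply() idea, using only core Test::More. Everything here is hypothetical illustration: the `jobs` table, its rows, and the snapshot_rows() helper are invented, and with a real database the snapshot would come from DBI (e.g. selectall_arrayref with a deterministic ORDER BY) rather than hard-coded lists:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Test::More;

# Hypothetical helper: normalise a table's rows into a canonical order
# so that differences in insert order between old and new script are
# ignored.  With a real DBI handle this would instead be something like:
#   $snapshot{$table} = $dbh->selectall_arrayref(
#       "SELECT * FROM $table ORDER BY id", { Slice => {} } );
sub snapshot_rows {
    my @rows = @_;
    return [ sort { $a->{id} <=> $b->{id} } @rows ];
}

# Simulated final state after running the OLD script ...
my %expected = (
    jobs => snapshot_rows(
        { id => 2, status => 'failed' },
        { id => 1, status => 'passed' },
    ),
);

# ... and after running the NEW script, with a different insert order.
my %actual = (
    jobs => snapshot_rows(
        { id => 1, status => 'passed' },
        { id => 2, status => 'failed' },
    ),
);

# Because both snapshots are sorted, is_deeply() compares final state
# only, not the order in which rows were written.
is_deeply( \%actual, \%expected,
    'refactored script leaves the database in the same state' );

done_testing();
```

The key design point is the canonical sort: is_deeply() compares arrays element by element, so each table's rows must be put into a deterministic order before the comparison.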
Is there a more efficient way of doing this? Is there already a module on CPAN for this purpose, or one that stores everything in a big Perl hash? I did a search for "perl test database" and found modules such as Test::MockDBI and DBD::Mock, which appear to exist to create error conditions for database testing rather than to record what DB changes are made.
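For what it is worth, DBD::Mock can record what statements a script executes, via its mock_all_history attribute, which may be closer to "record what DB changes are made" than it first appears. A small sketch, assuming the CPAN modules DBI and DBD::Mock are available (the `summary` table and its columns are invented for illustration):

```perl
#!/usr/bin/perl
use strict;
use warnings;

BEGIN {
    # This demo needs the CPAN modules DBI and DBD::Mock; bail out
    # gracefully if they are not installed.
    eval 'use DBI; use DBD::Mock; 1'
        or do { print "DBI/DBD::Mock not installed, skipping\n"; exit 0 };
}

# A mock handle accepts any SQL without a real database behind it.
my $dbh = DBI->connect( 'dbi:Mock:', '', '', { RaiseError => 1 } );

# Pretend this is the write the log-parsing script performs.
$dbh->do( 'INSERT INTO summary (run, passed) VALUES (?, ?)',
    undef, 'run_42', 17 );

# mock_all_history returns one record per statement executed, with the
# SQL text and the bound parameters, so the old and new scripts' writes
# can be collected and compared.
my $history = $dbh->{mock_all_history};
for my $st (@$history) {
    printf "%s  <-  (%s)\n", $st->statement,
        join( ', ', @{ $st->bound_params } );
}
```

Comparing two such histories directly would still be order-sensitive, though, so for the stated goal (same final state, order ignored) the snapshot-and-is_deeply approach is probably the better fit.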
Some background: the current log script is a horrible mass of spaghetti with a lot of technical debt, and every time features are added (frequently) bugs are introduced and time is wasted. Naturally there is no specification for what the script should be doing other than the source code and the occasional comment. I am looking to refactor the code by breaking it up into a number of smaller, well-specified modules, each with tests, but I need to make sure that the new version produces the same reports as the old.