I like your idea - it's well thought out and simple. I was thinking it would be good to have the data going to two servers rather than one - double redundancy in case one goes down. That way you have one set of data for analysis, one for backup, and one untouched copy that could be used to resolve vote discrepancies.
For the analysis end, you would do it in duplicate again. If there's a discrepancy between the two totals, you go back to the untouched server and check that data.
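A minimal sketch of that duplicate-count check, assuming each ballot is just a candidate name (tally and totals_match are made-up names for illustration; in practice the second tally would come from the other server):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Count votes per candidate from a list of ballots.
sub tally {
    my ($votes) = @_;
    my %totals;
    $totals{$_}++ for @$votes;
    return \%totals;
}

# Return true only if both tallies have identical candidates and counts.
sub totals_match {
    my ($a, $b) = @_;
    return 0 unless keys %$a == keys %$b;
    for my $cand (keys %$a) {
        return 0 unless defined $b->{$cand} && $a->{$cand} == $b->{$cand};
    }
    return 1;
}

my @ballots = qw(smith jones smith);
my $first  = tally(\@ballots);
my $second = tally(\@ballots);   # in practice, computed on the second server

if (totals_match($first, $second)) {
    print "totals agree\n";
} else {
    print "discrepancy - check the untouched server\n";
}
```

If the two totals disagree, neither count is trusted and the untouched copy becomes the arbiter.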
How do you deal with the process of making sure that the machines have not been tampered with, that the scripts used to count the votes are not altered, etc.? Perl would work well here - you can examine the script at any time, unlike a compiled program.
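Along those lines, one small sketch of a tamper check: have the counting script hash its own source with the core Digest::SHA module, so anyone can compare the digest against a value published before the election (the published digest itself is assumed here, not shown):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Digest::SHA qw(sha256_hex);

# The script hashes its own source file ($0). The digest can be
# compared to a value published ahead of time, so observers can
# verify the script that ran is the script that was reviewed.
my $source = do {
    open my $fh, '<', $0 or die "can't read $0: $!";
    local $/;            # slurp mode: read the whole file at once
    <$fh>;
};
my $digest = sha256_hex($source);
print "script digest: $digest\n";

# In practice you would then compare against the published value, e.g.:
# die "script has been altered!\n" unless $digest eq $published_digest;
```

This doesn't stop tampering by itself, but it makes alteration detectable, which is the point of using a readable script in the first place.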
yet another biologist hacking perl....