http://www.perlmonks.org?node_id=482667


in reply to Locking a database connection

There are mechanisms for locking tables; however, they vary greatly from database to database. Locking will work, but if you're already having load issues it worsens the database bottleneck by forcing every writer to wait, even the ones that would never have collided, which could be disastrous.
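For what it's worth, here's a minimal sketch of the lock-everything approach. The syntax is the part that varies: MySQL uses LOCK TABLES ... WRITE, Oracle uses LOCK TABLE ... IN EXCLUSIVE MODE, and so on. This sketch uses an in-memory SQLite database via DBD::SQLite (where any write transaction locks the whole database file, the coarsest lock there is), and the table and column names are made up for illustration:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# In-memory SQLite database, just for demonstration.
my $dbh = DBI->connect("dbi:SQLite:dbname=:memory:", "", "",
                       { RaiseError => 1, AutoCommit => 1 });

$dbh->do(q{
    CREATE TABLE entries (
        customer_id INTEGER NOT NULL,
        entry_date  TEXT    NOT NULL
    )
});

# Check-then-insert inside one transaction. In SQLite the write
# locks the whole database; with MySQL you would bracket this with
# "LOCK TABLES entries WRITE" / "UNLOCK TABLES" instead. Either
# way, every other writer waits, whether or not it would collide.
$dbh->begin_work;
my ($seen) = $dbh->selectrow_array(
    "SELECT COUNT(*) FROM entries WHERE customer_id = ? AND entry_date = ?",
    undef, 42, '2005-08-12',
);
$dbh->do("INSERT INTO entries (customer_id, entry_date) VALUES (?, ?)",
         undef, 42, '2005-08-12') unless $seen;
$dbh->commit;

print "rows: ", $dbh->selectrow_array("SELECT COUNT(*) FROM entries"), "\n";
```

Note that the protection only works if *every* writer goes through the same lock-check-insert dance; one code path that inserts without taking the lock and you're back to racing.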

It would be better to use a unique key. You might not be able to make any single column unique, but you can declare a composite unique key over several columns, so that you only get, say, one valid entry per customer per day. A second attempt to insert the same combination will raise a SQL error, so make sure you wrap the call in an eval and catch it. Ensuring database integrity with the right keys, so you never get duplicate rows where they would be inappropriate, is a major design goal for any database table.
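Here's a sketch of the unique-key approach, again with a made-up schema and an in-memory SQLite database via DBD::SQLite (the same UNIQUE (...) clause works in MySQL, PostgreSQL, etc.). The database rejects the duplicate atomically, so no application-level locking is needed:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("dbi:SQLite:dbname=:memory:", "", "",
                       { RaiseError => 1, PrintError => 0 });

# Composite unique key: at most one entry per customer per day.
$dbh->do(q{
    CREATE TABLE entries (
        customer_id INTEGER NOT NULL,
        entry_date  TEXT    NOT NULL,
        data        TEXT,
        UNIQUE (customer_id, entry_date)
    )
});

my $sth = $dbh->prepare(
    "INSERT INTO entries (customer_id, entry_date, data) VALUES (?, ?, ?)"
);

$sth->execute(42, '2005-08-12', 'first');   # succeeds

# With RaiseError on, the duplicate insert dies; catch it in an eval
# and treat it as "someone beat us to it" rather than a fatal error.
eval {
    $sth->execute(42, '2005-08-12', 'second');
};
if ($@) {
    print "duplicate rejected: $@";
}
```

The exact error text is driver-specific, so match on it loosely (or check $dbh->err) rather than hard-coding one database's message.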

Also, you may (or may not, it's hard to tell) be able to do some rearchitecting of your Perl layer to stop the race conditions. If load problems are causing things to back up, there are probably other issues to consider, and it's time to think about performance tuning and perhaps different methodologies to get your job done.

-- Kirby, WhitePages.com

Re^2: Locking a database connection
by cosmicperl (Chaplain) on Aug 12, 2005 at 04:29 UTC
    Thanks guys. That has got me thinking. I am currently updating the whole program (over 15,000 lines of code) with speed efficiency in mind. The unique-key solution is an obvious one I didn't think of. Thank you.
    At the moment I'm also devising a way for old data to be stored in a different database, or at least in different tables. That way new data input can be kept at optimum performance, and old data that isn't referenced anywhere near as often won't get in the way.
    Links to good Perl database performance tuning would be much appreciated.

    Lyle