PerlMonks
There are mechanisms for locking tables, but they vary greatly from database to database. Locking will work; however, if you're already having load issues, it increases the database bottleneck by forcing every writer to wait, even the ones that would never have collided, which could be disastrous.
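As a rough sketch of what such locking looks like (MySQL-style syntax here; other databases use different statements, and the `orders` table name is just an illustration):

```sql
-- MySQL-style table lock; every other writer blocks until UNLOCK.
LOCK TABLES orders WRITE;
-- ... check for an existing row, then insert ...
UNLOCK TABLES;
```

Note that while the lock is held, all other sessions writing to the table stall, which is exactly the serialization cost described above.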
It would be better to have a unique set of keys. You might not be able to use a single unique column, but you can specify a composite unique key so that you only get, say, one valid entry per customer per day. A second attempt to insert will throw a SQL error, so make sure you wrap the call in an eval and catch it. Ensuring database integrity with the right keys, so you never get duplicate data where it would be inappropriate, is a major design goal of any db table.

Also, you may (or may not; it's hard to tell) be able to do some rearchitecting of your Perl layer to stop the race conditions. If load problems are causing things to back up, there are probably other issues to consider, and it's time to think about performance tuning and perhaps different methodologies to get your job done.

-- Kirby, WhitePages.com

In reply to Re: Locking a database connection
by kirbyk
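The unique-key-plus-eval approach above could be sketched like this. This is an illustrative example, not the poster's actual code: the `orders` table, its columns, and the SQLite backend are all assumptions made for the sake of a self-contained demo.

```perl
use strict;
use warnings;
use DBI;

# In-memory SQLite database for illustration; any RDBMS with
# unique constraints works the same way.
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
    { RaiseError => 1, PrintError => 0 });

# Composite unique key: at most one entry per customer per day.
$dbh->do(q{
    CREATE TABLE orders (
        customer_id INTEGER NOT NULL,
        order_date  TEXT    NOT NULL,
        note        TEXT,
        UNIQUE (customer_id, order_date)
    )
});

# Wrap the insert in an eval: a duplicate-key violation dies
# (RaiseError is on), and we catch it instead of crashing.
sub insert_order {
    my ($customer_id, $date, $note) = @_;
    my $ok = eval {
        $dbh->do(
            'INSERT INTO orders (customer_id, order_date, note) VALUES (?, ?, ?)',
            undef, $customer_id, $date, $note,
        );
        1;
    };
    return $ok ? 1 : 0;
}

print insert_order(42, '2024-01-15', 'first')  ? "inserted\n" : "duplicate\n";
print insert_order(42, '2024-01-15', 'second') ? "inserted\n" : "duplicate\n";
```

The second call hits the unique constraint, the driver throws, and the eval turns that into a clean "duplicate" result; the database itself guarantees integrity even if two processes race on the same insert.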