http://www.perlmonks.org?node_id=518463

diego_de_lima has asked for the wisdom of the Perl Monks concerning the following question:

Hi Monks,

My application: I have a web application using Linux/Apache/Postgres/Perl. This application performs CRUD operations on many DB tables, but some of them are very large tables (50 to 100 columns) and are often updated by many people (the same row, updated many times).

My problem: when more than one user opens an update form on the same row, they don't know that someone else is also editing that row. So when the first one submits the form, the second still has the old data on screen, and when he submits it, the data the first one submitted is completely lost! And this is very, very bad.

It's not a problem of table or row LOCKING, but simply of signaling the second user that the row is already open for update and he's going to have to wait a few minutes. Just like opening the same file on a network share: the second user can only read the file.

Some solutions:

1. At the end of the page, run a loop while the page is open, keeping a lock file somewhere that a second process trying to open the same row can see, so it can alert its user. Pros: no daemon needed. Cons: I don't want a big Apache/mod_perl process looping, dedicated to one user for minutes...

2. Use AJAX to call a CGI every 10 seconds to keep the lock file fresh (see the sketch after this list). Pros: no daemon needed. Cons: too many requests, and the 10-second polling is going to be a pain to keep working reliably.

3. Create a simple/light Perl HTTP daemon. The HTML form has a hidden IFRAME that connects to it like a web chat application, touching a lock file and deleting it at the end (also sketched below). Pros: a simple HTTP daemon is fast and very lightweight. Cons: a simple HTTP daemon may be very insecure, and it needs another port to run on, which some firewalls will block.
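
For option 2, the server side of the keep-alive call would be roughly the sketch below. It is only a sketch: the lock directory, the parameter names and the 30-second staleness window are placeholders I made up, and it ignores the race between the stat and the open. The same script could also be called once when the form is first opened, to decide whether to warn the user that the row is already in use.

#!/usr/bin/perl
use strict;
use warnings;
use CGI;

# Keep-alive CGI for option 2. The AJAX call hits this every 10 seconds
# with ?table=...&id=...&user=...; the lock directory and the 30-second
# staleness window are arbitrary placeholders.
my $q        = CGI->new;
my $lock_dir = '/var/lock/myapp';
my ($table, $id, $user) = map { scalar $q->param($_) } qw(table id user);

# keep the lock file name safe
for ($table, $id, $user) {
    die "bad parameter\n" unless defined $_ and $_ =~ /^\w+$/;
}

my $lock = "$lock_dir/$table.$id.lock";

if (-e $lock && time - (stat $lock)[9] < 30) {
    # a live lock exists: see who owns it
    open my $fh, '<', $lock or die "cannot read $lock: $!";
    chomp(my $owner = <$fh>);
    close $fh;
    if ($owner ne $user) {
        # someone else is editing: the browser should warn the user
        print $q->header('text/plain'), "locked by $owner\n";
        exit;
    }
    utime undef, undef, $lock;    # it's ours: refresh the mtime
}
else {
    # no lock, or a stale one: (re)create it for this user
    open my $fh, '>', $lock or die "cannot write $lock: $!";
    print {$fh} "$user\n";
    close $fh;
}

print $q->header('text/plain'), "ok\n";

Releasing the lock is then just a matter of deleting the file when the user saves or cancels, or letting it go stale after 30 seconds.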
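
And for option 3, the daemon itself would not need to be much more than this (again just a sketch: the port, lock directory and URL layout are invented, it serves one connection at a time where a real one would have to fork per client, and there is no security at all yet, which is exactly what worries me):

#!/usr/bin/perl
use strict;
use warnings;
use HTTP::Daemon;
use HTTP::Status;

# Bare-bones lock daemon for option 3. The hidden IFRAME requests
# /lock?row=table.id and keeps the connection open for as long as the
# form is on screen; the lock file lives only while that lasts.
my $lock_dir = '/var/lock/myapp';
my $d = HTTP::Daemon->new(LocalPort => 8081, ReuseAddr => 1)
    or die "cannot start daemon: $!";

while (my $c = $d->accept) {
    if (my $r = $c->get_request) {
        my ($row) = ($r->uri->query || '') =~ /^row=([\w.]+)$/;
        if ($r->method eq 'GET' and $r->uri->path eq '/lock' and $row) {
            my $lock = "$lock_dir/$row.lock";
            open my $fh, '>', $lock or die "cannot create $lock: $!";
            close $fh;

            # Hold the connection open like a chat page: keep touching the
            # lock and poking the IFRAME until the browser closes the page,
            # which eventually makes the print fail and ends the loop.
            $c->send_basic_header(RC_OK);
            print $c "Content-Type: text/plain\r\n\r\n";
            local $SIG{PIPE} = 'IGNORE';
            while (print $c "editing\n") {
                utime undef, undef, $lock;    # refresh the lock's mtime
                sleep 10;
            }
            unlink $lock;    # client is gone: release the row
        }
        else {
            $c->send_error(RC_NOT_FOUND);
        }
    }
    $c->close;
    undef $c;
}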

So, finally, what are you people using to do this kind of thing? I'm seriously thinking of going with the third option...

Diego de Lima