Concurrency, Rails and PostgreSQL vs. MySQL

Let’s say we have two users concurrently performing the following queries, each inside a transaction (example taken directly from the PostgreSQL site):

User A:

BEGIN; UPDATE accounts SET balance = balance + 100.00 WHERE acctnum = 12345; … COMMIT;

User B:

BEGIN; … UPDATE accounts SET balance = balance + 100.00 WHERE acctnum = 12345; … COMMIT;

Am I right in assuming that in the case of PostgreSQL the UPDATE query of User B would work with the updated version of the accounts row, whereas in the case of MySQL it would update the original one?

Both databases support multiple levels of transaction isolation. Read their docs to see how this affects concurrent transactions.
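For example, in PostgreSQL you can raise a single transaction to SERIALIZABLE like this (just a sketch; an Account model with a balance column is assumed, and note that MySQL wants the SET TRANSACTION statement issued before BEGIN rather than inside the transaction):

   Account.transaction do
     # PostgreSQL: must run before the first query of the transaction.
     Account.connection.execute("SET TRANSACTION ISOLATION LEVEL SERIALIZABLE")
     acct = Account.find(12345)
     acct.balance += 100.00
     acct.save!
   end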

If I use Rails’ optimistic locking feature, I would ensure database-independent behaviour, but would have to implement the handling of rolled-back transactions on my own, right?

You’d have to rescue ActiveRecord::StaleObjectError and do something sensible, like alert the user of a conflict or try to resolve it yourself.
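In code, that might look something like this (a sketch, assuming an Account model whose table has the lock_version column that optimistic locking relies on):

   begin
     acct = Account.find(12345)
     acct.balance += 100.00
     acct.save!   # raises StaleObjectError if another update won the race
   rescue ActiveRecord::StaleObjectError
     # The row changed underneath us: reload the fresh copy, then either
     # retry the update or report the conflict back to the user.
     acct.reload
   end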

Are there any best practices for concurrency control with rails out there?

Use optimistic locking when the chance of conflict is reasonably low (users hate to refill data and resolve conflicts).
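Optimistic locking only needs an integer lock_version column on the table, e.g. via a migration along these lines (names are just for illustration):

   class AddLockVersionToAccounts < ActiveRecord::Migration
     def self.up
       # ActiveRecord checks and bumps this column on every update.
       add_column :accounts, :lock_version, :integer, :default => 0
     end

     def self.down
       remove_column :accounts, :lock_version
     end
   end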

Use pessimistic locking when concurrency is high and you want to read a bunch of records and update them in the same transaction.
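A pessimistic version of the account example might look like this (a sketch; :lock => true makes the find issue SELECT … FOR UPDATE):

   Account.transaction do
     # Other writers block on this row until we commit.
     acct = Account.find(12345, :lock => true)
     acct.balance += 100.00
     acct.save!
   end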

The best way is to try to avoid locking altogether.

Best, jeremy

Jeremy Kemper wrote:

Both databases support multiple levels of transaction isolation. Read their docs to see how this affects concurrent transactions.

Well, actually I did read the docs before asking; I asked the question to make sure I understood them correctly.

The best way is to try to avoid locking altogether.

What would be the best way to do this?

I have to make a counter that displays the number of views of a certain page. It has to be absolutely accurate, even when multiple users access the page (thus incrementing the counter) simultaneously.

I could, of course, organise the counter as a separate table, with each view adding a new row containing the page_id of the accessed page, but wouldn't adding hundreds or thousands of new rows just to measure the number of views be sort of overkill?

At least from reading the MySQL/PostgreSQL docs, it seemed to me that using PostgreSQL would get rid of the problem altogether thanks to MVCC.

Alexei

You won’t have trouble in either database. Your best bet is simply ‘update pages set hits = hits + 1 where id = …’. This will obtain an exclusive row lock to do the update.
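If you’d rather stay in ActiveRecord than write that SQL yourself, increment_counter issues essentially the same atomic statement (a sketch; page_id is whatever id you have on hand):

   # Runs roughly: UPDATE pages SET hits = hits + 1 WHERE id = <page_id>
   Page.increment_counter(:hits, page_id)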

If you’re doing page = Page.find(…); page.hits += 1; page.save, you need to wrap it in a serialized transaction or lock the page record with Page.find(…, :lock => true).
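Spelled out, the locking variant of that read-modify-write would be something like (page_id is a placeholder):

   Page.transaction do
     # :lock => true issues SELECT … FOR UPDATE, so concurrent
     # incrementers queue up instead of clobbering each other.
     page = Page.find(page_id, :lock => true)
     page.hits += 1
     page.save!
   end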

jeremy

If you update that row via SQL directly, rather than through ActiveRecord, you can avoid the problem by issuing an atomic update statement such as:

   update table set counter_column = counter_column + 1

This will avoid the external read-modify-write cycle entirely.
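From ActiveRecord you could issue that same atomic statement with update_all, for example (a sketch; page_id is a placeholder):

   # Single atomic statement, no read-modify-write in Ruby:
   # UPDATE pages SET hits = hits + 1 WHERE id = <page_id>
   Page.update_all("hits = hits + 1", ["id = ?", page_id])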