Pessimistic locking doesn't seem to work with MySQL?

Hopefully I'm doing something stupid here. I was trying to use pessimistic locking on a Rails 2.3.4 project, and it just doesn't seem to work. The issue shows up on both OS X and Linux.

I've created a simplified reproducible case against 2.3.10 for discussion purposes. I'll publish the full transcript of the steps I took to create the project later, but for now: I've got a single model Foo with a single string attribute bar. I created one Foo and saved it.

In one console window I run:

Foo.transaction do
  f = Foo.find(1, :lock => true)
  1.upto(10000000) { |i| puts i }
end

I was under the impression I wouldn't be able to access that row at all until it finished printing out ten million numbers. But while that is running, I can go to another console window and read it:

johnmudhead:locktest grant$ script/console
Loading development environment (Rails 2.3.10)

f = Foo.find(1)

=> #<Foo id: 1, bar: "AAAAAAAAAAAAAAA!", created_at: "2010-11-01 19:50:58", updated_at: "2010-11-01 19:50:58">

That seems like I'm reading the row when I should be locked out of it.

I have confirmed that I'm using the InnoDB backend on MySQL, which supports row-level locking.

Anyone have any advice or tips? Do I have a gross misunderstanding of pessimistic locks in Ruby?

A full log of my reproducible case follows...

Thanks,

Grant Olson wrote in post #958678:


Anyone have any advice or tips? Do I have a gross misunderstanding of pessimistic locks in Ruby?

I think you do. If I remember correctly, :lock doesn't do any actual DB-level locking -- it just sets a lock field that ActiveRecord expects. Transactions, OTOH, *do* do DB-level locking, but shouldn't introduce read locks for the sort of operations you're doing. And you're not doing any DB writes here.

Best,

:lock is supposed to do row-level locking via the DB facilities:

http://api.rubyonrails.org/classes/ActiveRecord/Locking/Pessimistic.html

I believe it's optimistic locking that uses the lock_version column in Active Record, and raises an ActiveRecord::StaleObjectError if the row has been modified in the meantime.
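For contrast, here's a toy pure-Ruby sketch of how that optimistic scheme behaves (the names save, DB, and StaleObjectError are stand-ins for illustration, not ActiveRecord's actual implementation): each row carries a lock_version counter, and saving a copy whose counter is stale fails.

```ruby
# Toy sketch of optimistic locking -- not ActiveRecord itself.
class StaleObjectError < StandardError; end

Row = Struct.new(:id, :bar, :lock_version)

DB = { 1 => Row.new(1, "AAAA!", 0) }

def save(copy)
  current = DB[copy.id]
  raise StaleObjectError if current.lock_version != copy.lock_version
  copy.lock_version += 1   # bump the counter on a successful save
  DB[copy.id] = copy
end

a = DB[1].dup   # two sessions each load the same row at version 0
b = DB[1].dup

save(a)         # fine: versions match, counter bumps to 1
begin
  save(b)       # b still carries version 0, so it's stale
rescue StaleObjectError
  puts "second save is stale"
end
```

Nobody blocks here; the conflict is only detected at save time, which is exactly the opposite trade-off from :lock.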

Anyway, we discovered what the problem was. So for the sake of the archives...

The row lock taken by :lock won't block a standard SELECT. It only blocks another SELECT ... FOR UPDATE (or a write) against the same row.
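As a rough illustration of the SQL involved, here's a pure-Ruby sketch of the mapping (the table name foos is assumed, and this is not ActiveRecord's actual finder code):

```ruby
# Sketch: how the :lock option maps onto the generated SQL.
def find_sql(id, options = {})
  sql = "SELECT * FROM foos WHERE id = #{Integer(id)}"
  case options[:lock]
  when true   then sql << " FOR UPDATE"            # exclusive row lock
  when String then sql << " #{options[:lock]}"     # e.g. "LOCK IN SHARE MODE"
  end
  sql
end

puts find_sql(1)                  # plain SELECT: not blocked by row locks
puts find_sql(1, :lock => true)   # SELECT ... FOR UPDATE: waits on the row lock
```

Per the Pessimistic docs linked above, :lock also accepts a raw SQL string like "LOCK IN SHARE MODE" if you want a shared rather than exclusive lock.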

So if I run this code in terminal one:

Foo.transaction do
  f = Foo.find(1, :lock => true)
  1.upto(10000000) { |i| puts i }
end

And this in terminal two while terminal one is spitting out numbers:

Foo.transaction do
  puts Foo.find(1, :lock => true).inspect
end

The second process will block, even for that read-only operation, until the transaction in the first process ends.
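That blocking behaviour can be mimicked in plain Ruby with a Mutex, as a rough analogy with no database involved (the thread bodies stand in for the two terminals; this is not what MySQL does internally):

```ruby
# Analogy: two "transactions" contending for the same row lock.
require 'thread'

row_lock = Mutex.new
events = []

t1 = Thread.new do
  row_lock.synchronize do    # terminal one: find(1, :lock => true)
    events << :t1_locked
    sleep 0.2                # stands in for the long counting loop
    events << :t1_done
  end
end

sleep 0.01 until row_lock.locked?   # wait until terminal one holds the lock

t2 = Thread.new do
  row_lock.synchronize do    # terminal two: blocks here until t1 commits
    events << :t2_locked
  end
end

[t1, t2].each(&:join)
p events  # => [:t1_locked, :t1_done, :t2_locked]
```

A plain read, by contrast, would be a thread that never touches row_lock at all, which is why the unlocked Foo.find(1) in the original transcript sailed straight through.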