Controlling concurrency

What do you think is the best way to manage concurrency in this (made up) scenario?

1) You are selling Orders for items of various ProductVariants, belonging to one Product.
2) There may be an upper limit to how many Orders you can take for any given Product.
3) For our example today, you can only accept three more orders for any combination of ProductVariants of a Product.
4) Two users put in three Orders each, at the same time.

It's in the realm of millisecond possibility, then, for this to happen:

Person 1: Validate that 3 Orders can be made for Product 123 => true
Person 2: Validate that 3 Orders can be made for Product 123 => true
Person 1: Orders.create(:product_variant => 9); Orders.create(:product_variant => 8); Orders.create(:product_variant => 7)
Person 2: Orders.create(:product_variant => 6); Orders.create(:product_variant => 5); Orders.create(:product_variant => 4)

Oops! 6 Orders placed!
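
In code, the naive check-then-create is roughly this (a sketch; order_limit is my shorthand for the per-Product cap, and the method name is made up):

```ruby
# Both requests can pass the count check before either has inserted a row,
# which is exactly the interleaving shown above.
def place_orders(product, variant_ids)
  if product.orders.count + variant_ids.size <= product.order_limit
    variant_ids.each { |id| Order.create(:product_variant_id => id) }
  else
    raise "sold out"
  end
end
```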

What do you think is the best way to prevent overselling in this case? I've considered:

* Table locking (sucks)
* Setting a lock file per-Product (filesystem-y and not elegantly scalable past one machine)
* Setting a lock variable per-Product in a memcached store
* A single-worker queue that validates and processes orders in sequence
* A per-Product single-worker queue that validates and processes orders in sequence
* Something I haven't considered?

What do you think?

Steve

What do you think is the best way to prevent overselling in this case? I've considered:

* Table locking (sucks)
* Setting a lock file per-Product (filesystem-y and not elegantly scalable past one machine)
* Setting a lock variable per-Product in a memcached store
* A single-worker queue that validates and processes orders in sequence
* A per-Product single-worker queue that validates and processes orders in sequence
* Something I haven't considered?

It sort of depends on where the information that only 3 more orders can be taken for Product X lives. For example, if the products table has a stock_remaining column, then for me the natural thing to do would be to use optimistic locking on the products table. I wrote up some possibilities at Dealing with concurrency - Space Vatican
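
Something along these lines, perhaps (a minimal sketch assuming that hypothetical stock_remaining column plus a lock_version column for Rails' optimistic locking; the method name and retry loop are illustrative):

```ruby
class Product < ActiveRecord::Base
  # Needs a lock_version integer column (turns on optimistic locking)
  # and a stock_remaining integer column.
  def take_order!(variant)
    Product.transaction do
      raise "sold out" if stock_remaining <= 0
      decrement(:stock_remaining)
      save!  # raises StaleObjectError if another request saved this row first
      Order.create!(:product_variant => variant)
    end
  rescue ActiveRecord::StaleObjectError
    reload  # pick up the competing update, then try again
    retry
  end
end
```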

Fred

If you positively cannot have overbooking (oops, airline talk) over-selling, then all your orders have to either lock a table (using the database to enforce serialization), or they have to be serially examined in some way, such as a single order FIFO queue. You could create an observer, for example, that monitors the creation of an order, checks to see if stock (tickets, seats, CDs, whatever) is available, and gracefully backs out an order that “goes over.” Personally, I would be tempted to try a simple solution first, like a queue of some sort, or perhaps an observer, rather than use the database. This is one of those cases where trying to optimize too early may hurt. A simple, single, global queue may be all you need. YMMV.
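
For what it's worth, the observer version might look roughly like this (my sketch, assuming an order_limit on Product and a has_many :orders reachable through the variants; note the recount is not guaranteed to see a competing, uncommitted order):

```ruby
# "Check after create and back out" observer sketch.
class OrderObserver < ActiveRecord::Observer
  def after_create(order)
    product = order.product_variant.product
    if product.orders.count > product.order_limit
      order.destroy  # we went over the limit, so back this order out
      # ...and notify the buyer that the product just sold out
    end
  end
end

# Registered in config/environment.rb (Rails 2.x style):
#   config.active_record.observers = :order_observer
```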

Cheers–

Charles

My wife sells jewelry and many pieces are one-of-a-kind. I have an inventory table with an entry for each piece, including location: web site, gallery, etc.

If a customer puts that item in a cart, then the "location" is set to the cart id, and no one else can even see the item, much less purchase it. If the cart times out, the item is returned to inventory and becomes visible again.

Maybe not directly analogous to your situation, but something to think about. :slight_smile:
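
A rough sketch of that reservation scheme (my naming and columns, not the actual shop code): each piece is a row in an items table with a location column, and claiming it is a single conditional UPDATE, so two carts can't both grab the same piece.

```ruby
class Item < ActiveRecord::Base
  AVAILABLE = "web site"

  # Returns true if this cart successfully claimed the item.
  def self.reserve_for_cart(item_id, cart_id)
    update_all(["location = ?, reserved_at = ?", "cart-#{cart_id}", Time.now],
               ["id = ? AND location = ?", item_id, AVAILABLE]) == 1
  end

  # Run periodically: return items from abandoned carts to inventory.
  def self.release_stale(timeout = 30.minutes)
    update_all(["location = ?, reserved_at = NULL", AVAILABLE],
               ["location LIKE 'cart-%' AND reserved_at < ?", timeout.ago])
  end
end
```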

HTH,

Hassan

If you positively cannot have overbooking (oops, airline talk) over-selling, then all your orders have to either lock a table (using the database to enforce serialization)

I don't think things have to be quite so drastic, e.g. a row-level lock is almost certainly enough. It does sort of depend on how the quantity information is stored (but there is almost always a better way than a table lock).

Fred

, or they have to be serially examined in some way,

@Fred: The information that only n more Orders can be taken will come from (@product.order_limit - @product.orders.count) >= n. It's a bit more complicated than that, because I plan to have optional per-ProductVariant limits too, but let's leave it there for now.

@Fred & @Hassan: I don't really want to have "unsold" ProductVariants hanging around in the database, because the availability of ProductVariants is largely procedural. If the order_limit on the Product as a whole was 20, the order_limit on ProductVariant A was 10, and ProductVariant B had no order_limit, then I'd end up with a whole lot of questions as to how many unsold ProductVariants to create of each type, and which ones to destroy when others were sold. Bah, too much mess!

@Charles: I like the idea of an OrderTaker global queue, but global queue means worker process, and worker process means extra messy infrastructure, right? I'm all for it, I'm just wondering if anyone has a better mousetrap.

@Fred: Row-level locks are out of the question, since I'm not interested in blocking updates to a given row, but in preventing the creation of rows that don't yet exist in the Orders table. That said, I found this today: named locks in MySQL. It's a database-specific solution, but perhaps an easy one?

http://gist.github.com/95977

Named locks in MySQL: MySQL :: MySQL 8.0 Reference Manual :: 12.24 Miscellaneous Functions
Shopify's locking implementation: http://github.com/Shopify/locking/blob/835469ed48f4c2de95856fe2f221eaa624b267a2/lib/locking.rb
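
The named-lock approach comes out to something like this (my sketch, not what's in the gist; the lock name and 10-second timeout are arbitrary):

```ruby
# MySQL-only: GET_LOCK/RELEASE_LOCK act as an arbitrary named mutex,
# independent of any table or row.
conn = ActiveRecord::Base.connection
lock_name = conn.quote("product-#{product.id}-orders")

begin
  acquired = conn.select_value("SELECT GET_LOCK(#{lock_name}, 10)").to_i == 1
  raise "couldn't get the order lock for product #{product.id}" unless acquired

  if product.orders.count + variant_ids.size <= product.order_limit
    variant_ids.each { |id| Order.create!(:product_variant_id => id) }
  end
ensure
  conn.execute("SELECT RELEASE_LOCK(#{lock_name})") if acquired
end
```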

@Fred: Row-level locks are out of the question, since I'm not interested in blocking updates to a given row, but the creation of rows that don't yet exist in the Orders table. That said, I found this

Actually that's not true (and mentioned in the blog post I linked). You can lock the row corresponding to a given product, even if you are not going to update that product, purely as a synchronization mechanism (and in that sense it is similar to the named lock stuff).
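
Something like this, for the record (a sketch using Rails 2.x's :lock => true, i.e. SELECT ... FOR UPDATE; order_limit and the variant handling are placeholders from earlier in the thread):

```ruby
# The product row is used purely as a mutex: it is never updated, but
# holding its row lock serializes order placement for that product.
Product.transaction do
  product = Product.find(product_id, :lock => true)  # blocks competing lockers
  if product.orders.count + variant_ids.size <= product.order_limit
    variant_ids.each { |id| Order.create!(:product_variant_id => id) }
  else
    raise ActiveRecord::Rollback  # over the limit, take nothing
  end
end
```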

Fred