Controller Synchronisation - ways to decouple 'work' from the controller

Hi all,
I have been experimenting with rails and have hit a specific
limitation that I need some opinions on. This isn't a criticism or a
gripe, as I'm a big fan of the framework! First off, the answer to this
question is not 'the pack of mongrels': this question is not about
'outward application concurrency' but about the internal controllers
within a single mongrel instance, so the same issue would afflict every
mongrel in 'the pack'.

So I have a controller supporting RESTian interactions with an
arbitrary resource: it takes a request from the client, the appropriate
controller and action are activated, and some work is performed to
source a representation of the relevant resource. At this point, when
I'm busy with my action and before I 'respond' to the client, my
controller is effectively blocking all other inbound requests to this
mongrel instance(?).

Now, when my action executes it will sometimes complete immediately,
so the request/response loop via the Rails controller is negligible
and I have no issue with this processing model. However, in many
cases the work I 'need' to do involves triggering integration with
remote or 'back-end' data-sources, making my controller's request/
response loop non-deterministic in terms of response time.

So what I'm doing currently is as follows:

1. Accept request from client, and trigger relevant controller/action
2. Create a local, persisted entity to represent the 'action' being
progressed, assigning it a unique id internally so it is referenceable.
3. Detach a processing thread from the controller/action to deal with
the complex stuff on behalf of the controller.
4. If the controller decides that the action thread is taking too long
to complete (based on system policy, client profile or custom headers
in the request), clear down the controller by sending a 202 Accepted
response to the waiting client, with a Location: header pointing to
the persisted entity from step 2.
5. Accept subsequent GET operations on the URI created in step 2,
effectively allowing the client to poll the state of the action
working asynchronously.
6. Ensure the worker thread updates the persisted entity with state
and additional meta-data such that the client is able to resolve a
path to the 'result/representation' requested in step 1.
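The six steps above can be sketched in plain Ruby, outside any framework. Everything here is illustrative: `POLICY_TIMEOUT`, `JOBS`, `handle_request` and `poll_job` are hypothetical names, and the 'persisted entity' is just an in-memory hash standing in for a real database record.

```ruby
require 'securerandom'

POLICY_TIMEOUT = 0.1   # how long the action will wait before sending 202
JOBS = {}              # stands in for the persisted 'action' entity table
JOBS_LOCK = Mutex.new

# Step 2: create a referenceable record of the action in progress.
def create_job
  id = SecureRandom.hex(8)
  JOBS_LOCK.synchronize { JOBS[id] = { state: 'working', result: nil } }
  id
end

# Step 3: detach a worker thread to do the slow back-end integration.
def start_worker(id, &work)
  Thread.new do
    result = work.call
    # Step 6: record state/result so a later GET can resolve it.
    JOBS_LOCK.synchronize { JOBS[id] = { state: 'done', result: result } }
  end
end

# Steps 1 and 4: run the action, but wait at most POLICY_TIMEOUT.
def handle_request(&work)
  id = create_job
  worker = start_worker(id, &work)
  if worker.join(POLICY_TIMEOUT)   # returns the thread if it finished in time
    { status: 200, body: JOBS_LOCK.synchronize { JOBS[id][:result] } }
  else                             # too slow: 202 plus a URI to poll
    { status: 202, location: "/jobs/#{id}" }
  end
end

# Step 5: the client polls the job URI until the state flips to 'done'.
def poll_job(id)
  JOBS_LOCK.synchronize { JOBS[id].dup }
end

fast = handle_request { 6 * 7 }               # finishes inside the window: 200
slow = handle_request { sleep 0.5; 'done' }   # exceeds the window: 202 + Location
```

The policy decision in step 4 reduces to the `Thread#join(timeout)` call: it returns the thread if the work finished within the window and nil otherwise, which picks between the inline 200 and the deferred 202 path.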

This much is working fine after a little grapple with the Rails
threading/concurrency blurb, and after ditching BackgrounDRb as an
option on the basis that, in my opinion, it's too heavy for this kind
of simple processing model. However, the big problem is still that I
can only service one request per mongrel, despite delegating the bulk
of all processing to threads and keeping my controllers very
clean/simple.

So, to my question. Given that I have achieved this kind of
processing loop, and given that in steps 3 and 4 the bulk of the work
is done in a new thread and is therefore not part of the now-idle
controller: is there a way I can begin servicing more requests from
clients, effectively enabling more controller 'instances' to detach
worker threads too? If so, I could realise the potential of this
framework and begin handling many synch/asynch interactions through a
single mongrel instance.

Would I gain anything from JRuby here? I'm assuming that the same
Mongrel code, with a single controller active at a time, will behave
identically in the JRuby world?

Hope this makes sense!

Thanks for any opinions on this...

Stew

> So I have a controller supporting RESTian interactions with an
> arbitrary resource: it takes a request from the client, the appropriate
> controller and action are activated, and some work is performed to
> source a representation of the relevant resource. At this point, when
> I'm busy with my action and before I 'respond' to the client, my
> controller is effectively blocking all other inbound requests to this
> mongrel instance(?).

Correct

> Now, when my action executes it will sometimes complete immediately,
> so the request/response loop via the Rails controller is negligible
> and I have no issue with this processing model. However, in many
> cases the work I 'need' to do involves triggering integration with
> remote or 'back-end' data-sources, making my controller's request/
> response loop non-deterministic in terms of response time.

> So, to my question. Given that I have achieved this kind of
> processing loop, and given that in steps 3 and 4 the bulk of the work
> is done in a new thread and is therefore not part of the now-idle
> controller: is there a way I can begin servicing more requests from
> clients, effectively enabling more controller 'instances' to detach
> worker threads too? If so, I could realise the potential of this
> framework and begin handling many synch/asynch interactions through a
> single mongrel instance.

As I understand it, there is a great big mutex around 'dispatch the
request to Rails', which implies that the answer to your question is no.
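The effect of that mutex can be reproduced in a toy example (this is not Rails' actual dispatcher code, and all the names are made up): with one global lock around "dispatch", three in-flight requests take three times as long, however many threads the server spawns, whereas work outside the lock overlaps.

```ruby
# One global lock around "dispatch" serialises every in-flight request.
DISPATCH_LOCK = Mutex.new

def dispatch_with_lock
  DISPATCH_LOCK.synchronize { sleep 0.1 }  # stand-in for running a Rails action
end

def dispatch_without_lock
  sleep 0.1                                # stand-in for a non-Rails code path
end

# Time three concurrent "requests" through the given dispatch method.
def elapsed_for_three(meth)
  started = Time.now
  3.times.map { Thread.new { send(meth) } }.each(&:join)
  Time.now - started
end

elapsed_for_three(:dispatch_with_lock)     # ~0.3s: requests queue on the mutex
elapsed_for_three(:dispatch_without_lock)  # ~0.1s: requests overlap
```

The detached worker threads from the earlier steps live on the unlocked side of this picture; it is only the dispatch into the controller that queues.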

There is another way this can be done: write a custom Mongrel
handler. Mongrel itself is not single-threaded and will happily
process concurrent requests to non-Rails URLs. Obviously you lose the
convenience of ActionController/ActionPack, but if all you're doing
is eventually redirecting someone to a relevant place, that may not be
a concern.

Fred