Rodrigo Rosauro wrote:

> I think that making rails thread-safe would greatly expand its
> usability.
A 'thread' is like a 'goto'. It's useful in a pinch, but it's really bad structure.
> Today, having background daemon threads is a pain in the azz because you
> need to start and control other ruby processes.
If you brought them closer to the Rails that serves pages, they would donate their fragility to it. They work best in other processes, with DRb between them and you, for insulation.
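That insulation layer is just Ruby's standard DRb. A minimal sketch (the `StatusBoard` object and its methods are mine, invented for illustration — a real BackgrounDRb worker exposes its state the same way, over the same wire protocol):

```ruby
require 'drb/drb'

# Hypothetical daemon-side object. The Rails process never touches
# its internals directly -- only method calls cross the wire.
class StatusBoard
  def initialize
    @jobs = {}
  end

  def finish(id)
    @jobs[id] = :done
  end

  def status(id)
    @jobs.fetch(id, :running)
  end
end

# Daemon side: serve the object. (Both ends share one process here
# for brevity; in practice the server runs as a separate daemon.)
DRb.start_service('druby://localhost:0', StatusBoard.new)

# Rails side: a proxy to the remote object. If the daemon crashes,
# only this call fails -- the page-serving process stays up.
board = DRbObject.new_with_uri(DRb.uri)
board.finish(42)
board.status(42)  # => :done
```

If the daemon dies, the Rails side gets a `DRb::DRbConnError` on its next call — an exception you can rescue and report, instead of a crashed web server.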
If you need to query them too often, then you probably have a latent time-sliced architecture. Converting that into something truly event-driven would probably improve its design, and make it leaner and more robust.
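The difference, in miniature (class names are mine, for illustration): a time-sliced design asks "anything yet?" on a timer, while an event-driven one registers a callback once and goes quiet until it fires.

```ruby
# Time-sliced: callers must poll this object's state on a schedule,
# burning cycles whether or not anything changed.
class PolledCounter
  attr_reader :count

  def initialize
    @count = 0
  end

  def bump
    @count += 1
  end
end

# Event-driven: interested parties subscribe once, then get called
# only when something actually happens.
class EventedCounter
  def initialize
    @count = 0
    @listeners = []
  end

  def on_change(&block)
    @listeners << block
  end

  def bump
    @count += 1
    @listeners.each { |l| l.call(@count) }
  end
end

seen = []
counter = EventedCounter.new
counter.on_change { |n| seen << n }
counter.bump
counter.bump
seen  # => [1, 2]
```

The evented version also has nothing to tune: no poll interval to pick, no window during which state is stale.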
Next, threads and processes are a pain in the nuts to unit test. The blame doesn't belong to the tests. When you thread, and when you use semaphores to synchronize your threads, you commit the ultimate failure of encapsulation. You make one thread sensitive to variations in the tiniest private details of another thread's timing.
> There are even some things completely impossible to do with rails, like
> as event-driven AJAX, or AJAX observer (like that GMail chat)... Which,
> personally, I think is one of the big deals of ajax to deliver
> instantaneous responses...
I currently use BackgrounDRb in two projects. For both, I first tried the non-detached version, and neither worked. One, zipping up a bunch of files, took too long. So I let the user click, start a task, put a spinning graphic on the user interface, and wait for the task to end. A periodically_call_remote polls the task. When it finishes, the remote call returns JavaScript that sets a small IFRAME's src attribute to the server location of the zipped file. That pops up the Save File dialog automatically.
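Stripped of Rails, that poll-until-done flow reduces to a task with a status and a polled action that renders JavaScript once the status flips. A plain-Ruby sketch (class, element id, and path are mine; in the real app, periodically_call_remote does the polling):

```ruby
# Hypothetical stand-in for the BackgrounDRb zip task.
class ZipTask
  def initialize
    @done = false
    @path = nil
  end

  # Stand-in for the slow zip step, which runs in the daemon.
  def run(path)
    @path = path
    @done = true
  end

  def done?
    @done
  end

  # What the polled controller action would render: nothing while
  # the zip is still running, then JavaScript that points a hidden
  # IFRAME at the finished file, which triggers the Save dialog.
  def response_js
    done? ? %(document.getElementById('dl').src = '#{@path}';) : ''
  end
end

task = ZipTask.new
task.response_js  # => "" while the zip is still running
task.run('/downloads/archive.zip')
task.response_js  # now returns JavaScript aiming the IFRAME at the zip
```

The IFRAME trick matters because you cannot start a file download from an XMLHttpRequest response; you need a real browser navigation, and a hidden IFRAME provides one without leaving the page.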
For the other project, a game, I need abstract timing events to update different web browsers simultaneously, in real time. Again, I first tried it without a background process, and it was too slow and clumsy. So I put the event timer, and its state table, into a BackgrounDRb process, and I use Juggernaut to send new JavaScript updates to both browsers.
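The push side reduces to a broadcaster that every connected browser subscribes to. A generic sketch of the pattern, not Juggernaut's actual API (names and the JavaScript payload are mine):

```ruby
# Generic server-push sketch: each subscriber stands in for one
# browser's open channel; broadcast fans a JavaScript string out
# to all of them at once.
class Broadcaster
  def initialize
    @channels = []
  end

  def subscribe(channel)
    @channels << channel
  end

  def broadcast(javascript)
    @channels.each { |ch| ch << javascript }
  end
end

# Two arrays stand in for two browsers' open connections.
browser_a = []
browser_b = []

push = Broadcaster.new
push.subscribe(browser_a)
push.subscribe(browser_b)

# One game-state change, delivered to every browser at once --
# no browser ever asks "anything new yet?".
push.broadcast("updateBoard('e2', 'e4');")

browser_a  # => ["updateBoard('e2', 'e4');"]
```

That inversion — the server speaks when state changes, rather than clients polling on a timer — is the whole point of keeping the timer and state table in one long-lived process.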
GMail is similar. They hacked their server to leave a TCP channel open, providing "server push". Don't try that unless you are Google, with a horde of engineers to hack your server. Juggernaut is a nice compromise if you already use Flash: it uses Flash to leave that wire open, enabling server push.
And never think for a second that bolting BackgrounDRb onto traditional Ajax polling will be more responsive than old-school Ajax alone. Speaking from personal experience, you will get complaints from your ISP that you are using too much CPU time on their server!