threaded?

Hi, I admit I am relatively new to RoR, but I have a question about threading. I have my Ruby app, and one of the things it can do is import a huge quantity of data into the database via the web interface. When I kick off this import, no other requests to the Ruby app get a response; I have to wait until the command finishes. A similar app (we are enhancing and porting an existing app) running under Tomcat/Java servlets does the same action, but Tomcat still lets me serve other requests while it is happening.

Is this a setting I need to change, or is Ruby only capable of switching requests when it is idle? Or is there some other restriction?

As we get closer to going live this is becoming more of an issue, as we can't have our live site frozen while one of us is importing new data.

Help!

Thanks, Phil

Rails is single-threaded. To handle multiple requests in parallel you need to add more Ruby processes. I suggest you check out <http://mongrel.rubyforge.org/docs/index.html> for a lot of info on RoR deployment.
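For a typical small setup that just means running a handful of Mongrels behind your web server. A mongrel_cluster config along these lines is the usual starting point; the paths and ports here are only an example, so adjust them for your app:

  # config/mongrel_cluster.yml -- example only; paths and ports are placeholders
  ---
  cwd: /var/www/myapp
  environment: production
  address: 127.0.0.1
  port: "8000"
  servers: 3
  pid_file: tmp/pids/mongrel.pid

Start it with mongrel_rails cluster::start -C config/mongrel_cluster.yml and point your front-end proxy at ports 8000-8002.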

If you expect a lot of concurrent long-lived requests (e.g. large file uploads), you should consider handing those off to a threading-friendly framework such as Merb, as RoR instances are somewhat heavy.

Regards, Isak

Isak Hansen wrote:

Rails is single-threaded. To handle multiple requests in parallel you need to add more Ruby processes.

On this topic, I have a small problem with lighttpd with this configuration:

  proxy.balance = "fair"
  proxy.server = ( "" =>
    ( ( "host" => "127.0.0.1", "port" => 8000 ),
      ( "host" => "127.0.0.1", "port" => 8001 ),
      ( "host" => "127.0.0.1", "port" => 8002 ) ) )

It seems that with this configuration the lighttpd (1.4.18) proxy module can forward requests to an already busy server even if there are only 2 simultaneous page accesses.

proxy.balance = "fair" should try to avoid this situation but I believe that in my situation (Mongrel is used to serve static files too which is probably the case for many others), the static files serving can force lighttpd to use all Mongrels at once even if there are only 2 simultaneous page accesses (there can be 5-10 static files to serve when counting images/css/js for one page load) -> the Mongrel handling the long running request is reused for another dynamic request, which blocks.

I believe reconfiguring lighttpd to serve static files itself, short-circuiting Mongrel, would help by lowering the load on the Mongrel cluster and making sure one of the 3 Mongrels stays available even with 2 simultaneous page accesses. But I wonder if there is another way that would involve less lighttpd configuration? So far I haven't found a quick lighttpd configuration for serving static content directly when using mod_proxy.
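What I have in mind is something along these lines, completely untested; the extension list and document-root are just guesses for my setup:

  # untested sketch: let lighttpd serve static assets itself and only
  # proxy the remaining (dynamic) URLs to the Mongrels
  # document-root points at the Rails public/ directory
  server.document-root = "/var/www/myapp/public"

  $HTTP["url"] !~ "\.(jpg|jpeg|gif|png|ico|css|js)$" {
    proxy.balance = "fair"
    proxy.server = ( "" =>
      ( ( "host" => "127.0.0.1", "port" => 8000 ),
        ( "host" => "127.0.0.1", "port" => 8001 ),
        ( "host" => "127.0.0.1", "port" => 8002 ) ) )
  }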

Lionel

Outside of making sure you have multiple Rails processes proxied through an HTTP server, you should also use a tool like BackgrounDrb for long-running tasks such as uploading and processing a ton of data. Then, when the user uploads the data, the page comes back as soon as the upload itself is done, while a separate process on the server does the actual processing of that data.
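Roughly, the shape of it is something like this; treat the class, method, and route names as a sketch rather than the exact BackgrounDrb API, since that API has changed between versions, and the helpers are stand-ins for your own logic:

  # app/workers/data_import_worker.rb -- the slow import runs here,
  # outside the request/response cycle (sketch; check the BackgrounDrb
  # docs for the exact worker API in your version)
  class DataImportWorker < BackgrounDRb::MetaWorker
    set_worker_name :data_import_worker

    # receives the path of the already-uploaded file
    def import(file_path)
      # parse_and_save is a stand-in for your real import logic
      File.foreach(file_path) { |line| parse_and_save(line) }
    end
  end

  # In the controller, the action only saves the upload and kicks off the
  # worker, so the response goes back as soon as the upload is finished.
  def create
    path = save_upload(params[:import_file])   # save_upload is hypothetical
    MiddleMan.worker(:data_import_worker).async_import(:arg => path)
    flash[:notice] = "Import started; it will finish in the background."
    redirect_to imports_path
  end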

Jason

Jason Roelofs wrote:

Outside of making sure you have multiple Rails processes proxied through an HTTP server, you should also use a tool like BackgrounDrb for long-running tasks such as uploading and processing a ton of data. Then, when the user uploads the data, the page comes back as soon as the upload itself is done, while a separate process on the server does the actual processing of that data.

That's a solution of course, but in my case these long-running processes usually only run for 5-10 seconds and are the least used parts of the site. There's no usability problem for the person running them: 5-10 seconds is considered normal. But for simple site navigation this is annoying (which, as explained, happens sometimes in my case). Developing a BackgroundRB solution would solve the problem, but it might be heavy-handed for the time being if there's a simple lighttpd way to serve static files without going through the Mongrel cluster used by mod_proxy.

Lionel

I’m not that knowledgeable about Lighty, but I do know that there are pretty major issues with its most recent proxy engine. I personally use Nginx + Mongrel Cluster, which works surprisingly well. Nginx is also one of the fastest at serving up static data; whether it's images, javascript, stylesheets, or cached Rails pages, it's fast and completely bypasses proxying.
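For reference, my setup looks roughly like this; the server name, paths, and ports are placeholders for whatever your app uses:

  # nginx.conf (excerpt) -- three Mongrels behind Nginx, with static files
  # served straight out of the Rails public/ directory
  upstream mongrel_cluster {
    server 127.0.0.1:8000;
    server 127.0.0.1:8001;
    server 127.0.0.1:8002;
  }

  server {
    listen 80;
    server_name example.com;        # placeholder
    root /var/www/myapp/public;     # Rails public/ directory

    location / {
      # if the request maps to a real file (image, css, js, cached page),
      # Nginx serves it directly and Mongrel is never touched
      if (!-f $request_filename) {
        proxy_pass http://mongrel_cluster;
      }
      proxy_set_header Host $host;
      proxy_set_header X-Real-IP $remote_addr;
    }
  }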

Jason

This concerns me a little. We have a website that needs to support 150 - 200 simultaneous active users. Are you telling me I need to have 150 - 200 mongrels running to do this? That's crazy!

"Active users" isn't a useful metric on it's own, try to break it down into requests per second instead. And keep in mind that production mode is likely to speed your actions up a little.

Regards, Isak

phil wrote:

This concerns me a little. We have a website that needs to support 150 - 200 simultaneous active users. Are you telling me I need to have 150 - 200 mongrels running to do this? That's crazy!

Depending on your application and hardware specifics, an average Mongrel can serve about 100 requests/second. So you’ll want to run a cluster of at least two and probably more just to smooth everything out.

– Roderick van Domburg http://www.nedforce.com

