RESTful app rejecting simultaneous requests?

Hi all, I've made a RESTful app, and another client application is making requests to it. This client application is a kind of crawler (I'm not involved in it), and its programmer has told me that my app is rejecting simultaneous requests from the same client (the crawler process). Can that be possible? What do I have to do to avoid it?

Thanks.


Are you only running a single Mongrel? If so, then that's normal: a single instance of Mongrel will only process one Rails request at a time. If that's a problem, you should run several Mongrels and load balance between them (e.g. via mod_proxy_balancer or nginx).
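In case it helps, here's a rough sketch of what "run several Mongrels" looks like in practice (the ports, environment, and flags are illustrative, not from this thread):

```shell
# Start three Mongrel instances of the same app on consecutive ports,
# daemonized; a front-end balancer then spreads requests across them.
mongrel_rails start -e production -p 8000 -d
mongrel_rails start -e production -p 8001 -d
mongrel_rails start -e production -p 8002 -d
```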

Fred

Uf :S No, I'm using WEBrick. Does this also happen with WEBrick?


Yes. It's not a Mongrel or WEBrick thing, but a Rails thing.

Um... but Rails doesn't allow simultaneous requests from the same client? Or even from more than one client?

I mean, it's not weird for the same client to make simultaneous requests to a web app, is it? And for different clients... well... obviously that's the normal behaviour in a web app :S

Well, then, how can I run several WEBricks? I mean, do I have to run the app on two different ports or something? If it's too long to explain, you can point me to a book :S

Thanks a lot.

Frederick Cheung wrote:

Jeez, you're totally missing the point. No requests are ever truly simultaneous, even if you are asynchronously loading parts of the page. Each one is a separate request, and the web server, whether that's Mongrel, WEBrick or anything else, will handle them in first come, first served order (i.e. using a queue). Rails does lock the Mongrel/WEBrick server for the duration of the request (which is usually less than a few tenths of a second), but all other requests will still be handled in due time.

Best regards

Peter De Berdt
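The first-come-first-served queueing Peter describes can be illustrated with a toy Ruby sketch (this is not Rails code; the names are purely illustrative):

```ruby
# Toy model of a single Mongrel/WEBrick: "simultaneous" requests queue
# up and are handled one at a time, in arrival order.
require 'thread'

queue  = Queue.new   # incoming requests
served = []          # order in which they were handled

worker = Thread.new do
  3.times { served << queue.pop }   # one request at a time
end

%w[req1 req2 req3].each { |r| queue << r }   # near-simultaneous arrivals
worker.join
puts served.inspect   # => ["req1", "req2", "req3"]
```

None of the requests are lost; each is simply handled in turn, which is the behaviour the crawler is seeing.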

Um... but Rails doesn't allow simultaneous requests from the same client? Or even from more than one client?

I mean, it's not weird for the same client to make simultaneous requests to a web app, is it? And for different clients... well... obviously that's the normal behaviour in a web app :S

It's not the nicest bit of rails, but that is just the way things are currently.

Well, then, how can I run several WEBricks? I mean, do I have to run the app on two different ports or something? If it's too long to explain, you can point me to a book :S

Yup, just run 2 WEBricks on different ports (or Mongrels or whatever) and then load balance between them.
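For what it's worth, a minimal sketch of running two WEBricks on different ports (the port numbers are just examples):

```shell
# Start two WEBrick instances of the same Rails app on different ports;
# a load balancer in front can then alternate between them.
ruby script/server webrick -p 3000 -d
ruby script/server webrick -p 3001 -d
```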

This is the dirty little secret about Rails. One Mongrel = one Rails process = one request at a time, in series.

The conventional setup these days is Apache (2.2) with mod_proxy_balancer to hand off requests to a Mongrel cluster, with each instance on its own port.
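A hedged sketch of what that Apache 2.2 vhost fragment might look like (the balancer name and member ports are assumptions):

```apache
# Define a pool of Mongrel back ends and proxy all requests to it.
<Proxy balancer://mongrel_cluster>
  BalancerMember http://127.0.0.1:8000
  BalancerMember http://127.0.0.1:8001
  BalancerMember http://127.0.0.1:8002
</Proxy>
ProxyPass / balancer://mongrel_cluster/
ProxyPassReverse / balancer://mongrel_cluster/
```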

You set up as many Mongrels and ports as you think you need to handle peak concurrency (typically 4-10 is plenty for an app that's just getting started; more depending on your CPU and RAM capacity, or if you have high spikes like auctions, or intranet tasks like 500 people showing up to clock in).

Book: Pragmatic Bookshelf: By Developers, For Developers (not in print, but you can get the PDF now)

ok... so Rails locks the server for the duration of the request, but all other requests are handled too... so the requests are not lost... Of course, it's useless then to have two crawlers sending requests at the same time.

Finally, when you talk about "load balancing between two server instances"... how can I do it? I mean, manually (telling the client interface programmer to point the second process at the second instance)? Or is there a way to do it programmatically?

Thanks a lot again.

One last question...sorry :frowning: Does this also happen with Apache?

It is not really as bad as it seems, so maybe I am missing something about why people complain so vehemently about the no-simultaneous-requests issue. I mean, sure, some may consider it a sloppy way of having to deploy, but honestly it has never been a problem as long as you just deploy your application properly.

Like someone else said, you basically need a pack of Mongrels (a Mongrel cluster) and then a front-end load balancer (such as Apache or nginx).
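As a rough sketch (the ports and server_name are placeholders), the nginx side of that setup looks something like:

```nginx
# Declare the Mongrel pack as an upstream pool and proxy to it.
upstream mongrels {
  server 127.0.0.1:8000;
  server 127.0.0.1:8001;
  server 127.0.0.1:8002;
}

server {
  listen 80;
  server_name example.com;

  location / {
    proxy_pass http://mongrels;
    proxy_set_header Host $host;
  }
}
```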

Here is a good article on that:

http://blog.codahale.com/2006/06/19/time-for-a-grown-up-server-rails-mongrel-apache-capistrano-and-you/

Another:

http://www.railsjitsu.com/installing-and-configuring-nginx-and-mongrel-for-rails

May I recommend my personal favorites:

- Automate your deployment process with Capistrano (with palmtree recipes)
- Load-balance your Mongrels with nginx
- Run a cluster of 4-10 Mongrels to serve the requests
- Cache whenever possible to lower server load

Or if you don't want to deal with all that and you want to try a new option, look at switchpipe:

http://groups.google.com/group/switchpipe/web/setting-up-your-own-webapps-with-switchpipe

Hope that helps

Thanks a lot for the info. It seems complex though (having only worked with WEBrick so far :slight_smile: ).

Yeah, I know it seems complex. It's not as bad as it seems though. Unfortunately, this is one of the weaker spots in the Rails product lifecycle. I definitely pray for the day when deploying a Rails application is as easy as building one. Switchpipe is a step in that direction though.

webrick = don't use it, and if you do use it, use it only for development purposes, not for production sites

Best regards

Peter De Berdt