Limit number of requests per second

Hi all,

I have searched for a solution to this problem and found none, which is why I am writing here.

Basically, what I intend to do is pretty easy and can be achieved easily with several typical solutions. However, I want to know if there is a 'Rails way' to do this; in particular, a 'Rails 3 way', since I went with Rails 3 for this project.

I have a typical Rails application; the special thing is that I am accessing some selected Cisco routers through SSH (in order to read some information from them). Since it is a public application, too many requests could come in at the same time, and the application could literally 'crash' a router with too much work.

I want to know if there is some recommended way of setting a limit on how many people can concurrently request something from the controller.

In production I will run the application with Passenger + Apache/nginx.

Best regards, and thanks in advance, Rafael Fernández López.

Hi,

Instead of having your Rails app open the SSH connections during the request, you can have it handle just the frontend (Passenger + Apache/nginx) and show stats or whatever you want. For the data collection, you can have a background task that connects via SSH, sends the commands, and stores the results in the DB (which the Rails app then reads). With this method, you will be able to manage multiple connections at the same time. If you also need to send commands to reconfigure the router, you can create a command queue and execute it with another background task (or the same one); this queue could be filled by the Rails app (check resque/redis).
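For instance, a minimal Resque job along these lines could do the collection. This is just a sketch: it assumes the resque and net-ssh gems, and the RouterResult model, credentials, and queue name are placeholders.

  # app/jobs/router_command_job.rb
  require 'net/ssh'

  class RouterCommandJob
    @queue = :router_commands

    # Resque runs this in a background worker, outside the web request.
    def self.perform(router_host, command)
      output = nil
      Net::SSH.start(router_host, 'admin', :password => 'secret') do |ssh|
        output = ssh.exec!(command)
      end
      # Store the result so the Rails frontend can read it from the DB.
      # RouterResult is a hypothetical ActiveRecord model.
      RouterResult.create!(:router  => router_host,
                           :command => command,
                           :output  => output)
    end
  end

A controller would then just enqueue the work, e.g. Resque.enqueue(RouterCommandJob, '10.0.0.1', 'show version'), and render whatever results are already in the DB.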

My 2 cents.

best regards

Christophe

Hi,

Yeah, that could be a solution if that were the case, but I didn't explain the problem in enough depth.

When you perform a request, you provide certain arguments that are contained in the request itself, so it's not only a matter of having a background job that collects information and updates the database.

User A requests some kind of action on a selected router and provides some arguments (e.g. ping a certain IP supplied by the user).

So, what I need is to set some kind of limit on how many requests can be running concurrently. This could of course be set on the router itself by limiting the number of SSH sessions, but I just wanted to know if there was some 'Rails way' of performing this check.

I think I will do a typical solution like shared memory or some kind of lock.

Best regards, Rafael Fernández López.

Hi all,

Finally, the way I fixed this issue was by creating an initializer that allocates a hash with each router as a key and the number of concurrent connections to it as the value. When a connection is about to be established, +1; when the SSH connection ends, -1.

In the controller I check the value for the desired router, and if the maximum number of connections is exceeded, an 'overload' warning message is reported for that router.

I don't care about server restarts, since that is a really short time window, and I also don't mind losing the connection counts on a restart. It's just a matter of not being able to 'fry' the router with too many requests at a time.
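For reference, a rough sketch of what this looks like. The module name, helper names, and the limit of 5 are illustrative, not from my actual code, and note the hash lives in process memory, so each Passenger worker process keeps its own counts (and they reset on restart, which is fine for my case).

  # config/initializers/router_throttle.rb
  require 'thread'

  module RouterThrottle
    MAX_CONNECTIONS = 5  # illustrative per-router limit

    @counts = Hash.new(0)
    @lock   = Mutex.new

    # +1 when a connection is going to be established; returns false
    # if the router is already at its limit.
    def self.acquire(router)
      @lock.synchronize do
        return false if @counts[router] >= MAX_CONNECTIONS
        @counts[router] += 1
        true
      end
    end

    # -1 when the SSH connection ends.
    def self.release(router)
      @lock.synchronize { @counts[router] -= 1 }
    end
  end

And in the controller:

  def ping
    router = params[:router]
    unless RouterThrottle.acquire(router)
      render :text => "Router #{router} is overloaded, try again later",
             :status => 503
      return
    end
    begin
      # ... open the SSH session and run the requested command ...
    ensure
      RouterThrottle.release(router)
    end
  end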

Thanks for the time and help.

Best regards, Rafael Fernández López.