[Proposal] Rate Limit API

I often have to add some kind of rate limiting for security or similar reasons. Any chance a PR on this topic would be accepted? The API would be something like the following, backed by before_action and an ActiveSupport::Cache store.

class AvailabilitiesController < ApplicationController
  rate_limit attempts: 10, wait: 1.hour, only: :show do
    render json: { error: "You've exceeded the maximum number of attempts" }, status: :too_many_requests
  end
end
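
For context, here is a minimal sketch of how such a macro could be backed by before_action and ActiveSupport::Cache. This is purely hypothetical and not an actual Rails API; the concern name, the cache key format, and the reliance on Rails.cache are all assumptions.

# Hypothetical sketch only: one possible implementation of the proposed macro.
module RateLimitable
  extend ActiveSupport::Concern

  class_methods do
    def rate_limit(attempts:, wait:, **filter_options, &block)
      before_action(**filter_options) do
        # Count requests per action and client IP within the `wait` window.
        # A Redis-backed cache store creates the counter on first increment;
        # other stores may need the key written beforehand.
        key = "rate_limit:#{controller_path}:#{action_name}:#{request.remote_ip}"
        count = Rails.cache.increment(key, 1, expires_in: wait)
        # Rendering inside a before_action halts the rest of the action.
        instance_exec(&block) if count.to_i > attempts
      end
    end
  end
end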

Out of curiosity, why do you need to implement this instead of relying on rack-attack?

I see rack-attack as a more generic approach, e.g. a general rate limit for API requests. Here it would be specific to certain actions, responding with specific content such as a redirect with a flash message. I also think this feature is common enough to belong in Rails core rather than depend on a third-party gem.

Isn’t this normally implemented inside the API gateway/NGINX? Because if you implement this inside Rails and you have a load balancer in front of multiple instances, how do you make sure that your overall rate limit isn’t exceeded?


Isn’t this normally implemented inside the API gateway/NGINX?

It can be, but there are two points to consider:

  1. There is a history of things that used to be handled by a frontend server and were moved into Rails. Heck, early versions of Rails used to use Apache’s mod_rewrite before routing was eventually moved within Rails. Same for other things like the content security policy.
  2. Moving it within Rails gives you more flexibility. It becomes possible for custom Ruby code to determine the rate limits in a very specific way, whereas with something like Nginx you have much less control. For example, maybe you want different rate limits depending on the account associated with the requester’s identity: users paying more get higher rate limits (see the sketch after this list). I’m not sure the proposed API really supports such an idea, but the right API could.
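
Purely as an illustration of that flexibility, the attempt budget could be resolved per request. The lambda-based attempts option and the current_user.premium? call below are assumptions, not part of the proposal.

# Hypothetical variation: paying accounts get a higher limit.
# `premium?` is an assumed method on the user model.
class ApiController < ApplicationController
  rate_limit attempts: ->(controller) { controller.current_user&.premium? ? 1000 : 100 },
             wait: 1.hour, only: :create do
    head :too_many_requests
  end
end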

Because if you implement this inside Rails and you have a load balancer in front of multiple instances, how do you make sure that your overall rate limit isn’t exceeded?

Probably the easiest way is to use ActiveSupport::Cache::Store and pick a backend such as Redis where the cache is shared. Since there is an increment method, you can easily increment a shared counter.
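
A minimal sketch, assuming a Redis-backed cache store shared by every instance behind the load balancer:

# config/environments/production.rb -- point the cache at a shared Redis.
config.cache_store = :redis_cache_store, { url: ENV["REDIS_URL"] }

# In the rate-limiting code: every app instance increments the same counter,
# so the limit is enforced across the whole fleet.
key = "rate_limit:#{request.remote_ip}"
count = Rails.cache.increment(key, 1, expires_in: 1.hour)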


Personally for me I don’t have enough need for this to work on it but I don’t think it’s a case of not belonging in Rails.