Rails caching with both Solid Cache and Memcached/Redis – the `rate_limit` case

Rails 8 made Solid Cache the default cache store, and Rails 7.2 introduced native rate limiting via the `rate_limit` controller macro.

In my Rails 8 application's production environment, I'm already using `:solid_cache_store` as the global `config.cache_store` for general-purpose caching (e.g. fragment caching, job deduplication, etc.). I have a dedicated server in the cloud for that.

However, I’m unsure whether Solid Cache is the right store for rate limiting specifically — since Solid Cache is persistent and database-backed, it may not perform well for high-throughput, ephemeral data like request counts or burst limits.
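
To make concrete what "ephemeral" means here: rate limiting boils down to a per-key counter that resets after a time window, so persistence buys nothing. A dependency-free sketch of that idea (the class name and interface are mine, not Rails'):

```ruby
# Minimal fixed-window rate limiter. All state is an in-memory counter
# that is discarded when the window rolls over -- nothing worth
# persisting to a database-backed store.
class FixedWindowLimiter
  def initialize(limit:, window:)
    @limit = limit     # max requests allowed per window
    @window = window   # window length in seconds
    @counters = {}     # key => [window_start_time, count]
  end

  # Returns true if the request is allowed, false if over the limit.
  def allow?(key, now = Process.clock_gettime(Process::CLOCK_MONOTONIC))
    start, count = @counters[key]
    if start.nil? || now - start >= @window
      @counters[key] = [now, 1]   # new window: reset the counter
      true
    elsif count < @limit
      @counters[key] = [start, count + 1]
      true
    else
      false
    end
  end
end
```

Rails' own implementation is of course cache-store-backed rather than a plain Hash, but the shape of the data is the same: a short-lived integer per client.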

Given that the `rate_limit` DSL allows a custom `:store` option, is it a good idea to use a dedicated in-memory cache (e.g. Memcached or Redis) just for rate limiting? Something like:

```ruby
rate_limit store: ActiveSupport::Cache::MemCacheStore.new(ENV["MEMCACHE_SERVERS"].split(",")), ...
```
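
A fuller version of what I have in mind, assuming the `dalli` gem is installed and Rails 7.2+; the constant name, limits, and env var fallback are placeholders of my own:

```ruby
class ApiController < ApplicationController
  # Build the Memcached-backed store once per process so every request
  # reuses the same client, instead of constructing a store per call.
  # The server list is illustrative.
  RATE_LIMIT_STORE = ActiveSupport::Cache::MemCacheStore.new(
    *ENV.fetch("MEMCACHE_SERVERS", "localhost:11211").split(",")
  )

  # 100 requests per minute, keyed by remote IP (the macro's default);
  # the numbers are placeholders.
  rate_limit to: 100, within: 1.minute, store: RATE_LIMIT_STORE
end
```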

Would it be best practice to:

  • Use :solid_cache_store for general application caching, and
  • Use :mem_cache_store (or Redis) only for rate limiting?
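
Concretely, the split I'm considering would look roughly like this (store choice, constant name, and env vars are placeholders; Redis shown as the alternative):

```ruby
# config/environments/production.rb
Rails.application.configure do
  # General-purpose caching (fragments, deduplication, etc.)
  # stays on the database-backed Solid Cache.
  config.cache_store = :solid_cache_store
end

# config/initializers/rate_limit_store.rb
# A separate in-memory-backed store used only by rate_limit.
RATE_LIMIT_STORE =
  if ENV["REDIS_URL"]
    ActiveSupport::Cache::RedisCacheStore.new(url: ENV["REDIS_URL"])
  else
    ActiveSupport::Cache::MemCacheStore.new(
      *ENV.fetch("MEMCACHE_SERVERS", "localhost:11211").split(",")
    )
  end
```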

Are there any downsides to this dual-cache setup (apart from cost)?