How to manage caching the results of long-running queries

Greetings:

I have a Rails app with a dashboard page, running under Apache/Passenger. The dashboard page runs several queries via a dashboard() action on a controller, and the queries can take 30-60 seconds to complete before the page is rendered. I'm using cache_page() in the controller, so the rendered page is cached in public and subsequent requests are served from the cache. This is all fine.
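For context, the setup is roughly the following; the controller, route, and query scopes are placeholders rather than the real app, and caches_page is just the declarative form of the page caching described above:

class DashboardController < ApplicationController
  # Assuming the action is served at /dashboard, the rendered HTML ends up
  # in public/dashboard.html and Apache serves it directly on later requests.
  caches_page :dashboard

  def dashboard
    # Several slow queries, 30-60 seconds in total.
    @projects = current_user.projects.with_long_running_query
    @tasks    = @projects.map { |project| project.tasks.with_some_special_conditions }
  end
end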

The problem I have is refreshing the cache after it's invalidated (I delete the files from public via cron). Every client that calls the dashboard action while the page cache is empty causes the queries to be run against the database. That is, of course, exactly the kind of thing I'm trying to avoid by using cache_page().
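The invalidation itself is just cron removing the cached file, roughly like the rake task below; the path and file name are placeholders for whatever the route actually maps to:

# lib/tasks/expire_dashboard.rake
# cron entry, e.g. every 30 minutes:
#   */30 * * * * cd /var/www/myapp && rake cache:expire_dashboard RAILS_ENV=production
require "fileutils"

namespace :cache do
  desc "Delete the cached dashboard page so the next request rebuilds it"
  task :expire_dashboard do
    FileUtils.rm_f(File.join("public", "dashboard.html"))
  end
end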

What are my options?

Thanks.

It depends on how many queries there are and how long they take individually. If it's one query that takes 30 seconds, look at breaking it up or optimizing it. If it's several queries that take a few seconds each, look into caching their results with Rails.cache.fetch. Then if a request hits while the first page is 15 seconds into building, it can at least take advantage of those 15 seconds and only duplicate the last 15.
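Something along these lines, so each expensive query gets its own cache entry; the model names and keys are only illustrative, and :expires_in assumes a cache store that supports it (e.g. memcached):

def dashboard
  # Each slow query is cached separately. A request that arrives while
  # another is still building the page reuses whatever entries have
  # already been written and only recomputes the rest.
  @summary = Rails.cache.fetch("dashboard_summary", :expires_in => 30.minutes) do
    Report.expensive_summary_query    # placeholder for one slow query
  end
  @history = Rails.cache.fetch("dashboard_history", :expires_in => 30.minutes) do
    Report.expensive_history_query    # placeholder for another slow query
  end
end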

This sounds like fragment caching rather than page caching. Is that right?

You could do that, but if the long-running queries are executed in the controller before you ever get to rendering (which is where fragment caching happens), it won't do much for the problem you're trying to solve, because the queries will still run for 30 seconds before rendering even starts. Caching the individual query results in the controller is what helps here, e.g.:

@projects = Rails.cache.fetch("user#{current_user.id}_projects") do
  current_user.projects.with_long_running_query
end

@tasks = Rails.cache.fetch("user#{current_user.id}_tasks") do
  @projects.map { |project| project.tasks.with_some_special_conditions }
end

Or something like that...