You can provide a single queue or an array of queues. Jobs are polled from those queues in order, so, for example, with [ real_time, background ], no jobs will be taken from background while there are still jobs waiting in real_time.
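For reference, a minimal sketch of what that ordered list looks like in config/queue.yml (the queue names and thread count here are only examples):

# config/queue.yml (sketch; names and counts are illustrative)
production:
  workers:
    - queues: [ real_time, background ]   # polled in this order
      threads: 3
      polling_interval: 0.1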
My use case: calls to a rate-limited OpenAI API, with a concurrency of 1. I want OpenAI jobs to run one at a time, but never have to wait for the general-purpose queue to empty.
For example, in other stacks like Elixir with Oban, I can define the concurrency of each queue without it affecting the queue's priority: all three queues run at the same time, just with different concurrency limits.
You can limit concurrency with the limits_concurrency method, without even needing a separate queue (see the Solid Queue docs):
# preferred solution
class RateLimitedJob < ApplicationJob
  # All jobs sharing the :openai_api key run at most one at a time,
  # regardless of which queue they are enqueued on.
  limits_concurrency to: 1, key: :openai_api
end
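Jobs over the limit are blocked and picked up as soon as a slot for the :openai_api key frees up, while jobs on other queues keep flowing normally. Enqueuing stays the same, e.g. RateLimitedJob.perform_later(prompt) (the prompt argument is just an illustration).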
This would likely be the simplest approach. If there's a particular reason you'd prefer separate queues rather than the limits_concurrency method, you could do something like this:
# alternative solution
class RateLimitedJob < ApplicationJob
  queue_as :openai_api
end
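and then point a dedicated single-threaded worker at that queue in config/queue.yml. A sketch, assuming a default queue for everything else (queue names and thread counts are assumptions); note the other workers must not also consume openai_api, or the one-at-a-time limit is lost:

# config/queue.yml (sketch)
production:
  workers:
    - queues: openai_api
      threads: 1   # one OpenAI job at a time
    - queues: default
      threads: 3   # general-purpose work keeps running in parallel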