Explain what you’re doing first.
If you want to cache the files to disk to avoid constant processing, write them to disk and store a reference in the database. Scaling out to multiple machines is a non-issue: mount the same file system on all of them and you're good. Or investigate Amazon S3 for your storage. If you've used attachment_fu, you can switch from file-system storage to database storage to S3 without much effort.
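For example, attachment_fu picks its backend with the :storage option, so switching stores is mostly a one-line change. A hedged sketch; the size limit and path prefix below are placeholder values, not recommendations:

```ruby
class Attachment < ActiveRecord::Base
  # :storage can be :file_system, :db_file, or :s3.
  # The S3 backend additionally reads credentials from config/amazon_s3.yml.
  has_attachment :storage     => :file_system,          # swap to :s3 or :db_file later
                 :max_size    => 5.megabytes,           # placeholder limit
                 :path_prefix => 'public/attachments'   # placeholder location
  validates_as_attachment
end
```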
Why wouldn’t you want to do this close to the model? Why would you want a controller action to be invoked via a rake task?
My implementation would be this:
class Attachment < ActiveRecord::Base
  # code that writes to some folder specified by a configuration value
end
When things are saved, you cache to disk automatically.
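The save-then-write idea can be sketched in plain Ruby, with no Rails required. This is a minimal stand-in: the CACHE_DIR constant plays the role of the configuration value, and write_to_disk is a hypothetical method standing in for what an after_save callback would do in the real model:

```ruby
require "fileutils"
require "tmpdir"

# Stand-in for a configuration value naming the cache folder.
CACHE_DIR = File.join(Dir.tmpdir, "attachment_cache")

class Attachment
  attr_reader :filename, :data

  def initialize(filename, data)
    @filename, @data = filename, data
  end

  def save
    # ... persist the record's metadata to the database here ...
    write_to_disk # in Rails this would run as an after_save callback
  end

  def write_to_disk
    FileUtils.mkdir_p(CACHE_DIR)
    File.write(File.join(CACHE_DIR, filename), data)
  end
end

a = Attachment.new("report.pdf", "fake bytes")
a.save
```

After save returns, the file sits on disk at CACHE_DIR/report.pdf, so every normal save keeps the disk cache current for free.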
When you need to force a rewrite of the files on your file system, use a Rake task:
desc "cache files"
task :write_attachments => :environment do
  attachments = Attachment.find(:all)
  attachments.each(&:save) # saving re-runs the callback that caches each file to disk
end
Personally, I would not store files in the database at all. It seems like a good idea at first, but files should be served by the file server. You can write code to replicate your filesystem across servers, use a shared folder or an NFS mount, or use S3 instead of any of that.
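If you go the S3 route, the aws-s3 gem from this era keeps the upload side to a few lines. A hedged sketch: the credentials and bucket name are placeholders, and in a Rails app they would live in a config file rather than inline:

```ruby
require 'aws/s3'

# Placeholder credentials -- never hard-code real keys.
AWS::S3::Base.establish_connection!(
  :access_key_id     => 'YOUR_KEY',
  :secret_access_key => 'YOUR_SECRET'
)

# Store the file's bytes under its filename in a placeholder bucket.
AWS::S3::S3Object.store('report.pdf',
                        File.read('report.pdf'),
                        'my-attachments-bucket')
```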
Just my .02, and feel free to argue, just hoping to help you find the best solution.