Paperclip temp files eating my server!

I am using the Rails console to move some files from one site to another. Inside a loop, I use the technique* advised in the Paperclip wiki: set the remote_url of the attachment to the correct URL, then call save on the model, which copies the file over and runs it through the various resize/transcode steps defined in my attachment. Unfortunately, this does not seem to clean up after itself, and I end up with a zillion files in the temp folder; yesterday the server ran out of free space and stopped working.

I added an ensure block to clean up after each iteration, but blindly deleting everything in /tmp is only acceptable when nobody else is using the server.

Asset.current.where('assets.id > 31061').find_each do |a|
  begin
    # raw_file on the remote site returns a string with the URL of the file on S3
    a.blob_remote_url = open("http://example.org/assets/#{a.id}/raw_file").read
    a.save
  rescue
    a.destroy
  ensure
    FileUtils.rm Dir.glob('/tmp/*.jpg')
    FileUtils.rm Dir.glob('/tmp/*.JPG')
    FileUtils.rm Dir.glob('/tmp/*.mov')
    FileUtils.rm Dir.glob('/tmp/*.MOV')
    FileUtils.rm Dir.glob('/tmp/*.mp4')
    FileUtils.rm Dir.glob('/tmp/open-uri*')
  end
end
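
The friendliest variation I could come up with (untested, and the glob patterns are just what I happen to see on my server) is to snapshot the temp directory before each save and delete only the files that appeared during that iteration, so other processes' temp files are left alone. Roughly:

require 'fileutils'
require 'open-uri'

TEMP_GLOB = '/tmp/{*.jpg,*.JPG,*.mov,*.MOV,*.mp4,open-uri*}'

Asset.current.where('assets.id > 31061').find_each do |a|
  before = Dir.glob(TEMP_GLOB)   # temp files that existed before this record
  begin
    a.blob_remote_url = open("http://example.org/assets/#{a.id}/raw_file").read
    a.save
  rescue
    a.destroy
  ensure
    # remove only the temp files created while processing this record
    FileUtils.rm_f(Dir.glob(TEMP_GLOB) - before)
  end
end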

Inside the Paperclip gem, the URI adapter uses open-uri's open call to get a handle on the remote file, and since it's used inside a block, I would expect that handle to be closed when the request finishes and then garbage collected. I can't see where the temp file is being created, or why it lingers.
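
As far as I understand open-uri (my own reading, not taken from the Paperclip source), small responses are buffered in a StringIO, but anything larger is spilled to a Tempfile named /tmp/open-uri..., and that file only disappears when it is explicitly unlinked or when its Tempfile object is finalized by GC:

require 'open-uri'

open('http://example.org/assets/1/raw_file') do |io|
  puts io.class                          # StringIO for small bodies, Tempfile for large ones
  puts io.path if io.respond_to?(:path)  # e.g. /tmp/open-uri20170101-1234-abc123
end
# open closes the handle after the block, but closing a Tempfile does not
# delete it; the file is removed only by an explicit unlink or by the
# finalizer when the object is garbage collected, which would explain why
# /tmp/open-uri* files pile up during a long-running loop.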

Is the issue because I am inside an irb loop? Does that have some separate context that prevents garbage collection?
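
One experiment I am planning, to test the garbage-collection theory (just a guess, not something I have verified), is to force a collection every so often inside the loop and watch whether the /tmp/open-uri* files go away:

i = 0
Asset.current.where('assets.id > 31061').find_each do |a|
  a.blob_remote_url = open("http://example.org/assets/#{a.id}/raw_file").read
  a.save
  i += 1
  # Tempfile registers a finalizer that unlinks the file, so if GC is the
  # bottleneck, a periodic sweep should clear the leftovers
  GC.start if (i % 50).zero?
end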

Thanks,

Walter

* "Attachment downloaded from a URL", thoughtbot/paperclip wiki on GitHub