Hi, I'm trying to run a set of migrations that bring data over from a
legacy project. The legacy project is in production and has lots of
data, so while the migrations run they keep allocating more and more
memory until they grab it all. I'm paginating to bring the data over
step by step. Besides that, what is very weird is that the memory
never goes down between migrations. For example, I do:
class MoveLegacySugarEntries < ActiveRecord::Migration
  # Lightweight model stubs so the migration doesn't depend on app models
  class Entry < ActiveRecord::Base; end
  class Entry::SugarEntry < Entry; end

  def self.up
    Legacy::SugarReading.each do |le|
      u = User.find_by_login(le.user.login)
      unless u.nil?
        TzTime.zone = u.tz
        Entry::SugarEntry.create!(
          :registered_at => DateTime.parse("#{le.date} #{le.time.hour}:#{le.time.min}"),
          :comment => le.comment,
          :value => le.reading,
          :user => u)
      end
    end
  end
end
where the ActiveRecord each is implemented like this:
class << ActiveRecord::Base
  def each(limit = 1000)
    # Keyset pagination: order by id so rows.last.id is really the
    # high-water mark for the next batch.
    rows = find(:all, :conditions => ["id > ?", 0],
                :order => "id", :limit => limit)
    while rows.any?
      rows.each { |record| yield record }
      rows = find(:all, :conditions => ["id > ?", rows.last.id],
                  :order => "id", :limit => limit)
    end
    self
  end
end
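
One variant I've been considering (just a sketch; it assumes the
adapter's clear_query_cache, which Rails' AbstractAdapter does
provide) keeps only the last id between iterations, drops the batch
reference, and nudges the GC:

class << ActiveRecord::Base
  def each(limit = 1000)
    last_id = 0
    loop do
      rows = find(:all, :conditions => ["id > ?", last_id],
                  :order => "id", :limit => limit)
      break if rows.empty?
      rows.each { |record| yield record }
      last_id = rows.last.id
      rows = nil                    # drop our reference to the batch
      connection.clear_query_cache  # discard cached result sets
      GC.start                      # encourage Ruby to reclaim the batch
    end
    self
  end
end

That way nothing from a finished batch stays reachable from the loop
itself, so if memory still grows, the references must be held
somewhere else.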
So it's fetching 1000 rows at a time, which should be enough to keep
from exhausting memory.
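
To see exactly where it grows, I've been logging resident memory
between batches (a quick Linux-only sketch reading /proc/self/status;
log_rss is just a helper name I made up):

# Hypothetical helper: print this process's resident set size (Linux only).
def log_rss(label)
  rss_kb = File.read("/proc/self/status")[/VmRSS:\s+(\d+) kB/, 1]
  puts "#{label}: #{rss_kb} kB resident"
end

log_rss("after batch")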
Which variable might not be going out of scope?
Anyway, how do you think it's possible for garbage from previous
migrations to stay in memory?
For example, if I run the whole set from 0, by the time I get to this
migration I already have 1.5 GB of memory occupied. Isn't that kind
of weird?
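
To test the "leftovers from previous migrations" theory, I was
thinking of counting the ActiveRecord instances that survive a full
GC pass (ObjectSpace is standard Ruby):

# Count AR instances still reachable after a GC pass; a big number here
# would mean records from earlier migrations are still referenced somewhere.
GC.start
live = 0
ObjectSpace.each_object(ActiveRecord::Base) { live += 1 }
puts "live ActiveRecord objects: #{live}"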