Commit every Nth INSERT in migration

Using AR create() for every row is just horribly slow. You'll get a
significant improvement by batching like Trevor showed (not sure why
it didn't work the way you were doing it).

But for batch loads, you don't need all the extra stuff AR is doing. I
would suggest building up your own SQL, which isn't hard to do. You
can still use the AR connection.

Try something along these lines:

  BATCH_SIZE = 250
  cols = "postal_code,country_id,city,latitude,longitude,state_id"
  conn = ZipCode.connection
  FasterCSV.read('input.csv').each_slice(BATCH_SIZE) do |batch|
    # one quoted "(v1,v2,...)" tuple per CSV row
    values = batch.map do |row|
      "(#{row.map {|col| conn.quote(col)}.join(',')})"
    end
    sql = "insert into zip_codes (#{cols}) values #{values.join(',')}"
    conn.execute sql
  end

I'm just building up a properly quoted insert statement (multiple rows
inserted in one statement, which is MySQL-specific, but hey...).
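If you want to see what the generated SQL looks like without a database
handy, here's a self-contained sketch of the same string-building, with
a hypothetical quote helper standing in for conn.quote (ActiveRecord's
real quoting also handles nil, numerics, dates, and so on):

```ruby
# Hypothetical stand-in for conn.quote: single-quotes a value and
# doubles any embedded single quotes. ActiveRecord does more than this.
def quote(val)
  "'#{val.to_s.gsub("'", "''")}'"
end

rows = [
  ["10001", "1", "New York"],
  ["94105", "1", "San Francisco"]
]

# Same idea as above: one quoted (v1,v2,...) tuple per row, all tuples
# joined into a single multi-row INSERT statement.
values = rows.map { |row| "(#{row.map { |col| quote(col) }.join(',')})" }
sql = "insert into zip_codes (postal_code,country_id,city) " \
      "values #{values.join(',')}"
```

Printing sql shows the single statement that gets sent per batch, which
is where the speedup comes from: one round trip instead of 250.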

My own testing shows this to be about 12-15x faster than using AR
create() in batches of 250 rows, and 50-60x faster than AR create()
with commits after each row.

So AR sucks and Ruby sucks or whatever. But something like this will
get your data loaded quickly. If you need to load truly huge amounts
of data, you're probably better off using database-specific bulk
loading tools.
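On MySQL, for example, the bulk loader can read the CSV directly. A
hedged sketch, reusing the table and column list from the example above
(LOCAL assumes the file lives on the client machine, and the column
order must match the file):

```sql
-- Hypothetical MySQL bulk load of input.csv into zip_codes.
LOAD DATA LOCAL INFILE 'input.csv'
INTO TABLE zip_codes
FIELDS TERMINATED BY ','
(postal_code, country_id, city, latitude, longitude, state_id);
```

This skips SQL parsing per row entirely, so it's typically faster still
than the multi-row INSERT approach.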