I need to load a large data set (the CSV file is about 70MB) into
some lookup tables. Using the standard migration API would be tedious. I
know there's a way to export data into a YAML file and populate the
tables from it, but that still feels heavyweight.
Is there a way to load data from a separate SQL file inside a migration?
# Get all that research data loaded!
def self.up
  say_with_time("Create Tables and Indexes...") { create_tables_and_indexes }
  say "Reset all models"
  reset_all_models
  say_with_time("Import all data...") { import_data }
end
The import_data method used FasterCSV to load a 1.6MB CSV file into three tables through their ActiveRecord models. The thing to remember is that you can do *anything* in your migrations. If the data is already in an SQL file, use the Revolution Health approach; if it's in CSV, use FasterCSV (sketches of both below). The migration API itself only deals with schema changes.
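For the SQL-file route, one minimal way is to read the dump inside the migration and feed each statement to execute. This is a sketch, not the Revolution Health helper itself; the db/data/lookup_tables.sql path is an assumption, and the naive split only works for simple semicolon-terminated statements (no semicolons inside string literals):

def self.up
  # Read the raw dump and run each statement through the migration's
  # database connection. Path and statement format are assumptions.
  File.read("db/data/lookup_tables.sql").split(/;\s*$/).each do |statement|
    execute(statement) unless statement.strip.empty?
  end
end

For a 70MB dump you may prefer to pipe the file straight into your database's command-line client instead, but the version above keeps everything inside the migration.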
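And for the CSV route, here is a sketch of what an import_data method can look like with FasterCSV. The file path, the LookupValue model, and the column names are illustrative assumptions, not the author's actual code:

require 'fastercsv'

def self.import_data
  # Stream the CSV row by row; :headers => true lets us address
  # columns by name instead of by index.
  FasterCSV.foreach("db/data/lookup_values.csv", :headers => true) do |row|
    LookupValue.create!(:code => row["code"], :label => row["label"])
  end
end

Creating one ActiveRecord object per row is the simplest approach; for very large files, batching inserts or dropping down to raw SQL will be noticeably faster.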