Data migrations using YAML: what to do about large text fields

Hello,

I have data in a FileMaker database that I migrate into my evolving Rails app via YAML-driven migrations.

This works great because I can change my application schema easily and then output a matching YAML file to migrate the data.

The problem I am having now is that I want to add text fields to the application, but the large chunks of text make my YAML file fail during migration.

I would like to keep my data in a simple text format and then migrate it from that format into my Rails application.

I have explored using XML, but it adds too much complexity.

I have thought of using some combination of YAML for the simple fields and XML for the few fields that contain large text blobs.

I suspect that others have come across the same problem and may have some useful suggestions.

Thanks in advance :)

- Pete


Escape the text blobs in the YAML. You could turn them into Base64-encoded strings, or write each text field out to a real text file (with a unique filename) and reference that file in the YAML. Not as elegant, but both should work, and if this is mostly a one-time thing, not too hackish :)
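A minimal sketch of the Base64 idea, assuming each exported record is a plain hash of strings (the `record` data here is made up for illustration). Encoding the blob turns it into a single safe ASCII string that YAML can dump and load without any quoting or whitespace surprises; the migration just decodes it before inserting the row:

```ruby
require "yaml"
require "base64"

# Hypothetical record pulled from the FileMaker export.
record = {
  "title" => "Chapter 1",
  "body"  => "A very large chunk of text...\nwith newlines, \"quotes\", etc."
}

# Escape the blob as Base64 so it survives as a plain one-line YAML string.
record["body"] = Base64.strict_encode64(record["body"])

yaml = YAML.dump([record])   # safe to write to the migration's data file

# During the migration, decode the blob back before creating the row.
loaded = YAML.load(yaml).first
loaded["body"] = Base64.strict_decode64(loaded["body"])
```

The file-per-blob variant works the same way, except the YAML value would be a filename and the migration would call `File.read` on it instead of `Base64.strict_decode64`.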