I'm not Ezra, but I've had experience with the differences between
setting that to sql and setting it to ruby. The main difference is
that when it's ruby, Rails uses the rake task db:test:clone, which uses
schema.rb, and when it's sql, Rails uses db:test:clone_structure, which
uses development_structure.sql. This comes out of looking through
databases.rake within Rails, by the way; it's pretty easy to follow.
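For reference, the setting itself lives in your Rails config; a minimal sketch of switching it (shown here in the old environment.rb initializer style, adjust for your Rails version):

```ruby
# config/environment.rb (or an environment-specific config file)
Rails::Initializer.run do |config|
  # :ruby (the default) -> dumps db/schema.rb via SchemaDumper,
  #                        tests cloned with rake db:test:clone
  # :sql               -> dumps db/development_structure.sql,
  #                        tests cloned with rake db:test:clone_structure
  config.active_record.schema_format = :sql
end
```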
Now, what that actually means is that when you're using ruby, you're
constrained to whatever Rails' SchemaDumper supports - meaning
whatever DHH and core believe is the "rails way". So things like
unusual primary keys, MySQL enums or other db-specific types, and stored
procedures are out - SchemaDumper will either silently ignore them or
may error when it sees them.
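As an illustration of the mismatch (the table and column names here are made up, and the schema.rb snippet is a sketch of the old-style API, not output from a real dump):

```ruby
# The MySQL side might declare an enum:
#
#   CREATE TABLE tickets (
#     id INT AUTO_INCREMENT PRIMARY KEY,
#     status ENUM('open', 'closed') NOT NULL
#   );
#
# schema.rb has no enum type, so the best SchemaDumper can do is fall
# back to a portable type like :string - the ENUM constraint is lost:
ActiveRecord::Schema.define do
  create_table "tickets" do |t|
    t.column "status", :string, :null => false
  end
end
```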
When you're using sql, you're basically doing a straight sql load
from your dev db to test - meaning you can use a lot more crazy
database stuff. For example, in one of my projects we have a legacy
db with auto-increment primary keys named things like "foo_table_id".
When SchemaDumper tried to convert these, it didn't recognize them,
because it expects primary keys to be named "id" and only "id". So the
fields would be created, but not set as auto-increment primary
keys - and we got no warning of this. Luckily, our test suite
turned this up before we had any real damage, and switching schema
format to sql fixed things.
As far as how this affects production: regardless of which format you
use, Rails uses ActiveRecord::Migrator to run the migrations - but
_then_, after a migration completes, Rails will dump the schema if ruby
is used, and do nothing if sql is used. I'm guessing that's because
the sql structure will be updated anyway if you're using sql and
run a test, but I'm not sure.
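In practice, with the sql format you can refresh the SQL snapshot yourself after migrating; a sketch of the workflow (task names come from the same databases.rake file):

```shell
# Run pending migrations, then re-dump the SQL structure by hand.
# With schema_format = :ruby the first command would regenerate
# db/schema.rb automatically; with :sql it does not.
rake db:migrate
rake db:structure:dump   # writes db/development_structure.sql
```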
Hope that helps. Moral of the story: you probably want to use sql
if you have a legacy db or db-specific features, and if you have a
greenfield app that follows the rails way, stick with the ruby schema format.