question about activerecord test_numeric_fields in base_test.rb

There's a test for activerecord called test_numeric_fields in
base_test.rb. here's the test

  def test_numeric_fields
    m = NumericData.new(
      :bank_balance => 1586.43,
      :big_bank_balance => BigDecimal("1000234000567.95"),
      :world_population => 6000000000,
      :my_house_population => 3
    )
    assert m.save

    m1 = NumericData.find(m.id)
    assert_not_nil m1

    # As with migration_test.rb, we should make world_population >= 2**62
    # to cover 64-bit platforms and test it is a Bignum, but the main
    # thing is that it's an Integer.
    assert_kind_of Integer, m1.world_population
    assert_equal 6000000000, m1.world_population

    assert_kind_of Fixnum, m1.my_house_population
    assert_equal 3, m1.my_house_population

    assert_kind_of BigDecimal, m1.bank_balance
    assert_equal BigDecimal("1586.43"), m1.bank_balance

    assert_kind_of BigDecimal, m1.big_bank_balance
    assert_equal BigDecimal("1000234000567.95"), m1.big_bank_balance
  end

The numeric_data table is defined like this:

CREATE TABLE `numeric_data` (
  `id` INTEGER NOT NULL auto_increment PRIMARY KEY,
  `bank_balance` decimal(10,2),
  `big_bank_balance` decimal(15,2),
  `world_population` decimal(10),
  `my_house_population` decimal(2),
  `decimal_number_with_default` decimal(3,2) DEFAULT 2.78
) TYPE=InnoDB;
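Given that schema, the mapping the test expects can be sketched as follows. This is a hedged illustration of the convention, not ActiveRecord's actual code; `cast_decimal` is a hypothetical helper. The declared scale of the column decides the Ruby type: scale zero (or none) becomes an Integer, anything else a BigDecimal.

```ruby
require "bigdecimal"

# Hypothetical helper sketching the scale-based mapping under discussion.
def cast_decimal(raw, scale)
  if scale.nil? || scale.zero?
    Integer(raw)          # decimal(10)   -> Integer (Fixnum/Bignum)
  else
    BigDecimal(raw.to_s)  # decimal(10,2) -> BigDecimal
  end
end

cast_decimal("6000000000", 0)  # world_population -> Integer
cast_decimal("1586.43", 2)     # bank_balance     -> BigDecimal
```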

My question is with assert_kind_of Integer, m1.world_population. Why
would world_population be an Integer? It seems like this test makes an
assumption about how a particular database stores the actual data type
internally. To me, a field defined as decimal(10), or let's say
numeric(10), really should be of kind BigDecimal, since that's what the
database field is declared as; it's just that the BigDecimal object
should have no numbers after the decimal point. Am I just way off here?
This assumption is a problem for a database like Firebird, where the
same declaration can mean different storage depending on the dialect.
For example, decimal(15,2) in dialect 1 Firebird is really a double
precision underneath, while in dialect 3 it's really stored as a
bigint. Regardless of how the database stores it underneath, the
Firebird driver (correctly, in my opinion) translates this to a
BigDecimal column. The test above then fails because it's looking for
the wrong types. Anyone have any thoughts on this?

Numerics with no decimal digits are typically used for big integers, so
mapping them to Ruby big integers is natural and expected. I think
it's a good convention.
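To make the disagreement concrete, here is a small illustration: the two candidate return types are numerically equal, but only one of them satisfies the test's assert_kind_of.

```ruby
require "bigdecimal"

pop_as_integer = 6_000_000_000               # the convention the test expects
pop_as_decimal = BigDecimal("6000000000")    # the Firebird driver's result

pop_as_decimal == pop_as_integer   # true  -- numerically equal
pop_as_integer.is_a?(Integer)      # true  -- passes assert_kind_of Integer
pop_as_decimal.is_a?(Integer)      # false -- would fail the assertion
```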


So should the FireRuby driver be modified to return the base type that
the database is actually using to store a field? That isn't always
going to be the same for every database. For instance, a decimal(2)
might be stored as something small in MySQL, so a test expects a
Fixnum, but in Firebird this field is really stored as an integer, so
even if I returned the base type it still might not match what the test
is expecting. Do you see what I mean? I don't think you're really
supposed to be seeing the underlying types from a database: if a field
is declared as NUMERIC or DECIMAL, then the result should be a
BigDecimal; it just so happens that the BigDecimal may or may not have
a fractional part.
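A sketch of the convention being proposed here (the `cast_always_decimal` helper is hypothetical, not any driver's real API): every NUMERIC/DECIMAL column comes back as a BigDecimal, whether or not it has a fractional part.

```ruby
require "bigdecimal"

# Hypothetical helper illustrating the always-BigDecimal convention.
def cast_always_decimal(raw)
  BigDecimal(raw.to_s)
end

house = cast_always_decimal("3")  # my_house_population, decimal(2)
house.frac.zero?      # true  -- no fractional part...
house.is_a?(Integer)  # false -- ...but assert_kind_of Fixnum still fails
```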


I'm not sure I'm parsing you correctly, but I think you're saying that
since you declared numeric type, you always want the Ruby type
representing it to have decimal points even if the numeric has no
decimal points and is not intended to be input, displayed, or
manipulated as a number with decimal points.

Why would you ever want that? It's creating a ton of extra work with
no real benefit other than appearing definitionally consistent.
Numeric columns with no decimal points are used to represent integers,
so we use integers in Ruby; I'm surprised that this is surprising.
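The extra work shows up even in trivial uses. A small illustration, assuming a modern Ruby bigdecimal where the default string form is scientific notation:

```ruby
require "bigdecimal"

as_integer = 3                # decimal(2) mapped to Integer
as_decimal = BigDecimal("3")  # the same value under the always-BigDecimal convention

as_integer.to_s       # "3"     -- ready for display
as_decimal.to_s       # "0.3e1" -- default BigDecimal notation
as_decimal.to_i.to_s  # "3"     -- an explicit conversion is needed
```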


I guess my question is more about what the Firebird driver should be
doing. Since the internal representation of certain decimal columns in
a Firebird database doesn't match what certain tests in the Rails test
suite expect, should I be modifying the Firebird driver to return
artificial types just so those tests pass?