Key/Value Database Design

What are the Rails community's thoughts on key/value tables in database design? Specifically, two separate tables: keys, and values. It seems as though this technique is not very elegantly supported in Rails. You can get functionality through the use of has_many :through and some auxiliary key -> value mapping methods by iterating through both lists and joining the results into a hash, but it still seems... ugly.
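
To make it concrete, here's roughly the kind of thing I mean by auxiliary mapping methods -- a rough sketch only, with made-up model and column names:

    class Record < ActiveRecord::Base
      has_many :field_values
      has_many :keys, :through => :field_values

      # Walk the join rows and fold them into a plain Ruby hash.
      def fields
        field_values.inject({}) { |h, fv| h[fv.key.name] = fv.value; h }
      end
    end

    class Key < ActiveRecord::Base
      has_many :field_values    # keys table: id, name
    end

    class FieldValue < ActiveRecord::Base
      belongs_to :record        # field_values table: record_id, key_id, value
      belongs_to :key
    end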

Ideally, I'd love to NOT have to resort to this option, but my work calls for it as we deal with a lot of data whose origins we do not know, and creating new fields in the database for each key is not practical.

Thoughts?

I just pasted this code for someone else a few days ago, but I'll paste it again here. This lets you treat a model like a hash of key => value pairs. I'm not sure if it's exactly what you're asking for, but maybe it is.

    create_table "db_hashes", :force => true do |t|
      t.column "key",   :string, :limit => 40, :default => "", :null => false
      t.column "value", :string, :default => ""
    end

class DbHash < ActiveRecord::Base
  class << self
    # Look up a value by key; returns nil if the key isn't there.
    def [](key)
      pair = find_by_key(key.to_s)
      pair.value unless pair.nil?
    end

    # Create or update the row for this key.
    def []=(key, value)
      pair = find_by_key(key.to_s)
      unless pair
        pair = new
        pair.key, pair.value = key.to_s, value
        pair.save
      else
        pair.value = value
        pair.save
      end
      value
    end

    # Dump the whole table as a Ruby hash.
    def to_hash
      Hash[ *find(:all).map { |pair| [pair.key, pair.value] }.flatten ]
    end
  end
end

And then you can use it like a hash. The accessors are class methods, so you call them on DbHash itself:

DbHash[:foo] = 'bar'

DbHash[:foo]   # => "bar"
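
And since to_hash is defined as well, you can pull the whole table back out as a Ruby hash in one go (a small usage sketch based on the code above, with illustrative values):

DbHash[:foo]    = 'bar'
DbHash[:colour] = 'blue'
DbHash.to_hash   # => { "foo" => "bar", "colour" => "blue" }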

-Ezra

> What are the Rails community's thoughts on key/value tables in database
> design? Specifically, two separate tables: keys, and values. It seems
> as though this technique is not very elegantly supported in Rails. You
> can get functionality through the use of has_many :through and some
> auxiliary key -> value mapping methods by iterating through both lists
> and joining the results into a hash, but it still seems... ugly.
>
> Ideally, I'd love to NOT have to resort to this option, but my work
> calls for it as we deal with a lot of data whose origins we do not
> know, and creating new fields in the database for each key is not
> practical.

You could do something like 3 columns: ResourceID/URI, KeyName/URI, Value. Then you can add as many key types as you want. I'd use BerkeleyDB for this rather than SQL, since it's essentially a giant scalable hash table. Redland or ARDF will let you query the store with SPARQL/RASQAL/etc.
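
In plain ActiveRecord terms, that three-column layout might look something like this (table and model names are just placeholders):

    create_table "statements", :force => true do |t|
      t.column "resource_uri", :string, :null => false   # which thing the value belongs to
      t.column "key_name",     :string, :null => false   # arbitrary key, added on the fly
      t.column "value",        :string
    end

    class Statement < ActiveRecord::Base
      # All values for one resource, collapsed into a hash.
      def self.hash_for(resource_uri)
        find(:all, :conditions => ["resource_uri = ?", resource_uri]).
          inject({}) { |h, s| h[s.key_name] = s.value; h }
      end
    end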

In AR, you could do something like Key.find_by_name("keyname").values, I think... but it would be sort of hacked together with auxiliary integer keys unless you rewrote some of the joins.
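
Something along these lines is what I mean -- a keys table joined to a values table through an integer foreign key (again, just a rough sketch with made-up names):

    class Key < ActiveRecord::Base
      has_many :values, :class_name => "KeyValue"   # keys table: id, name
    end

    class KeyValue < ActiveRecord::Base
      belongs_to :key                               # key_values table: id, key_id, value
    end

    # The lookup from above, pulling out the raw strings:
    Key.find_by_name("keyname").values.map { |v| v.value }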

Google: Results 1 - 1 of about 2 for acts_as_hash. (0.22 seconds)