Instead of different databases per customer, why not use different
tables in the same database per customer?
No need to mess around with connections; just do something like
Model.table_name = 'table_' + session[:customer_id].to_s before doing
anything. (The .to_s matters if the id in the session is an integer,
since Ruby won't concatenate a String and an Integer.)
I think it would work. I'm not sure where to put this so you don't have
to repeat it every time you need to use a model; maybe there's some
sort of hook that can be used?
Or would this be a bad idea?
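For what it's worth, here is a minimal plain-Ruby sketch of the idea (no ActiveRecord loaded; Update and set_tenant_tables are made-up names for illustration) - in a real app the call would probably live in a before_filter in ApplicationController:

```ruby
# Plain-Ruby sketch, no ActiveRecord required. Update stands in for a
# real model; in Rails you would set table_name on the actual model
# classes from a before_filter, once per request.
class Update
  class << self
    attr_accessor :table_name
  end
end

# Hypothetical helper: point each tenant-scoped model at its own table.
def set_tenant_tables(customer_id, *models)
  models.each do |model|
    model.table_name = "#{model.name.downcase}s_#{customer_id}"
  end
end

set_tenant_tables(42, Update)
Update.table_name  # => "updates_42"
```

A before_filter would run once per request, which answers the "where to put this" question - though every tenant-scoped model would need to be listed in the call.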
We are looking at potentially 50 accounts, each with 50 users, sending
a request every minute - that's 2,500 requests a minute. I am hesitant
about using stored procedures, as they don't seem to be recommended in
Rails: http://wiki.rubyonrails.org/rails/pages/StoredProcedures.
I am looking at connection pooling at the moment. We thought of using
a separate database for each customer, switching connections based on
the customer to connect to that customer's database. I have been trying
to cache the connection to each customer's DB; however, Rails does not
like it. If I get a connection to the current customer's DB and store
it in a hash:
@@connections[:account_id] = ActiveRecord::Base.connection
then later (i.e. several requests down the line), when I try to restore
it to avoid creating a new connection:
ActiveRecord::Base.connection = @@connections[:account_id]
I get the error "Lost connection to MySQL server". It seems that Rails
only keeps the last connection used by ActiveRecord::Base open: if I
assign connection "B" to ActiveRecord::Base and then attempt to
reassign the connection "A" I stored in @@connections, that connection
is lost. I am not sure what is going on here, but any ideas would be
great.
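Speculating a bit: rather than stashing raw connection objects (which the framework may verify or close behind your back), one option is to have a single place own one connection per account and hand it out lazily. A plain-Ruby sketch of that caching shape (ConnectionRegistry is a made-up name, and the string stands in for a real adapter object - in Rails itself the usual route is establish_connection on a per-tenant model subclass, letting ActiveRecord manage the handles):

```ruby
# Sketch: cache one connection handle per account id, creating it
# lazily on first use. The factory block stands in for whatever
# actually opens a connection to that customer's database.
class ConnectionRegistry
  def initialize(&factory)
    @factory = factory    # builds a connection for a given account id
    @connections = {}     # account_id => cached handle
  end

  def for_account(account_id)
    @connections[account_id] ||= @factory.call(account_id)
  end
end

registry = ConnectionRegistry.new { |id| "conn_to_customer_#{id}_db" }
first  = registry.for_account(7)
second = registry.for_account(7)
first.equal?(second)  # => true: the same cached handle, not a new one
```

Note this is only the caching shape; it does not solve the "Rails closes the old handle" problem, which is why keying connections by model class via establish_connection tends to be the safer path.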
Piet: Thanks, that is something we are looking at, as you can indeed
set the table name for a model. It may become slightly messy with
different tables scattered throughout the database, although it is
certainly a viable option.
Does anyone have any experience with this type of design in Rails,
i.e. a database-per-customer architecture?
How large are the records? Can your RDBMS engine of choice handle the
load and further growth on a single server, or do you need to split it
across multiple boxes anyway?
How will you be using the data? Would partitioning help if your db supports it?
Do you expect load to be constant or have spikes?
I'd make a new post on a support forum for the particular DB you're
using to figure out the ideal way of doing it, and only then start
looking for Rails-specific solutions.
The RDBMS is MySQL 5. The records themselves are relatively small:
about 6 fields, each containing a 20-character string. The data will
primarily be used in a 'live' view - it is tracking data for vehicles,
where only the records from the last day are of interest - and for
reporting, where records from up to 6 months back will be searched and
accessed. Partitioning may help: we could split the main Updates table
into tables per tenant, or on a date basis. However, I am reluctant to
use this approach because it is not particularly suited to the way
Rails deals with models and ActiveRecord: partitioning the data into
tables per tenant or on a historical basis would mean changing table
names in ActiveRecord. The other thought is long-term scalability:
using a database per customer would mean the system was self-scaling,
whereas even with partitioning, a single database could grow to the
point where it needed to be spread across multiple servers. Load will
be constant during daylight hours: vehicles send updates of their
locations every minute, and customers have users logged in to view
this data in realtime via a mapping interface. So load on both inserts
and selects on the Updates table will be pretty high during working
hours, and nonexistent outside them.
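To illustrate the objection about partitioning: with per-tenant (or per-month) tables, the Updates model's table name would have to be recomputed on every request, along the lines of this made-up naming scheme:

```ruby
require 'date'

# Hypothetical naming scheme for a per-tenant, per-month partitioning
# of the Updates table. Something would have to call this (and assign
# the result to the model's table name) on every request.
def updates_table_for(customer_id, date = Date.today)
  "updates_#{customer_id}_#{date.strftime('%Y_%m')}"
end

updates_table_for(3, Date.new(2008, 5, 1))  # => "updates_3_2008_05"
```

Reporting across 6 months would then mean querying several such tables per tenant, which is part of what makes this awkward with ActiveRecord.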
I think this is a Rails-specific problem: although it is a multi-tenant
architecture problem, Rails adds its own unique twists, particularly
because of its 'single database' approach.
Thanks in advance for any response.