I was prototyping a GUI using a SQLite-backed Rails app (on Windows). I wrote
a separate Ruby script to populate an events table with filler data.
As part of some discussion I got it into my head to just fill up the table
and see what kind of performance issues, metrics, or database size issues
I would be exposed to. The script was executed from the command line while
the Rails server was running (i.e. two different processes accessing the
same database file).
For a start, the script made Rails unresponsive (I just got stack traces
whenever it tried to hit the DB), and eventually the script itself bombed
out with:

"SQLite::Exceptions::SQLException: unable to open database file"
Admittedly this script was a quick and dirty job with a 100.times do..end
loop around it, but I expected SQLite to handle contention better than that.
Multiple processes accessing the database is going to be a core aspect
of a new project we are doing, and I was wondering if this is beyond
SQLite's capability, or if it can be configured to tolerate more
concurrent access.
I am curious whether this is a problem with non-client-server databases in
general. It strikes me that to control this kind of contention you either
need some code sitting between your calling application and your database
engine, or the DB adapter code in each process has to be a fair bit smarter
than you would expect.