We are at the beginning stages of a website and are trying to decide
whether to use Rails or PHP for it, as it will have around 300+ users
online at the same time (mostly at night). I ask because I know Ruby
is a little slow compared to a language like PHP.
I would rather develop it in Rails as I'm more familiar with it and
can get things done much faster, but if it can't handle this many
users then we'd probably need to move towards PHP or something.
By "handle", I basically mean will pages load quickly given enough
server resources (this will be on a dedicated server, for the site
only)? I see some Rails sites and it takes a good 15-20 seconds
sometimes to fully load (based on Safari's progress bar) for them.
The site is mainly going to be pulling information from a database
that isn't constantly changing; we will add content to it a couple of
times per week, so the whole site isn't really "dynamic". I'm sure we
could use caching in parts to help speed it up?
This really is a dead issue. 300 users at the same time is not a measurable metric. The real thing you need to look at is the number of requests you can process in a given period. You need to determine your threshold: what is an acceptable level of requests per second?
First, find out how many requests per second your web server can handle… test against static pages. Then figure out what your acceptable level should be. Is it 5 requests per second? 50? 100?
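If you don't have a favorite load-testing tool, even a throwaway
script will give you a ballpark number. Something like this (the URL
and request count are placeholders, and it only measures one
sequential client, so a real tool that fires concurrent requests will
tell you more):

  require 'net/http'
  require 'uri'
  require 'benchmark'

  # Point this at a static page on your own server.
  url      = URI.parse('http://localhost:3000/index.html')
  requests = 200

  elapsed = Benchmark.realtime do
    requests.times { Net::HTTP.get_response(url) }   # one full GET each time
  end

  puts "#{requests} requests in #{'%.2f' % elapsed}s " \
       "(#{'%.1f' % (requests / elapsed)} req/sec)"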
I would say that Rails is just as fast as PHP or Java for most applications. And you can really blow past the competition with a proper setup, proper use of page / action / fragment caching, and a tuned database.
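To make that concrete, here's roughly what page and action caching
look like in a controller (the controller, model, and action names are
made up for illustration):

  class ArticlesController < ApplicationController
    # Page caching writes the rendered HTML under public/, so the web
    # server can hand it out without ever touching Rails again.
    caches_page :index, :show

    # Action caching still runs the before filters (login checks etc.)
    # but serves the rendered body from the cache.
    caches_action :popular

    def index
      @articles = Article.find(:all, :order => 'created_at DESC')
    end

    def show
      @article = Article.find(params[:id])
    end

    def popular
      @articles = Article.find(:all, :order => 'hits DESC', :limit => 10)
    end
  end

Since your content only changes a couple of times a week, expiring
those caches (expire_page / expire_action) from whatever code adds the
new content would be enough.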
Don’t let the FUD fool you… learn how to make your applications scale, because they will.
Finally, where did the “300 users at the same time” number come from? It’s not realistic because the web is stateless. Once the request is done, the user’s not part of the equation anymore until they click something again.
Now, you want to slow your site down? Use lots of AJAX. More transactions will mean more load, which means you'll have to configure things differently.
I am more than happy to help with specific questions on how to load test / etc.
You can definitely use caching. On the apps I'm currently working on,
the content is very dynamic, so Rails' action and page caching won't be
very useful to me right now. However, I've been looking at memcached.
AWDWR made memcached seem like it was difficult to use and configure,
but it's really not. Configuring your app to use memcached for its
session store is especially easy. Ditto for caching models, which will
severely cut down your database CPU load and time waiting for queries.
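Caching a model through memcached is just the memcache-client gem plus
a few lines; here's a rough sketch (the server address, key name, and
10-minute expiry are arbitrary):

  require 'memcache'

  # One client for the whole app, e.g. set up in environment.rb.
  CACHE = MemCache.new('localhost:11211', :namespace => 'mysite')

  class Article < ActiveRecord::Base
    # Serve the (rarely changing) front-page articles from memcached
    # instead of hitting the database on every request.
    def self.front_page
      CACHE.get('front_page_articles') || begin
        articles = find(:all, :order => 'created_at DESC', :limit => 20)
        CACHE.set('front_page_articles', articles, 10 * 60)
        articles
      end
    end
  end

The session store is a similar couple of lines in environment.rb,
though the exact setting depends on your Rails version.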
I'm also getting into caching render-heavy sections of my index pages
(the current index page on the site I'm rewriting gets ~35,000 hits/day).
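For those render-heavy sections, fragment caching in the view is about
all it takes (the cache key and partial name here are invented):

  <%# Cache only the expensive chunk; the rest of the page stays dynamic. %>
  <% cache('index/top_articles') do %>
    <%= render :partial => 'article', :collection => @top_articles %>
  <% end %>

and then expire_fragment('index/top_articles') whenever the underlying
data changes.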
If you're on a dedicated server for a single site, though, I don't think
that 300+ users (I'd use maybe 10 FCGIs behind lighttpd) would be a
problem for Rails at all.
Yup, caching helps a lot. Another big one is making sure your database
queries are well written. Rails makes it easy to not think about the
database, but many times it's much better to use find_by_sql() instead
of relying on all the finder magic.
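For example (table and column names invented), the finder version vs.
a hand-written one:

  # Easy to write, but pulls back every column of every matching row
  # (plus whole associations if you tack on :include).
  @articles = Article.find(:all,
                           :conditions => ['published = ?', true],
                           :order      => 'created_at DESC')

  # Hand-written SQL fetches exactly what the page needs.
  @articles = Article.find_by_sql(
    ['SELECT id, title, created_at FROM articles
       WHERE published = ? ORDER BY created_at DESC LIMIT 20', true])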
If it's any consolation - we have a pack of 10 mongrels serving around
415 req/sec all day long. We respond with pure XML (no rhtml
rendering), so it's not comparable to a true "website", but you get the
idea - Rails can do it if you know how to tweak.
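(For anyone curious, a cluster like that is driven by a small
mongrel_cluster.yml; the paths and ports below are placeholders, not
our actual config:

  # config/mongrel_cluster.yml
  cwd: /var/www/mysite/current
  environment: production
  address: 127.0.0.1
  port: 8000          # servers: 10 means ports 8000-8009
  servers: 10
  pid_file: tmp/pids/mongrel.pid

started with "mongrel_rails cluster::start -C config/mongrel_cluster.yml".)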