Philip Hallstrom wrote:
Sorry, what I meant to say was that a single loading of their homepage
resulted in 13, 57, and 21 total requests, not that the site could only
support that many requests/sec.
-philip
Good point! Maybe I misunderstood the "experts" I've talked with. They
were talking about 1,500 req/s for a forum / chat platform. I was
told that they are also using a lot of AJAX and that the bottleneck is a
MySQL database. That's why they are now using a cluster. Probably that
figure included requests to static content like stylesheets, images, and
stuff as well.
That's a busy forum, but depending on how they use AJAX I could see it reaching that.
But I thought most of that is cached by the browser anyway? If the
clients are frequent users of that portal, wouldn't the browser avoid
loading that stuff and request only the dynamic content?!
Guess it depends... on our site most of the pages include their own CSS and specific images as the content is pretty different from section to section, but yeah, after a while browsers should cache that.
But my point was that lighttpd/nginx/apache/etc should be able to serve thousands of requests per second for static content. So that really shouldn't factor into things unless you're YouTube or Flickr.
However, I'm trying to avoid heavy database load with an intelligent
caching strategy. I hope that mongrel will support me in this.
I'm not sure how mongrel would help specifically with this. Rails page caching and fragment caching would though. Look into memcache as well and the various plugins that tie memcache into AR's find methods and Rails caching in general. Memcache is a life saver for us.
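For example, something like this (just a sketch using the memcache-client gem; the controller, model, and key names are made up):

require 'memcache'

CACHE = MemCache.new('localhost:11211')

class ForumController < ApplicationController
  # Page caching: Rails writes index.html into public/, so apache/lighttpd
  # serves later hits as plain static files without ever touching mongrel.
  caches_page :index

  def index
    @topics = Topic.find(:all, :order => 'updated_at DESC', :limit => 50)
  end

  def hot_topics
    # Keep the expensive query off MySQL: try memcached first, fall back
    # to the database and cache the result for five minutes.
    @topics = CACHE.get('hot_topics')
    if @topics.nil?
      @topics = Topic.find(:all, :order => 'replies_count DESC', :limit => 20)
      CACHE.set('hot_topics', @topics, 300)
    end
  end
end

Page caching only works for pages that look the same for everyone, so for per-user views you'd fall back to fragment caching or a memcached lookup like the one above.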
At work I'm maintaining the source code of a JSP-based web application
used by about 80,000 active customers (meaning customers that log in at
least once a month or so). I've had a look into Tomcat's log files and
found 40 dynamic requests logged within a single second sometime in the
afternoon on one of the two machines.
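Something like this throwaway Ruby snippet finds the busiest second in an access log (assuming the lines carry a [dd/Mon/yyyy:hh:mm:ss] timestamp; the file name is made up):

counts = Hash.new(0)
File.foreach('access.log') do |line|
  # Bucket requests by their one-second timestamp.
  counts[$1] += 1 if line =~ /\[(\d{2}\/\w{3}\/\d{4}:\d{2}:\d{2}:\d{2})/
end
second, hits = counts.max_by { |_, n| n }
puts "#{hits} requests at #{second}"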
So maybe 100-200 requests/sec would already be OK for a, let's say,
medium-sized web application. If I'm lucky and the site gets even more
traffic, I guess I should still be able to serve 1,000 req/s with more
server hardware, more mongrels, and a lot of caching so that the
database won't get overloaded?!
Does ANYONE know some more performance figures of Rails-based
websites? I would be interested in how many requests they can serve and
what type of hardware and software setup they are using. Too bad the
author of "Agile Web Development with Rails" stopped talking about these
figures after the first book.
Sometime last year our corp site did 8,996,175 pages and 63,571,374 requests in one day. That works out to um... about 100 pages/sec and 735 requests/sec.
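(Averaged over 86,400 seconds that's 8,996,175 / 86,400 ≈ 104 pages/sec and 63,571,374 / 86,400 ≈ 736 requests/sec; peak seconds would obviously be higher than the average.)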
And while I don't trust Alexa exactly, for comparison with some other Rails sites...
http://img312.imageshack.us/img312/3569/alexavd9.png
We did this with 20 servers running apache and 4 mongrels each, and three separate media servers (for video). None of the servers were overworked (load < 1), so I imagine we could have gotten by with a lot fewer, but the year before we were on PHP and got slammed, and our traffic triples about this time every year, so we didn't want to take any chances.
-philip