mongrel memory usage

Why is 37 MB surprising? The ruby interpreter takes up nearly 30 MB of that on its own (depending on your build). If you're looking to economize on memory utilization, consider serving multiple apps from a single mongrel cluster using the --prefix option.
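For instance (a rough sketch; the app names, paths and ports here are made up), each app gets its own mongrels mounted under a URL prefix:

    # Hypothetical paths/ports: mount each app under its own URL prefix
    mongrel_rails start -d -e production -p 8000 --prefix /store -c /var/www/store
    mongrel_rails start -d -e production -p 8010 --prefix /blog  -c /var/www/blog

The front-end proxy (Apache mod_proxy, nginx, etc.) then routes requests for /store and /blog to the matching ports.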

BTW: I'm squeezing 3 mongrel instances, MySQL, Postfix/Dovecot, and Apache with mod_proxy and PHP5 into 127 blocks, or some 129 MB. How many pages/second are you trying to serve, and out of how many apps?

Paul Johnson wrote:

Can you expand a bit on this? I didn't know it was possible to run multiple applications with a single cluster.

Thanks, Andre

You can read more about this option here: http://www.hackthat.com/.

It's also quite valuable to read up on Mongrel in general at: http://mongrel.rubyforge.org/faq.html.

My recommendation probably won't be popular, but I believe you need at least one Mongrel per Rails app; if I'm expecting any measure of concurrent requests, I'll start with two and scale up if need be. The key is how fast your app can turn a request around: Rails is one-in-one-out, so requests are processed serially unless you add more mongrels.
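As a rough illustration (the path and ports are placeholders), a two-mongrel setup for a single app via mongrel_cluster looks something like:

    # Generate config/mongrel_cluster.yml for two mongrels on ports 8000-8001
    mongrel_rails cluster::configure -e production -a 127.0.0.1 \
      -p 8000 -N 2 -c /var/www/myapp

    # Start/stop the pair
    mongrel_rails cluster::start
    mongrel_rails cluster::stop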

Andre Nathan wrote:

harper: This is possibly just virtual memory at work. Some of the memory the ruby process was using before may have been swapped out; now that more physical memory is available, more of it stays resident, so the reported usage goes up. The overall memory footprint probably didn't change. Just a thought.

Vish

You can. Just start mongrels on multiple ports with the --prefix option (the start commands are identical except for --port) and put a software load balancer (pen is good) in front of them. mongrel_cluster might make all of this simpler to manage, but I haven't tried that out.
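Something along these lines (ports and paths are made up for the example):

    # Two identical mongrels for one app, differing only in --port
    mongrel_rails start -d -p 8001 --prefix /store -c /var/www/store
    mongrel_rails start -d -p 8002 --prefix /store -c /var/www/store

    # pen listens on 8000 and balances across the two mongrels
    pen 8000 localhost:8001 localhost:8002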

Vish

Does anyone have an example configuration of this? Sharing a single cluster among various domains would be awesome.

Thanks, Andre