Nginx – Building a High Performance Static Website

Tags: cache, html, nginx, performance, varnish

I'm looking to build a high-performance website. It has thousands of static HTML pages that are served depending on a form submission. I have a Ruby script that generates these static HTML pages and stores them on the server.

Now I'm looking at 1000+ concurrent users on the site. What is the fastest way to serve these users? I believe Nginx + Varnish can do an extremely good job in this kind of scenario. Are there any further optimizations I can make?

Is there a way for Nginx + Varnish to serve the HTML pages from RAM instead of hitting the disk, perhaps using Memcached somehow?

I'm already considering moving the other static assets, like images and stylesheets, out to a CDN. Please advise on the best way to go about this.

Thanks!

[Reposted from Stack Overflow: https://stackoverflow.com/questions/6439484/building-a-high-performance-static-website]

Best Answer

While Varnish is absolutely fantastic with its flexible VCL, it's really more suitable for caching dynamic websites. There seems to be a general consensus that nginx outperforms Varnish, at least on small static objects.

With nginx you can use proxy_cache, fastcgi_cache, or simply serve the files from disk directly. I know it does support memcached, but the only benefit of memcached would be if you had multiple servers sharing the same cache; apart from that I can only see extra overhead.
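As a rough sketch of the serve-from-disk option (all paths, names, and values here are illustrative assumptions, not taken from the question):

```nginx
# Minimal server block for serving the pre-generated HTML pages directly.
# /var/www/pages is an assumed path -- point root at wherever the Ruby
# script writes its output.
server {
    listen      80;
    server_name example.com;

    root /var/www/pages;

    # Cache open file descriptors and file metadata so nginx doesn't
    # open() and stat() the same popular pages on every single request.
    open_file_cache          max=10000 inactive=60s;
    open_file_cache_valid    120s;
    open_file_cache_min_uses 1;

    location / {
        # Static pages can carry cache headers; shorten or purge when
        # the generator rewrites a page.
        expires   1h;
        try_files $uri $uri/index.html =404;
    }
}
```

With content this static, most of the win comes from `open_file_cache` plus the kernel's page cache doing the heavy lifting.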

You could either let your filesystem (and hopefully your RAID controller) cache the most-used data, or just stick it all on a ramdisk!
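One way to do the ramdisk variant is to mount a tmpfs, e.g. `mount -t tmpfs -o size=512m tmpfs /mnt/pages`, copy the generated pages into it after each build, and point nginx's root at it (the size and paths are assumptions for illustration):

```nginx
# Assumes the HTML pages have been copied (e.g. with rsync) into the
# tmpfs mounted at /mnt/pages. tmpfs lives entirely in RAM, so every
# read nginx does here is a memory read, never a disk seek.
server {
    listen 80;
    root   /mnt/pages;
}
```

The caveat is that tmpfs contents vanish on reboot, so the copy step needs to run at boot and after every regeneration.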

I am confident that a fairly budget Xeon server with a few GB of RAM will easily handle a few thousand requests per second, given that you're really only serving static content. I also think it should be possible to pre-compress all the static content so you don't add that extra overhead on every request.
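The pre-compression step can be as simple as gzipping each generated page once at build time; a sketch, with an assumed `./pages` output directory and a stand-in page created just for the example:

```shell
# Pre-compress every generated HTML page once, instead of paying the
# gzip cost on every request. Paths are illustrative -- adjust to
# wherever the generator writes its output.
SITE_ROOT=./pages
mkdir -p "$SITE_ROOT"
echo '<html><body>hello</body></html>' > "$SITE_ROOT/example.html"  # stand-in page

# -k keeps the original .html next to the .gz (nginx needs both);
# -9 spends extra CPU once, at build time, for the smallest output.
find "$SITE_ROOT" -name '*.html' -exec gzip -k -9 {} +

ls "$SITE_ROOT"
```

On the nginx side, `gzip_static on;` then makes nginx look for a matching `.gz` file next to each requested file and send it as-is (this needs nginx built with the ngx_http_gzip_static_module).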