Nginx – Load balancing and failover for streaming solutions

Tags: load-balancing, nginx, proxy

I have a content distribution system that mostly delivers VOD and audio. Recently, however, we have also moved into the live-streaming business, which has raised a few issues regarding load balancing.

So far we have delivered content using nginx as a round-robin load balancer, with the application server returning a 503 when unavailable (too much load, maintenance, etc.). A single server could handle this, so using the nginx proxy only added some extra 'internal' data transfer.
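For reference, a minimal sketch of that setup (the upstream names and ports are assumptions, not taken from the question): nginx round-robins across the application servers and moves on to the next one when a backend answers 503.

```nginx
# Hypothetical sketch of the current round-robin setup.
upstream app_servers {
    server app1.internal:8080;
    server app2.internal:8080;
}

server {
    listen 80;

    location / {
        proxy_pass http://app_servers;
        # Treat a 503 from one backend (overload/maintenance) as
        # "try the next server in the upstream group".
        proxy_next_upstream error timeout http_503;
    }
}
```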

However, we now need to move to multiple separate physical servers. Since we are going to need HLS (HTTP Live Streaming), which delivers video in chunks of roughly 10-second files, proxying the way we currently do will no longer work. We need to be able to serve around 1000-1500 connections at up to 1 Mb/s per connection. With a proxy, the proxy would have to fetch the files from the file server, which means huge extra overhead between servers. Even if the files were stored on the proxy itself, one server would not have enough bandwidth for 1000 connections, and possibly not the power to serve that many requests per second.
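A quick back-of-envelope check with the numbers from the question shows why a single proxy node is the bottleneck here:

```python
# All figures come from the question: 1000-1500 concurrent streams
# at up to 1 Mb/s (megabit) each, worst case.
connections = 1500
per_stream_mbps = 1.0

total_mbps = connections * per_stream_mbps
total_gbps = total_mbps / 1000.0

# 1.5 Gb/s of egress to the clients -- and potentially the same
# amount again of ingress, if the proxy fetches every chunk from
# the file servers instead of serving it from local storage.
print(total_gbps)
```

That already saturates a single gigabit interface on egress alone, before any proxy-to-origin traffic is counted.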

So what would be the normal way to solve such a problem?

Is it possible for a 'proxy' to redirect to another server, so the traffic no longer flows through the proxy, while still keeping a central node that monitors load and distributes requests to the end nodes? I would like to keep the domain name in the request, as it is used to choose which content to serve.
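One common pattern that fits this question, sketched below with hypothetical host names: the central node answers the small playlist requests itself, but sends the heavy segment requests to an edge server with an HTTP 302, so the bulk of the video traffic bypasses the central node entirely. HLS players follow redirects, and the original Host header the client sent is still available to the central node for content selection.

```nginx
# Hypothetical sketch: central balancer redirects segment traffic.
server {
    listen 80;
    server_name cdn.example.com;   # assumed name

    # Playlists are tiny; serve them via the central node so the
    # client keeps a single entry point.
    location ~ \.m3u8$ {
        proxy_pass http://origin_backend;
    }

    # Segments carry the bandwidth; redirect them to an edge node
    # (edge1.example.com is a placeholder -- in practice the target
    # would be chosen based on the load data the central node keeps).
    location ~ \.ts$ {
        return 302 http://edge1.example.com$request_uri;
    }
}
```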

Best Answer

The normal way to solve this problem would be to use a CDN, such as Level3 or Akamai. Both of these support HLS and can offload a significant proportion of the traffic from your origin.

The CDN acts as a proxy to your origin and you can push (upload) content to it or you can set it up to pull from your origin. In your case, you would probably want to set up your server to push content to the CDN as it is generated.
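In the pull model, the origin just needs to serve the HLS files with sensible cache lifetimes so the CDN does the heavy lifting. A minimal sketch (paths and host name are assumptions): playlists change every few seconds and must expire quickly, while finished segments are immutable and can be cached for a long time.

```nginx
# Hypothetical origin config for CDN pull.
server {
    listen 80;
    server_name origin.example.com;   # assumed name
    root /var/www/hls;                # assumed segment/playlist directory

    # Live playlists are rewritten as new chunks appear; keep the
    # CDN's copy fresh.
    location ~ \.m3u8$ {
        expires 1s;
    }

    # Segments never change once written; let the CDN cache them.
    location ~ \.ts$ {
        expires 1h;
    }
}
```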

-- ab1
