Does Node.js Increase Scalability?

node.js · scalability · server

I've been reading about the C10K Problem, and of particular note is the part that refers to asynchronous server I/O. http://www.kegel.com/c10k.html#aio

I believe this pretty much summarises what Node.js does on the server: it allows threads to keep processing user requests whilst relying on I/O interrupts (events) to notify them when jobs are completed, rather than having the thread be responsible for the whole job. The thread can get on with other things (non-blocking) and be notified when a job is done (e.g. a file is found or a video is compressed).

This in turn means that a thread is more 'available' to sockets, and therefore to users on the server.
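To illustrate what I mean, here is a minimal sketch of that non-blocking pattern (the file name is just a hypothetical example): the `readFile` call returns immediately, and the thread stays free for other sockets until the completion event fires.

```js
const fs = require('fs');

// The call returns straight away; the OS performs the read in the background.
fs.readFile('upload.mp4', (err, data) => {
  if (err) throw err;
  console.log(`read ${data.length} bytes`); // runs when the I/O event arrives
});

// Meanwhile the thread is free to serve other sockets.
console.log('readFile dispatched; handling other requests...');
```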

Then I found this: http://teddziuba.com/2011/10/straight-talk-on-event-loops.html

The writer here claims that although an event-driven framework may free up threads, it doesn't actually reduce the amount of work the CPU has to do! The rationale is that if, say, a user requests to compress a video they uploaded, the CPU still has to actually do that job, and will be blocked while it does it (for simplicity's sake, let's forget about parallelism here – unless you know better!).
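To make that concrete, here is a hypothetical sketch where a deliberately slow `fib()` stands in for the video-compression job: while the computation runs, Node's single event-loop thread is busy, so every other request to this server simply waits.

```js
const http = require('http');

// Deliberately slow, CPU-bound stand-in for "compress a video".
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

http.createServer((req, res) => {
  // While fib(40) computes, the event loop is occupied:
  // no other connection is serviced until it finishes.
  res.end(String(fib(40)));
}).listen(8000);
```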

I'm a straightforward coder, not a server admin or anything like that. I'm just interested to know: is Node.js a gift from the gods of 'cloud computing', or is it all hot air that won't actually save companies time and/or money by improving scalability?

Many thanks.

Best Answer

Of course any CPU-bound work is going to utilize the CPU. It's going to block the CPU in whatever language or framework you write it in.

Node.js is great when you have I/O-bound work, not CPU-bound work. I wouldn't do heavy lifting in Node, though it can be done. Node.js solves real problems, not fictional or imagined ones like Fibonacci number servers. It's not “hot air”.
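If you do need heavy lifting, one way to keep it off the event loop is to hand it to a worker. Here is a sketch assuming Node's built-in `worker_threads` module (available in modern Node; the `fib()` task is again a stand-in for any CPU-bound job such as video compression):

```js
const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');

// The same CPU-bound stand-in as before.
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
}

if (isMainThread) {
  // Main thread: hand the heavy job to a worker and stay responsive.
  const worker = new Worker(__filename, { workerData: 40 });
  worker.on('message', (result) => console.log(`fib(40) = ${result}`));
  console.log('worker started; event loop still free for I/O');
} else {
  // Worker thread: do the CPU-bound work off the event loop, then report back.
  parentPort.postMessage(fib(workerData));
}
```

The event loop stays free to multiplex thousands of I/O-bound connections, while the CPU-bound work runs elsewhere; a separate process via `child_process` achieves the same separation.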