Java – Is Saying ‘There Are No Threads in Node.js’ Correct?

java, node.js

Perhaps I am getting something wrong here, but I had a conversation today that left me perplexed.

I hear a lot about Node.js not having threads.
But that's not entirely true, right?

The precise phrasing is: you cannot spawn (or manage) threads in Node.js.
Threads are managed for you.
The underlying implementation in C that gives us the event queue and the rest has to have threads, right?

And if so, can someone please compare these threads to those in a Java environment in terms of their resource consumption on the server? I ask because I also hear a lot about how Node.js consumes far fewer resources than Java.

Best Answer

Comparing Node.js to Java is less like comparing apples to oranges and more like comparing a dictionary to a grammar textbook (thanks, scriptin).

Node.js is a quasi-framework (a runtime, really) built on Google's V8 JavaScript engine, which compiles JavaScript code into native machine code that can run server-side (instead of interpreting the JS code in real time).

Java is a language (not a framework), and each JDK implementation (e.g., Oracle JDK or OpenJDK) decides how to implement the underlying language constructs (threads, I/O, etc.).

Node.js does not have a threading model; that is, you cannot spawn a thread directly through any API Node.js exposes. This is because it uses JavaScript (i.e., ECMAScript), which does not support multi-threading (JavaScript was not originally meant for server-side development). That does not mean Node.js is not multi-threaded, though. On the contrary, Node.js uses a thread pool internally to handle much of its asynchronous I/O and the event system that you (the user of Node.js) hook into, just as JavaScript does on the client (a rough sketch of this arrangement follows below). So, as you stated:

The precise phrasing is: you cannot spawn (or manage) threads in Node.js.

Yes.
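To make that hidden thread pool concrete, here is a minimal sketch, written in Java since that is the comparison being asked about, of the general shape of what Node.js (via its libuv library) does internally: a single event-loop thread plus a small worker pool that performs the blocking work. The class name and the simulated read are hypothetical; this illustrates the model, not Node's actual implementation.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;

public class EventLoopSketch {
    public static void main(String[] args) throws InterruptedException {
        // Completed work is handed back to the "loop" as runnable events.
        BlockingQueue<Runnable> eventQueue = new LinkedBlockingQueue<>();

        // The hidden pool that performs blocking work off the loop thread
        // (libuv's default pool size happens to be 4, hence the 4 here).
        ExecutorService workerPool = Executors.newFixedThreadPool(4);

        // Simulate an async file read: the blocking part runs on the pool,
        // and the callback is queued back onto the event-loop thread.
        workerPool.submit(() -> {
            String data = slowBlockingRead();                // off-loop work
            eventQueue.add(() ->                             // the "callback"
                System.out.println("callback on loop thread: " + data));
        });

        System.out.println("loop thread keeps running while I/O is pending");

        // A one-iteration event loop: take the next event and run it.
        eventQueue.take().run();

        workerPool.shutdown();
    }

    private static String slowBlockingRead() {
        try { Thread.sleep(100); } catch (InterruptedException ignored) {}
        return "file contents";
    }
}
```

The key point is that the worker threads exist and do real work, but user code only ever sees the callback running on the loop thread.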

can someone please compare these threads to those in a Java environment in terms of their resource consumption on the server

In a Java environment, you can spawn threads in user code, but note that the underlying mechanism that spawns them (the JVM's native code) uses the same system APIs that are called to create and manage threads in Node.js. In other words, both will call pthread_create on POSIX platforms or CreateThread on Windows. On that front, the resources used are the same.
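For contrast, here is the Java side: spawning a thread directly in user code, which is exactly what Node.js's API does not let you do. Under the hood, the JVM maps this onto the same OS primitives mentioned above.

```java
public class SpawnThread {
    public static void main(String[] args) throws InterruptedException {
        // User code creates the thread explicitly; the JVM forwards this
        // to pthread_create on POSIX or CreateThread on Windows.
        Thread worker = new Thread(() ->
            System.out.println("hello from " + Thread.currentThread().getName()));
        worker.start();
        worker.join(); // wait for the spawned thread to finish
    }
}
```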

Where the thread models and resource usage differ is when a user wants to create more than one thread: multiple threads mean context switches, which incur additional overhead. This is true of any language, though, since threading is an OS-level concept, not something a language has to support. So if you (as the developer) are building a multi-threaded application, you should be aware of these caveats and structure your code accordingly, i.e., manage your threads and resources so that the application context-switches as little as feasible (one common approach is sketched below).
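As a rough illustration of that advice, one common approach in Java is to cap the number of threads at the core count and queue the work, rather than spawning a thread per task. The class and task names here are hypothetical.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class BoundedPool {
    public static void main(String[] args) throws InterruptedException {
        // Cap the thread count at the core count so runnable threads
        // rarely outnumber CPUs, keeping context switching low.
        int cores = Runtime.getRuntime().availableProcessors();
        ExecutorService pool = Executors.newFixedThreadPool(cores);

        for (int i = 0; i < 100; i++) {
            final int task = i;
            pool.submit(() -> doWork(task)); // queued, not a thread each
        }

        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }

    private static void doWork(int task) {
        // Placeholder for real CPU-bound work.
        Math.sqrt(task);
    }
}
```

With the pool bounded this way, the scheduler has little reason to switch between threads, which is the caveat the paragraph above warns about.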

Continuing from that, you state:

I also hear a lot about how Node.js consumes far fewer resources than Java.

This question cannot be answered directly (read: easily), since a simple Node.js app ported to Java (or vice versa) could show very different resource usage depending on a slew of factors; ultimately, you would have to do the testing yourself to see which is better for your scenario.

That said, the claim that Node.js "consumes far fewer resources" probably refers to the fact that running a Java-based application requires the Java Virtual Machine and its runtime environment, which do consume a substantial amount of resources (for various reasons) compared to Node.js (which runs as a "native" app). But just because something consumes more resources does not mean it is "worse", which is why additional testing would need to be done to confirm.

I hope that helps add clarity.
