I think the critical piece you're missing is that the result of require("foo") is always the same object. Consider this REPL example:
> var myHttpModule = require("http")
{ ... }
> myHttpModule.someNewProperty = "someValue"
'someValue'
> require("http").someNewProperty
'someValue'
The second call to require("http") did not re-create the http module -- it summoned up the only http module that exists in Node's current execution environment, which had been altered on the previous line.
So, in your example, there's only one Alpha module. When Beta and Gamma call require("alpha"), they're each getting the same reference to the one-and-only Alpha module singleton object. (Or, to be perfectly precise, they're getting the same reference to the exports value of the Alpha module. require creates a Module object, but returns only the exports property of that module. If you don't fully understand the distinction right now, it's not a big deal.)
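For instance, a minimal sketch of that sharing could look like this (the property a follows your example; the file layout and the "./alpha" path are my assumptions):

// alpha.js
module.exports = { a: 1 };

// beta.js ("./alpha" assuming the files sit side by side)
var alpha = require("./alpha");
alpha.a = 2; // mutates the one shared exports object

// gamma.js (loaded after beta)
var alpha = require("./alpha");
console.log(alpha.a); // 2 -- Gamma sees the change Beta made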
Behind the scenes: what's really going on with require caching?
The first time you use require to include a module, Node actually runs the code in that module. The resulting module is stored in require.cache. Subsequent attempts to load the module with require first check for an already-loaded module object in require.cache. (Note: built-in core Node modules like http are exceptions in that they don't use require.cache -- you'll need to test out my code below with a custom module.)
require.cache is an object whose keys are module file paths and whose values are the corresponding module objects. For example, assume some module named "foo" in C:\node_modules\foo.js:
> require.cache // empty cache
{ }
> require("foo") // require foo
{ bar: 'baz' }
> require.cache // cache now populated
{ 'C:\\node_modules\\foo.js':
{ id: 'C:\\node_modules\\foo.js',
exports: { bar: 'baz' },
parent: { ... },
filename: 'C:\\node_modules\\foo.js',
...
paths:
[ ... ] } }
The current value of the foo module is in require.cache["C:\\node_modules\\foo.js"].exports. We can use require.resolve to get the file path of the module from its name, so we can also express it as require.cache[require.resolve("foo")].exports.
If we call require("foo")
a second time, Node sees that require.cache[require.resolve("foo")]
is defined, and so it returns the value of require.cache[require.resolve("foo")].exports
instead of re-running the module creation code. This exports
property of the foo
module in require.cache
is the single instance of the foo
module export value, returned with every subsequent call to require("foo")
.
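We can verify this identity at the REPL (continuing with the hypothetical foo module from above):

> require("foo") === require("foo")
true
> require("foo") === require.cache[require.resolve("foo")].exports
true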
One interesting implication here is that you can delete require.cache[require.resolve("foo")] to force a reload of the foo module with the next call to require("foo"), because the object describing the module is removed from the require.cache object.
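For example, at the REPL, again with the hypothetical foo module:

> var oldFoo = require("foo")
undefined
> delete require.cache[require.resolve("foo")]
true
> require("foo") === oldFoo // the module code ran again, producing a new exports object
false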
So what does this mean for me?
Without require, you could still share values between modules using global variables. For example, your Alpha/Beta/Gamma case would work just as well with just Beta and Gamma setting and reading the global value global.a. Instead, with Node's require system, you're actually setting and reading global.require.cache[require.resolve("alpha")].exports.a, which Node lets you read neatly as just require("alpha").a.
In fact, if you just want to share a value and don't need to import your shared data as a module, you could also use a namespacing object, i.e., have Beta and Gamma set and read properties of the object global.alpha. The major advantage to using require is that you don't need to set up require("alpha") external to the other scripts, whereas you would need to define global.alpha = {} before requiring Beta and Gamma. (Alternatively, you could conditionally define global.alpha in each module based on a typeof global.alpha == 'undefined' check instead, to see if it's the first module to use it.)
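That conditional-define variant could look like this (a sketch; the property a mirrors your example):

// beta.js
if (typeof global.alpha == 'undefined') global.alpha = {}; // first module in creates the namespace
global.alpha.a = 1; // set the shared value

// gamma.js
if (typeof global.alpha == 'undefined') global.alpha = {};
console.log(global.alpha.a); // reads 1, provided beta ran first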
Treat Web services the same as any other data access:
To approach client-side MVC architecture without frameworks, we treated Web services in the same manner as we did localStorage and the local IndexedDB database. All code that makes requests to remote servers, queries a database, or reads/writes localStorage lives in the model layer as DAO objects.
These were written using either the prototype pattern, the module pattern, or in some cases a combination of the two, using a modified factory pattern to treat objects as singletons in production while allowing us to "reset" them when unit testing, so tests stay self-contained.
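As an illustration, one of those factory-wrapped DAOs might look roughly like this; UserDao and its methods are hypothetical names, not our actual code:

// prototype pattern for the DAO itself
function UserDao() {
    // holds handles to IndexedDB / localStorage; no application logic here
}
UserDao.prototype.findById = function (id, callback) {
    // ...query IndexedDB here and invoke callback(result)...
};

// module pattern around it: singleton in production, resettable in tests
var userDao = (function () {
    var instance = null;
    return {
        getInstance: function () {
            if (instance === null) { instance = new UserDao(); }
            return instance;
        },
        reset: function () { instance = null; } // lets each unit test start clean
    };
}());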
They're generalized so that no application logic exists in this layer, and most of the code at this level can essentially be reused across future applications.
The services layer encapsulates the DAO layer, keeping the specific storage technology separate from the controller logic. Thus, pushing data to a remote server is seen as the same as inserting data into IndexedDB.
Used raw XMLHttpRequest to interface with Web services:
We chose not to use jQuery AJAX. Instead, we wrote a wrapper around XMLHttpRequest. There's no right or wrong answer here, but choosing not to use jQuery here helps us stay focused on the rule of having no DOM manipulations in this layer. By wrapping the AJAX logic inside a prototype class, we set certain default headers, since most of what our Web services deal with is application/json data.
However, we did make an exception for jQuery Deferred. It didn't seem to make sense to write our own Observer implementation simply to be purists; our reasoning was that future professional programmers who work on this project with us have a better chance of understanding $.Deferred than some hand-rolled implementation. Most developers, on the other hand, we argued, would have been exposed to XMLHttpRequest at some point, so being a purist there seemed less risky from the perspective of communicating to other developers what a piece of code does.
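To make the shape of this concrete, here is a rough sketch of such a wrapper; JsonClient and its method names are hypothetical:

function JsonClient(baseUrl) {
    this.baseUrl = baseUrl;
}

JsonClient.prototype.post = function (path, body) {
    var deferred = $.Deferred();
    var xhr = new XMLHttpRequest();
    xhr.open("POST", this.baseUrl + path);
    xhr.setRequestHeader("Content-Type", "application/json"); // default header set once, here
    xhr.onload = function () {
        if (xhr.status >= 200 && xhr.status < 300) {
            deferred.resolve(JSON.parse(xhr.responseText));
        } else {
            deferred.reject(xhr.status);
        }
    };
    xhr.onerror = function () { deferred.reject(0); };
    xhr.send(JSON.stringify(body));
    return deferred.promise(); // callers consume a promise, never the raw XHR
};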
Analysis:
Earlier, I mentioned rules. We wrote up a series of "rules" designed to keep the code maintainable. For instance, rule #1 is that jQuery DOM manipulations only happen in a "view" layer. So the only jQuery we use in other layers is the $.Deferred object and nothing else.
We use $.Deferred throughout the application to keep certain logic in the layers where that belongs. For instance, we keep a persistent $.Deferred observer in the DOM click handlers in order to "notify" the controller that a click event has occurred so the controller can then delegate to other layers of the application to get/set, fetch/store, or perform other actions.
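A sketch of that notification channel, using the Deferred progress callbacks (names are illustrative):

// view layer -- the only place jQuery touches the DOM
var saveClicks = $.Deferred();
$("#save").on("click", function () {
    saveClicks.notify(); // can fire repeatedly; progress callbacks run each time
});

// controller layer -- subscribes without knowing anything about the DOM
saveClicks.progress(function () {
    // delegate to the services layer to get/set or fetch/store, then update the view
});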
Using TDD as a development methodology, we've created small functions, where the largest is never more than a page, and most layers have a great deal of test coverage.
After a few months, the jury is still out on whether we should have used a framework, as much of what we've done could be generalized further into a reusable framework. We've learned quite a bit trying this, and I'm hoping this experience helps others who wish to develop client-side applications that involve a lot of asynchronous operations.
To give credit where it's due: for further reading on going framework-less, see Joe Gregorio's Zero Framework Manifesto.
Best Answer
There are essentially two ways to do this. Which one is suitable for you depends on what you want to accomplish.
Case 1: A, B and C are loosely related; for example, B can be used without A and C. Having A and/or C available would extend the functionality of B.
Case 2: A, B and C are tightly related and some part of A, B or C relies on the existence of A, B or C. Together they form a logical unit.
If Case 1 fits your project's purpose, then you want to build separate node modules and have the end user pull in those modules and use them at will.
If Case 2 fits better, then you want to compile A, B and C into one module e.g. ABC. You can still structure the source code (and exported API) of the ABC module according to A, B and C.
Building three separate modules and linking them dynamically makes little sense. In fact, they are dynamically linked (by node) when imported (require'd) in node-land. Linking them together statically might make more sense, but is essentially just a messier way of accomplishing "solution 2" (producing a single module).

Edit:
If A, B and C are three modules that are not authored by you, then you are dealing with a hierarchy. As you explain it in your hypothesis, the hierarchy of dependence is that A could be "owned" by B, and B could be owned by C.

If you author C, then provide a user-facing API with an entry-point in JavaScript, where you require the three modules. Compile A, B and C as separate modules and have them live "inside" your compound module; the file system structure could look something like the sketch below. As the code in _C.node requires certain symbols to be implemented (i.e. the depending symbols from A and B), you will need to load the modules in order.
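One plausible layout (every name here is illustrative, including the node-gyp-style build/Release paths):

abc/
    package.json
    index.js
    build/Release/_A.node
    build/Release/_B.node
    build/Release/_C.node

// index.js -- the JavaScript entry point; it loads the addons in dependency order
var A = require("./build/Release/_A.node"); // no dependencies
var B = require("./build/Release/_B.node"); // needs symbols from A
var C = require("./build/Release/_C.node"); // needs symbols from A and B

module.exports = { A: A, B: B, C: C }; // one compound, user-facing API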
Note that I have not tested this, but in theory, this should work.