Design Patterns – Identifying and Describing Anti-Patterns

anti-patterns, design-patterns, dsl

The case where a source-level operator actually describes an operation to take place at some future point, thunking the real operator together with its operands.

I don't know if this has a general name; perhaps it's too low-level a design element to be thought of as a "pattern" at all. Either way, it's an element that shows up frequently in a number of situations:

  • If I understand correctly (I'm not a frequent C++ user), Boost.Phoenix does this to represent its lambdas and lazy operations, e.g. overloading + to capture the arguments along with the real + function for their type
  • Parser combinators in functional languages (or something like Spirit) look like operators on data, but actually build a parser that runs later
  • Other EDSLs (in languages with operator overloading) seem to do this a lot, e.g. functional reactive programming libraries tend to invisibly build procedures that will do what the operation looks like it's doing right now
  • (Again, if I understand correctly) IO in Haskell builds an imperative computation out of pure components that is then "executed" by dark magic hidden in the language runtime, while looking like it's just executing the functions in-place
  • At a simpler level, Greenspunning an embedded mini-Lisp into a C program could be considered the same thing: S-expressions can be built out of calls to C functions, to be run later. This lets you look like you're passing "a lambda" to a C function when you're really passing interpreter data (this last case would probably be the one most often considered an antipattern)

It's obviously not an antipattern in all cases, as in some of those it's the only way to use the tool in question (doing it in C might earn one a beating though). I don't think it's "Greenspunning" or "using an EDSL" by itself, as it can be more generic and widely applicable than that. I don't think it's the Interpreter pattern either, although I may not have understood that one correctly: the computation is built directly by executing host-language code (so it effectively exists at compile time, though usually not as a first-class language construct), rather than being parsed out of a string or other runtime-loaded data.

In all cases, what seems to happen is that an operation looks like it's doing one thing (not necessarily very convincingly, if it's C and you need to write out Add instead of overloading +, but same idea), while actually packaging that action up for later consumption. Something like a "computation builder" pattern? But I haven't found that term in use.

I don't have a practical problem to solve here; it's just bugging me that there seems to be a common design element that I can't put a name to.

(Question originally posted at StackOverflow, moved by hand)

Best Answer

What you seem to be describing is Lazy Evaluation: computations are performed when their results are needed, rather than when they appear in the source code.

In Haskell, this is done by hiding these computations behind monadic abstractions. In C++, the abstractions are similar, but more explicit, and partially hidden behind overloaded operators and expression templates.

Similarly, your cited example of "Greenspunning" a Lisp-like construct in C is not an antipattern, but rather a data-driven, lazily evaluated DSL. In fact, it seems to be one of the most common patterns for an FFI in Lisp and Scheme interpreters (especially in ECL, Chicken, tinyscheme and others), since C lacks the means to construct and pass unexecuted functions as data.

You will notice that these patterns do not appear in languages which have first-class function objects.
