Why isn’t OCaml more popular


I've always heard that C is the language of choice to use for embedded systems, or anything that needs to run at maximum speed. I never developed a fondness for C, mostly because I don't like pointer arithmetic and the language is barely a rung above assembler.

On the other hand, ML languages are functional, garbage-collected languages, and OCaml even has an object model, yet they have a reputation for being as fast as C. ML languages offer all the abstraction anyone could ask for to write high-level, concise code, yet they retain the speed necessary for writing high-performance applications.

OCaml in particular can be used anywhere that C is traditionally used, such as for embedded devices, graphics drivers, operating systems, etc. By all rights, OCaml should have taken over the world by now, but hardly anyone has heard of the language, let alone used it.

This is a subjective question, but why have OCaml and other ML languages remained so obscure, while C and other languages became popular?

Best Answer

The first answer is that nobody really knows why languages become popular, and anybody who says otherwise is deluded or has an agenda. (It's often easy to identify why a language fails to become popular, but that's another question.)

With that disclaimer, here are some points that are suggestive, most important first:

  • The first mature C compiler appeared in 1974; the first mature OCaml compiler appeared in the late 1990s. C has a 25-year head start.

  • C shipped with Unix, which was the biggest "killer app" of all time. For a long time, every CS department in the world had to have Unix, which meant that every instructor and everyone taking a CS course had an opportunity to be exposed to C. OCaml and ML are still waiting for their first killer app. (MLdonkey is cool, but it's not Unix.)

  • C fills its niche so well that I doubt there will ever be another low-level language devoted only to systems programming. (To see the evidence in favor, read Dennis Ritchie's paper on the history of C from HOPL II.) It's not even clear what OCaml's niche is, and Standard ML's niche is only a little clearer. So Caml and ML have quite a few competitors, whereas C killed off its only competitor (BLISS).

  • One of C's great strengths is that its cost model is very predictable: it is easy to look at any small fragment of C code and instantly get an accurate idea of what machine operations will have to be performed to execute that code. OCaml's cost model is much less clear, especially because memory allocation is much less explicit, and the overall cost of memory allocation (the cost of allocation itself plus the costs incurred during garbage collection) depends on emergent properties like how long objects live and which objects refer to other objects. The net result is that performance is hard to predict, and hard even to analyze after the fact. (OCaml's memory-profiling tools are not what they should be.) As a result, OCaml is not good for applications where performance must be very predictable, such as embedded systems. (The sketch after this list illustrates the contrast.)

  • C is a language with a standard and many compilers. OCaml is a software artifact: the only compiler is from a single source, and the compiler is the standard. And that standard changes with every release. For people who value stability and backward compatibility, a single-source language may represent an unacceptable risk.

  • Anybody with a halfway-decent undergraduate compiler course and a lot of persistence can write a C compiler that more or less works, and with adequate performance. To get an implementation of OCaml or ML off the ground requires a lot more education, and getting performance comparable to a naive C compiler requires a lot more work. This means there are a lot fewer hobbyists messing around with languages like OCaml, so it's harder for the community to develop a deep understanding of how to exploit it.
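To make the cost-model point concrete, here is a minimal, illustrative OCaml sketch (the functions are invented for the example): code that reads as a few arithmetic operations also performs heap allocation, and the eventual cost of that allocation depends on the garbage collector rather than on anything visible in the source.

    (* Both functions below allocate on the heap even though nothing in the
       source says "allocate".  The analogous C code, returning a struct by
       value, compiles to a predictable handful of loads, adds, and stores. *)

    let midpoint (x1, y1) (x2, y2) =
      (* Returning a tuple allocates a fresh heap block on every call,
         and each float result is boxed inside it. *)
      ((x1 +. x2) /. 2.0, (y1 +. y2) /. 2.0)

    let shift dx =
      (* Partial application returns a closure: another hidden allocation. *)
      fun (x, y) -> (x +. dx, y)

    let () =
      let p = midpoint (0.0, 0.0) (4.0, 2.0) in
      let x, y = shift 1.0 p in
      Printf.printf "(%f, %f)\n" x y

How much those allocations actually cost is determined only later, by how long the values live and how often the collector runs, which is exactly why the fragment-by-fragment reasoning that works for C does not carry over.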
