How functional programming achieves "No runtime exceptions"

functional-programming, object-oriented, object-oriented-design

How does a functional programming language, such as Elm, achieve "No runtime exceptions"?

Coming from an OOP background, runtime exceptions have been part of every framework based on OOP, both browser-based frameworks built on JavaScript and also Java-based ones (e.g., Google Web Toolkit, TeaVM, etc. – correct me if I'm wrong though), so learning that the functional programming paradigm eliminates this is big.

Here's a screen grab from NoRedInk's presentation on Elm, showing the runtime exceptions from their previous JavaScript-based code to the new Elm codebase:

[Screenshot: NoRedInk's runtime exception counts, before (JavaScript) and after (Elm)]

  • How does the functional paradigm or programming approach eliminate runtime exceptions?
  • Are runtime exceptions a great disadvantage of OOP over functional programming?
  • If it is such a disadvantage, why have the OOP paradigm, programming approach, and frameworks been the industry standard? What is the technical reason? And what's the history behind this?

Best Answer

How does a functional programming language, such as Elm, achieve "No runtime exceptions"?

That's easy. You simply don't write functions that fail.

That might sound simplistic, but that's the gist of it.

Take division, for example. We can simply define that anything divided by 0 is 42. Boom. Now, division no longer throws a runtime exception; it just sometimes returns a wrong result.
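As a minimal Elm sketch of that deliberately silly idea (divideOr42 is a hypothetical name, not anything Elm's core library defines):

-- Make division total by decree: anything divided by zero is 42.
divideOr42 : Float -> Float -> Float
divideOr42 x y =
    if y == 0 then
        42
    else
        x / y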

Elm's choices are a little more intelligent than that, however:

 1 / 0
--=>  Infinity

-1 / 0
--=> -Infinity

 0 / 0
--=>  NaN

Note that this is only one possibility. Another possibility would be to introduce a "nonzero number" type that is distinct from "number", and then the type of / would be

(/) : Float -> NonZeroFloat -> Float

instead of

(/) : Float -> Float -> Float

as it is now.
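Here is a sketch of what that hypothetical NonZeroFloat type could look like; none of this exists in Elm's core libraries:

module NonZeroFloat exposing (NonZeroFloat, fromFloat, divide)

-- Opaque type: the constructor is not exposed, so callers
-- can only obtain a NonZeroFloat through fromFloat.
type NonZeroFloat
    = NonZeroFloat Float

fromFloat : Float -> Maybe NonZeroFloat
fromFloat x =
    if x == 0 then
        Nothing
    else
        Just (NonZeroFloat x)

-- Division can no longer be called with a zero denominator,
-- because the zero check already happened in fromFloat.
divide : Float -> NonZeroFloat -> Float
divide numerator (NonZeroFloat denominator) =
    numerator / denominator

The trick is that the only way to get hold of a NonZeroFloat is to go through fromFloat, which forces the zero check to happen before any division can be expressed at all.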

A third possibility would be to change the return type, for example like this:

(/) : Float -> Float -> Maybe Float

This means that the function returns "maybe a Float". More precisely, it will return either a Float wrapped in the Just data constructor or Nothing.
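A minimal sketch of such a function, together with a caller that handles both cases (safeDivide and describe are hypothetical names):

safeDivide : Float -> Float -> Maybe Float
safeDivide x y =
    if y == 0 then
        Nothing
    else
        Just (x / y)

-- The type forces the caller to deal with both outcomes:
describe : Float -> Float -> String
describe x y =
    case safeDivide x y of
        Just result ->
            "Result: " ++ String.fromFloat result

        Nothing ->
            "Division by zero"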

Or, if you want some more information, the type could be

(/) : Float -> Float -> Result String Float

This will return either an Ok Float with the value wrapped into the Ok data constructor or an Err String with a description of the problem wrapped into the Err data constructor.
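A sketch under the same assumptions, this time with Result (again, safeDivide is a hypothetical name):

safeDivide : Float -> Float -> Result String Float
safeDivide x y =
    if y == 0 then
        -- Instead of crashing, report what went wrong as a value.
        Err "Cannot divide by zero"
    else
        Ok (x / y)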

Another example is retrieving a value from a dictionary: what if the key does not exist? Or indexing into an array: what if the index does not exist? Well, both the get function for arrays and the get function for dicts return a Maybe. Some other languages also provide a function called getOrElse, which takes a default value as an extra argument and returns it if the key is not found.
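In Elm this looks as follows; Dict.get, Array.get, and Maybe.withDefault are real core functions (Maybe.withDefault plays the role of getOrElse), while the example data is made up:

import Array
import Dict

ages : Dict.Dict String Int
ages =
    Dict.fromList [ ( "alice", 30 ), ( "bob", 25 ) ]

-- Dict.get returns a Maybe; a missing key is a value, not a crash:
bobsAge : Maybe Int
bobsAge =
    Dict.get "bob" ages          -- Just 25

missing : Maybe Int
missing =
    Dict.get "carol" ages        -- Nothing

-- Array.get likewise returns a Maybe instead of throwing on a bad index:
first : Maybe Int
first =
    Array.get 0 (Array.fromList [ 1, 2, 3 ])    -- Just 1

-- Maybe.withDefault supplies a fallback, like getOrElse:
carolsAge : Int
carolsAge =
    Maybe.withDefault 0 (Dict.get "carol" ages)  -- 0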

The key point is simply to write your functions in such a way that they never throw an exception and always return a value.

Note that this has nothing to do with Functional Programming. You can do this in any language. For example, C also has no runtime exceptions. In C, you use "magic" return values or error codes to signal errors.

You could do this in Java as well. In fact, Java ships with an implementation of that Maybe type called java.util.Optional and you can write a similar Result type as well.

Go has multiple return values, and it is customary to return an additional error code value from a function. For example, a hypothetical get function for a dictionary would not return item and then maybe return null or crash if the item cannot be found. Instead, it would return item, found, where found is a boolean telling the caller whether the item was found. You would use it something like this:

item, found := dict.get("key")
if found {
  // do something with `item`
}

Coming from an OOP background, runtime exceptions have been part of every framework based on OOP, both browser-based frameworks built on JavaScript and also Java-based ones (e.g., Google Web Toolkit, TeaVM, etc. – correct me if I'm wrong though), so learning that the functional programming paradigm eliminates this is big.

It has nothing to do with Functional Programming. FP certainly helps but is not a requirement.

If you write your Java code in such a way that you never return null, never have uninitialized fields, and never throw exceptions, then you can achieve the same thing for your own code. The problem is, of course, that everybody else's code, including the Java SE standard library, still returns null and throws exceptions. So it is as much about the standard libraries and the discipline of the community as it is about the type system and the language.

Of course, there are things the type system and the language can do to help you. For example, the compiler can do exhaustiveness checking, i.e., it can make sure that you always check for both Ok and Err in your code. This has traditionally not been possible in Java, for example (although recent versions add sealed types and exhaustive switch). But again, this has nothing to do with Functional Programming or Object-Oriented Programming.
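For example, in Elm the following case expression will not compile if either branch is removed, so forgetting the error case is impossible (report is a hypothetical function):

report : Result String Int -> String
report result =
    case result of
        -- Delete either branch and the compiler rejects the program:
        Ok value ->
            "Got " ++ String.fromInt value

        Err message ->
            "Failed: " ++ message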

Haskell is actually a very good example: Haskell is a Functional Language and it does have runtime exceptions, but the community simply chooses to avoid them, using Maybe, Either, etc. types instead.

  • How does the functional paradigm or programming approach eliminate runtime exceptions?

It doesn't. Writing code such that it never throws exceptions eliminates exceptions.

  • Are runtime exceptions a great disadvantage of OOP over Functional Programming?

They have nothing to do with OOP or FP.

  • If it is such a disadvantage, why have the OOP paradigm, programming approach, and frameworks been the industry standard?

I would argue they are not the industry standard. While a lot of code is written in Java, C#, etc., the overwhelming majority of that code is not Object-Oriented but rather Structured / Procedural / Modular with Abstract Data Types.

What is the technical reason?

Most "popular" technologies are not popular for technical reasons. At no point in history were DOS and/or Windows technically superior. They just had brilliant marketing and business-savvy managers.

And what's the history behind this?

Unix becomes popular; with Unix comes C; C becomes popular even outside of Unix; C++ adds a misunderstood, mangled, castrated idea of OOP to C; C++ becomes popular; Java kinda-sorta looks like C++ even though it is actually much closer to Objective-C and Smalltalk; IBM goes all-in on Java; and there is the universal truth of IT: "Nobody ever got fired for buying IBM."

Is this cynical? Yes, but that doesn't make it untrue. More often than not, the people who decide whether or not to buy some technology do not have the technical expertise to judge whether the technology is actually good or not.