Math Error Handling – Will a Computer Attempt to Divide by Zero?

arithmetic, error handling, math

We all know 0/0 is undefined and that a calculator returns an error if I try it, and that if I write a program (in C at least) that divides by zero, the OS will terminate it.

But what I've been wondering is whether the computer even attempts to divide by zero, or whether it has "built-in protection", so that when it "sees" 0/0 it returns an error before even attempting to compute it.
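For concreteness, here is a minimal C program that exercises this. Integer division by zero is undefined behaviour in C; the "OS terminates it" outcome described above is what typically happens on an x86/Linux system, where the divide instruction traps and the kernel kills the process with SIGFPE.

```c
#include <stdio.h>

int main(void)
{
    /* volatile stops the compiler from folding the division at compile time */
    volatile int numerator = 0, denominator = 0;

    /* Undefined behaviour in C.  On a typical x86/Linux system the DIV
     * instruction traps, the kernel converts the trap into SIGFPE, and
     * the process is killed before the next line runs. */
    volatile int quotient = numerator / denominator;

    printf("never reached: %d\n", quotient);
    return 0;
}
```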

Best Answer

The CPU has built-in detection. Most instruction set architectures specify that the CPU will trap to an exception handler on an integer divide by zero (the value of the dividend doesn't matter; only the zero divisor triggers the trap).
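On POSIX systems you can observe this trap from user space: the kernel turns the CPU exception into a SIGFPE signal delivered to the process. A minimal sketch, assuming Linux/x86 behaviour; returning from a SIGFPE handler raised by an integer divide is undefined, so the handler simply reports and exits:

```c
#define _POSIX_C_SOURCE 200809L
#include <signal.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

/* The kernel delivers the CPU's divide trap to the process as SIGFPE. */
static void on_fpe(int sig)
{
    (void)sig;
    const char msg[] = "caught SIGFPE: the CPU trapped on divide by zero\n";
    write(STDOUT_FILENO, msg, sizeof msg - 1);
    _exit(1);
}

int main(void)
{
    struct sigaction sa;
    memset(&sa, 0, sizeof sa);
    sa.sa_handler = on_fpe;
    sigemptyset(&sa.sa_mask);
    sigaction(SIGFPE, &sa, NULL);

    volatile int zero = 0;
    volatile int result = 1 / zero;   /* the trap happens here */
    printf("never reached: %d\n", result);
    return 0;
}
```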

It is possible that the check for a zero divisor happens in hardware in parallel with the attempt to do the division; however, detecting the offending condition effectively cancels the division and traps instead, so we can't really tell whether some part of the division was attempted or not.

(Hardware often works like that: it does multiple things in parallel and then chooses the appropriate result afterwards, because each operation can get started right away instead of waiting on the choice of which operation to perform.)

The same trap-to-exception mechanism is also used when overflow detection is turned on, which you usually request by using different add/sub/mul instructions (or a flag on those instructions).
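As an illustration, GCC and Clang expose the hardware's overflow detection through compiler builtins. Note this is a sketch of the flag-based variant rather than the trapping one: on an ISA like x86 these builtins compile to the flag-setting add plus a check of the overflow flag, while ISAs with separate trapping instructions (e.g. MIPS `add` vs. `addu`) could trap instead.

```c
#include <limits.h>
#include <stdio.h>

int main(void)
{
    int sum;

    /* __builtin_add_overflow returns true if the mathematically correct
     * result does not fit; the wrapped result is still stored in sum. */
    if (__builtin_add_overflow(INT_MAX, 1, &sum))
        printf("overflow detected (wrapped result: %d)\n", sum);
    else
        printf("sum = %d\n", sum);

    return 0;
}
```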

Floating-point division also has built-in detection for divide by zero, but by default it returns a special value instead of trapping to an exception handler: IEEE 754 specifies ±infinity for a nonzero dividend and NaN for 0/0, and it raises a status flag the program can test afterwards.
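A short C demonstration of that non-trapping behaviour, using the standard `<fenv.h>` status flags (strictly conforming code would also need `#pragma STDC FENV_ACCESS ON`; most compilers accept this as written):

```c
#include <fenv.h>
#include <math.h>
#include <stdio.h>

int main(void)
{
    /* volatile keeps the compiler from evaluating these at compile time */
    volatile double zero = 0.0, one = 1.0;

    feclearexcept(FE_ALL_EXCEPT);

    double a = one / zero;    /* IEEE 754: +infinity, raises divide-by-zero */
    double b = zero / zero;   /* IEEE 754: quiet NaN,  raises invalid       */

    printf("1.0/0.0 = %f  (isinf: %d)\n", a, isinf(a) != 0);
    printf("0.0/0.0 = %f  (isnan: %d)\n", b, isnan(b) != 0);
    printf("FE_DIVBYZERO raised: %d\n", fetestexcept(FE_DIVBYZERO) != 0);
    printf("FE_INVALID raised:   %d\n", fetestexcept(FE_INVALID) != 0);
    return 0;
}
```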


Hypothetically speaking, if the CPU omitted any detection for an attempt to divide by zero, the problems could include:

  • hanging the CPU, e.g. in an infinite loop. This could happen if the CPU used a division algorithm that keeps subtracting until the remaining numerator is less than the divisor (in absolute value); with a zero divisor that condition is never reached, and a hang like this would pretty much count as crashing the CPU (see the sketch after this list).
  • a (possibly predictable) garbage answer, if the CPU instead used a counter to terminate the division after the maximum possible number of divide steps (e.g. 31 or 32 on a 32-bit machine).
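To illustrate both failure modes, here is a hypothetical 32-step shift-and-subtract divider of the sort a simple hardware unit might implement, written in C. A naive "subtract until the remainder is smaller than the divisor" loop would never exit for a zero divisor; the counter-bounded version below always terminates, but with a zero divisor the comparison is always true and the quotient comes out as all ones, i.e. a predictable garbage answer.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical 32-step shift-and-subtract (restoring) divider. */
static uint32_t shift_subtract_div(uint32_t dividend, uint32_t divisor,
                                   uint32_t *remainder)
{
    uint64_t rem = 0;
    uint32_t quot = 0;

    for (int i = 31; i >= 0; i--) {
        rem = (rem << 1) | ((dividend >> i) & 1);  /* bring down next bit */
        quot <<= 1;
        if (rem >= divisor) {       /* with divisor == 0 this is always true */
            rem -= divisor;
            quot |= 1;
        }
    }
    *remainder = (uint32_t)rem;
    return quot;
}

int main(void)
{
    uint32_t q, r;

    q = shift_subtract_div(100, 7, &r);
    printf("100 / 7 -> quotient %u, remainder %u\n", q, r);

    q = shift_subtract_div(100, 0, &r);   /* terminates, but meaningless */
    printf("100 / 0 -> quotient 0x%08x, remainder %u\n", q, r);
    return 0;
}
```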