Programming Languages – Why Is Mod (%) a Fundamental Mathematical Operator?

historymathprogramming-languages

Is there a reason, historical or otherwise, why the modulus operator is part of a small set of standard operators in what seems like many languages? (+, -, *, / and %, for Java and C, with ** in Ruby and Python).

It seems strange to include mod as a "fundamental" operator (not to knock it; I use it plenty, but I also use exponentiation, absolute value, and floor/ceiling, which seem just as useful and necessary). Was this an old decision made in some specification that Java, C, Ruby and Python all follow, or inherited from a language they are all descended from? As far as I can tell, most Lisp dialects include only +, -, / and *.

At first I wondered whether mod was particularly easy to implement at the binary level (would that even factor into decisions about which operators are "fundamental"?), but it seems it is not. Or is it simply used far more often in programming than I realize?

Best Answer

I am sure it is common because many CPU architectures implement modulus as a second output of the integer divide instruction.

I don't recall it being present in 1970s CPUs (6800, 8080, Z80, 1604, etc.), but by the 1980s, the Intel 8086 and 8088, as well as the Motorola 6809 had it.

The PDP-11 instruction set architecture specified DIV as producing both a quotient and a remainder from the beginning (1970). The MUL and DIV instructions were not present on early models, but they could be transparently emulated via the "instruction not implemented" trap, with a handler that did the bit twiddling. The PDP-11's DIV probably encouraged the very first edition of the C language to provide the % operator. (Ever notice how a percent sign has a slash in it? That makes it a cleverish choice for a division-related operator.)

The presence of modulus in C alone can probably explain its presence in all modern languages. C has a very large family of descendants and was otherwise quite influential.