How does division occur inside digital computers? What is the algorithm for it?
I have searched hard on Google but haven't found satisfactory results. Please provide a clear algorithm or flowchart for division, with a sample illustration.
arithmetic-division, computers, tutorial
Best Answer
Division algorithms in digital designs can be divided into two main categories: slow division and fast division.
I suggest you read up on how binary addition and subtraction work if you are not yet familiar with these concepts.
Slow Division
The simplest slow methods all work in the following way: subtract the denominator from the numerator, then repeat the subtraction on each result until what remains is less than the denominator. The number of iterations is the integer quotient, and the amount left over is the remainder.
Example:
7/3:
7 - 3 = 4 (first subtraction)
4 - 3 = 1 (second subtraction; 1 < 3, so we stop)
Thus the answer is 2 with a remainder of 1.
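Here is a minimal sketch of that loop in C (the function name slow_divide is my own for this illustration; real hardware implements the loop with a subtractor and a counter rather than software):

```c
#include <stdio.h>

/* Illustrative repeated-subtraction division.
   Assumes d > 0; slow_divide is a made-up name for this sketch. */
static void slow_divide(unsigned n, unsigned d, unsigned *q, unsigned *r)
{
    unsigned quotient = 0;
    while (n >= d) {   /* subtract until the remainder is less than d */
        n -= d;
        quotient++;    /* count the iterations */
    }
    *q = quotient;     /* number of subtractions = integer quotient */
    *r = n;            /* what is left over = remainder */
}

int main(void)
{
    unsigned q, r;
    slow_divide(7, 3, &q, &r);
    printf("7/3: quotient %u, remainder %u\n", q, r);  /* prints 2 and 1 */
    return 0;
}
```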
To make this answer a bit more relevant, here is some background. Binary subtraction is performed via addition of the negative, e.g. 7 - 3 = 7 + (-3), where -3 is represented by its two's complement. Each addition is then carried out by a chain of 1-bit full adders (a ripple-carry adder). Each 1-bit full adder takes two input bits and a carry-in, and produces a sum bit and a carry-out:
$$s = a \oplus b \oplus c_{in}, \qquad c_{out} = a\,b + c_{in}\,(a \oplus b)$$
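As a sketch (my own illustration, not the original schematic), the same full adder and a ripple-carry chain can be written in C:

```c
#include <stdio.h>

/* 1-bit full adder: sum = a XOR b XOR cin, carry-out = majority(a, b, cin). */
static unsigned full_adder(unsigned a, unsigned b, unsigned cin, unsigned *cout)
{
    *cout = (a & b) | (cin & (a ^ b));
    return a ^ b ^ cin;
}

/* 8-bit ripple-carry adder built from eight chained 1-bit full adders. */
static unsigned ripple_add(unsigned x, unsigned y)
{
    unsigned sum = 0, carry = 0;
    for (int i = 0; i < 8; i++) {
        unsigned s = full_adder((x >> i) & 1, (y >> i) & 1, carry, &carry);
        sum |= s << i;
    }
    return sum & 0xFF;
}

int main(void)
{
    /* 7 + (-3): -3 is the two's complement of 3, i.e. (~3 + 1) in 8 bits. */
    unsigned neg3 = (~3u + 1u) & 0xFF;
    printf("7 - 3 = %u\n", ripple_add(7, neg3));  /* prints 4 */
    return 0;
}
```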
Fast Division
While the slower method of division is easy to understand, it needs one iteration per unit of the quotient, which quickly becomes impractical for large numbers. There exist various "fast" algorithms, but they all rely on starting from an estimate and refining it iteratively.
Consider the Goldschmidt method:
I'll make use of the following: $$Q = \frac{N}{D}$$
This method works as follows: first scale N and D by the same power of two so that D falls in the interval (0.5, 1]. Then repeatedly multiply both by the factor
$$F_i = 2 - D_i$$
giving
$$N_{i+1} = N_i F_i, \qquad D_{i+1} = D_i F_i$$
Because numerator and denominator are multiplied by the same factor, the ratio never changes; as D_i converges to 1, N_i converges to the quotient Q, and the number of correct bits roughly doubles each iteration.
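Below is a rough illustration of the Goldschmidt iteration in C, using doubles for readability (real hardware works on fixed-point fractions; goldschmidt is a made-up name, and positive operands are assumed):

```c
#include <stdio.h>

/* Goldschmidt division sketch: assumes n > 0 and d > 0. */
static double goldschmidt(double n, double d)
{
    /* Crude scaling so that d lies in (0.5, 1]. */
    while (d > 1.0)  { d *= 0.5; n *= 0.5; }
    while (d <= 0.5) { d *= 2.0; n *= 2.0; }

    for (int i = 0; i < 5; i++) {  /* a few iterations suffice for doubles */
        double f = 2.0 - d;        /* F_i = 2 - D_i */
        n *= f;                    /* N converges to the quotient Q */
        d *= f;                    /* D converges to 1 */
    }
    return n;
}

int main(void)
{
    printf("7/3 = %.10f\n", goldschmidt(7.0, 3.0));  /* ~2.3333333333 */
    return 0;
}
```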
Each iteration only needs binary multiplication, which can itself be built from iterative addition; Goldschmidt division has been used in real hardware, for example in the floating-point units of some AMD CPUs.
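For completeness, here is a sketch of binary multiplication by iterative (shift-and-add) addition, assuming unsigned operands:

```c
#include <stdio.h>

/* Multiply by adding shifted copies of x, one for each set bit of y. */
static unsigned shift_add_mul(unsigned x, unsigned y)
{
    unsigned product = 0;
    while (y != 0) {
        if (y & 1)
            product += x;  /* add the current partial product */
        x <<= 1;           /* shift to the next bit position */
        y >>= 1;
    }
    return product;
}

int main(void)
{
    printf("6 * 7 = %u\n", shift_add_mul(6, 7));  /* prints 42 */
    return 0;
}
```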