Minimization of Logical Equations

digital-logic

I have been given a problem where I have to design a 1-bit subtractor, which performs single-bit binary subtraction. It has three inputs: the digits ai and bi (bi is subtracted from ai) and the borrow Bi-1 from the previous 1-bit subtractor. The outputs are Di (the difference) and the borrow Bi for the next 1-bit subtractor. Present all four stages (truth tables of both functions, sum of products, minimization and logical circuits). During minimization follow the same approach that we used in class for the 1-bit adder design.

I have searched around, but I still cannot figure out how to approach this problem. If anyone could shed some light on this, it would be greatly appreciated.

Best Answer

I'm not going to tell you too much; you show too little of your own efforts. Just a few basic tips to get you started.

You'll have to start with the truth table. With 3 inputs you have only 8 combinations. Start without the borrow bit (the half of the table for which it's zero). Then subtracting shouldn't be too hard: what's the result if you subtract 0 from 1? Or 1 from 1? When you want to introduce the borrow bit, it can be useful to do the subtraction of two decimal numbers, which should be familiar from elementary school. Observe how the result differs with and without a borrow. Keep it simple: 25 - 13 is without borrow, 25 - 17 is with borrow. That should get you started.
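
If you want to check the truth table you build by hand, here is a minimal Python sketch of the same idea: enumerate all eight input combinations and do the subtraction by plain arithmetic, borrowing from the next stage whenever the result goes negative. The variable names (a, b, b_in, d, b_out) are my own choice for illustration, not from your assignment, and this is only a sanity check, not the minimization itself.

```python
# Enumerate all 8 combinations of (a, b, borrow-in) and compute the
# difference bit and borrow-out by ordinary arithmetic.
for a in (0, 1):
    for b in (0, 1):
        for b_in in (0, 1):
            raw = a - b - b_in            # can be 1, 0, -1 or -2
            b_out = 1 if raw < 0 else 0   # negative result: borrow from the next stage
            d = raw + 2 * b_out           # difference bit after borrowing 2
            print(a, b, b_in, "->", d, b_out)
```

Comparing that printout against your hand-filled table is a quick way to catch a wrong row before you move on to the sum-of-products and minimization steps.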