Why is the digital 0 not 0 V in computer systems?

computer-architecture

I'm taking a computer system design course and my professor told us that in digital systems, the conventional voltages used to denote a digital 0 and a digital 1 have changed over the years.

Apparently, back in the 80s, 5 V was used as a 'high' and 1 V was used to denote a 'low'.
Nowadays, a 'high' is 0.75 V and a 'low' is around 0.23 V.
He added that in the near future, we may shift to a system where 0.4 V denotes a high, and 0.05 V, a low.

He argued that these values keep getting smaller so that power consumption can be reduced. If that's the case, why do we take the trouble to set the 'low' to any positive voltage at all? Why don't we just set it to a true 0 V (the neutral from the power lines, I guess)?

Best Answer

You are confusing the "ideal" value with the valid input range.

In common logic families, under ideal conditions, a logical zero would indeed be exactly 0 V. However, nothing is perfect in the real world, and an electronic output has a certain tolerance. The actual output voltage depends on the quality of the wiring, EMI noise, the current the output has to supply, and so on. To accommodate these imperfections, logic inputs treat a whole range of voltages as 0 (or 1). See the picture in Andy's answer.
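As a rough illustration of that tolerance (not taken from the answer itself), the sketch below uses assumed TTL-style levels: guaranteed output levels V_OL = 0.4 V and V_OH = 2.4 V, and input thresholds V_IL = 0.8 V and V_IH = 2.0 V. The gap between what an output guarantees and what an input accepts is the noise margin; the specific numbers are assumptions for illustration only.

```python
# Hypothetical TTL-style logic levels (volts); real parts specify these in the datasheet.
V_OL, V_OH = 0.4, 2.4   # worst-case voltages a driver guarantees for logic 0 / logic 1
V_IL, V_IH = 0.8, 2.0   # highest input still read as 0 / lowest input read as 1

# Noise margin: how much noise can be added to a worst-case output
# before the receiving input can no longer guarantee the correct level.
noise_margin_low = V_IL - V_OL    # headroom on a logic 0
noise_margin_high = V_OH - V_IH   # headroom on a logic 1

print(f"NM_L = {noise_margin_low:.2f} V, NM_H = {noise_margin_high:.2f} V")
```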

What your lecturer probably meant by 0.75 V is one of the points delimiting these valid logic ranges, not the exact voltage an output produces.

Note that there is also a gap between the 0 range and the 1 range. If the input voltage falls in this gap, the input circuit cannot guarantee proper operation, so this region is said to be forbidden.
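A minimal sketch of how an input could be interpreted under the same assumed thresholds as above: anything between V_IL and V_IH lands in the forbidden band and is reported as undefined rather than as a valid 0 or 1.

```python
V_IL, V_IH = 0.8, 2.0   # assumed input thresholds (volts), for illustration only

def read_logic_level(v_in: float) -> str:
    """Classify an input voltage as logic 0, logic 1, or forbidden."""
    if v_in <= V_IL:
        return "0"
    if v_in >= V_IH:
        return "1"
    return "forbidden"   # no guaranteed behaviour in this band

for v in (0.23, 0.75, 1.4, 2.4):
    print(f"{v:0.2f} V -> {read_logic_level(v)}")
```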
