# Electronics – why do we use 3.3 V instead of 3 V?


Why do we use 3.3 V in circuits? Why not 3 V? Is there a specific standard? Or do we use it because it was 3.3 V from the beginning?

I am just curious to know.

> Why do we use 3.3 V in circuits? Why not 3 V?

Why not $\pi$ V? Why not 2.8 V, a much nicer number than 3 V? The more things like power consumption or speed matter, the less you align with "human-pretty" numbers and the more with physical needs.

In this case, the physical need was actually "something slightly above 3 V, but less than 5 V, to save power in our new LVCMOS circuits", circa 1970.

The point is: when TTL (transistor–transistor logic) was still the dominant technology for integrated logic, supplies below roughly 4.5 V were impractical, due to the stacked collector–emitter and junction drops in the bipolar junction transistors used there. Hence, with a bit of headroom, 5.0 V became the standard.
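A rough back-of-envelope of where that ~4.5 V floor comes from. The junction drops and noise margin below are typical-textbook numbers I'm assuming for illustration, not values from any particular datasheet:

```python
# Back-of-envelope: why TTL can't run much below ~4.5 V.
# All values are era-typical textbook assumptions, not datasheet specs.
V_BE = 0.7      # base-emitter drop of the totem-pole pull-up transistor
V_DIODE = 0.7   # level-shifting diode in series with the output
V_OH_MIN = 2.4  # minimum "high" output voltage the TTL spec guarantees

def min_supply(noise_margin=0.9):
    # The output-high path sits roughly two junction drops below Vcc,
    # and Vcc must keep V_OH above spec with margin left over for
    # loading, temperature, and part-to-part variation.
    return V_OH_MIN + V_BE + V_DIODE + noise_margin

print(f"rough minimum Vcc: {min_supply():.1f} V")  # ≈ 4.7 V
```

With a hair more margin on top, you land at the familiar 5 V ± 5% supply.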

Then CMOS was introduced, and it could work well with supply voltages down to roughly 3 V. People still wanted to leave a little headroom.
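One hedged way to see why 3 V itself was marginal: a CMOS gate needs the supply to comfortably exceed the sum of the NMOS and PMOS threshold voltages, and whatever is left over is the "overdrive" that sets speed and robustness. The threshold values below are era-typical guesses, purely illustrative:

```python
# Rough illustration: why ~3 V was considered marginal for early CMOS.
# Threshold voltages are assumed era-typical values, not measurements.
V_TN = 0.8   # NMOS threshold voltage (assumed)
V_TP = 0.8   # |PMOS threshold voltage| (assumed)

def overdrive(vdd):
    # Supply voltage left over after both thresholds are paid for;
    # more overdrive means faster switching and better noise margins.
    return vdd - (V_TN + V_TP)

for vdd in (3.0, 3.3):
    print(f"Vdd = {vdd} V -> overdrive budget {overdrive(vdd):.1f} V")
```

The extra 0.3 V of overdrive from 3.3 V is a meaningful fraction of the budget at these threshold values.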

So my guess as to why it's 3.3 V, and not 3.15 V or 3.4 V: they picked a voltage for which voltage regulators were already sitting in the drawers. Interestingly, 3.3 V had already been (one) supply voltage of the Apollo Guidance Computer, so NASA and the early semiconductor companies had already poured money into building hardware for it.

TL;DR:

- More than 3 V: needed, because 3 V was just a tiny bit too low for reliable CMOS logic gates at the time.
- 3.3 V specifically: probably because hardware for that voltage already existed in the early 1970s.