Electronic – Comprehension problems with op-amps


The common differential amplifier takes two signals and amplifies the difference between them. I am aware that this difference is usually only a few millivolts or microvolts, so, to put it in engineering terms, it is effectively zero.

Everywhere, in textbooks and on the internet, the mathematical calculations are based on the assumption that the voltages at the inverting and non-inverting terminals are exactly equal to each other. But conceptually that should give a zero output voltage, which can't be the point, because then this circuit would be amplifying nothing.

Am I fundamentally wrong here?

Best Answer

Typical opamps have DC gains in the 10^5 to 10^6 region. This means that if the output is within the rails, say 0 to 5 V, or +/-15 V, then the differential input voltage will indeed be measured in microvolts.
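To put numbers on that, here is a minimal sketch (the gain and output values are just the typical figures mentioned above): the differential input the opamp needs to sustain a given output is simply the output divided by the open-loop gain.

```python
def required_input_uV(v_out, open_loop_gain):
    """Differential input voltage (in microvolts) needed to sustain an
    output of v_out volts, given the opamp's open-loop DC gain."""
    return v_out / open_loop_gain * 1e6

# For a 5 V output: 50 uV of input at A = 1e5, only 5 uV at A = 1e6.
for gain in (1e5, 1e6):
    print(f"A = {gain:.0e}: {required_input_uV(5.0, gain):.1f} uV")
```

Either way, the input really is "microvolts against volts", which is why treating it as zero works so well.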

As you say, this is sufficiently close to zero to be deemed to be zero for many purposes.

One of the purposes for which input = 0 V is a good enough approximation is solving for the DC gain of an amplifier with feedback. Typically, several resistors will draw current from input and output voltages measured in volts, and sum their currents at one of the amplifier input terminals. Whether that terminal sits at 0 µV or 10 µV is irrelevant for most purposes, as the error is parts per million.
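The "parts per million" claim can be checked directly. Below is a sketch for the standard inverting amplifier (the resistor values are arbitrary examples): solving the node equation at the inverting input with a finite open-loop gain A gives G = -Rf / (Rin + (Rf + Rin)/A), which collapses to the familiar -Rf/Rin as A goes to infinity.

```python
def inverting_gain(r_in, r_f, a_ol=float("inf")):
    """Closed-loop gain of an inverting amplifier.

    With finite open-loop gain a_ol, the exact result is
    -r_f / (r_in + (r_f + r_in) / a_ol); with a_ol infinite it is
    the ideal -r_f / r_in.
    """
    if a_ol == float("inf"):
        return -r_f / r_in
    return -r_f / (r_in + (r_f + r_in) / a_ol)

ideal = inverting_gain(1e3, 10e3)            # ideal gain of -10
real = inverting_gain(1e3, 10e3, a_ol=1e6)   # finite open-loop gain
error_ppm = abs((real - ideal) / ideal) * 1e6
print(f"gain error: {error_ppm:.1f} ppm")    # ~11 ppm
```

So with A = 10^6 the gain of a nominally -10 amplifier is off by roughly eleven parts per million, exactly the scale of error being dismissed here.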

Typically, amplifier input offset errors are measured in millivolts, so in accurate systems we have to worry about input offsets long before we have to worry about whether the input voltage is really zero or not.
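A quick sketch makes the scale difference concrete (the 1 mV offset and resistor values are assumed typical figures, not from any particular part): the input offset is amplified by the noise gain 1 + Rf/Rin, so it shows up at the output hundreds of times larger than the microvolt-level input the finite gain requires.

```python
r_in, r_f = 1e3, 10e3
noise_gain = 1 + r_f / r_in          # offset is amplified by 1 + Rf/Rin = 11
v_offset = 1e-3                      # assumed 1 mV input offset voltage

offset_output_error = v_offset * noise_gain   # 11 mV of output error
finite_gain_input = 5.0 / 1e5        # ~50 uV input for 5 V out at A = 1e5

print(f"offset error at output: {offset_output_error * 1e3:.1f} mV")
print(f"finite-gain input term: {finite_gain_input * 1e6:.1f} uV")
```

The millivolt-scale offset error swamps the microvolt-scale finite-gain term, which is why the offset is the first thing to worry about in a precision design.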

For an ideal opamp, where the gain is infinite, the input voltage is theoretically exactly zero.