Subtracting a differential signal (millivolt range) on top of a common-mode voltage (volts) with an op-amp – very inaccurate

Tags: common-mode, differential, operational-amplifier

I built a subtractor circuit as shown here with 10 kΩ for each resistor. Now I want to measure current by putting a shunt between the two inputs and letting a current flow through it (see here, figure 4); the only difference is that my reference voltage is not GND but 8 V.

The voltage across the shunt is in the 10 mV range, while the common-mode voltage swings from GND to about 20 V.

Ideally this should happen (e.g. 10 mA current, 1 Ω shunt resistor): 10 mA × 1 Ω = 10 mV across the shunt. Let R_Load be 100 Ω, so 100 Ω × 10 mA = 1 V. We would then have 1 V at the lower end of the shunt and 1.01 V at the upper end, a difference of 10 mV. Since all resistors in the difference amplifier (DA) are equal and the reference is 8 V, we should get 8 V + 10 mV at the output, right?
If we changed the load to, say, 1 kΩ, we would have a common-mode voltage of 10 V at 10 mA, and the output of the DA should still be the same, 8.01 V, right?
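A minimal numerical sketch of that ideal case (assuming a unity-gain subtractor with perfectly matched resistors, an ideal op-amp, a 1 Ω shunt and the 8 V reference from the question) confirms the reasoning: with perfect matching the output depends only on the shunt voltage, not on the common-mode voltage.

```python
# Ideal unity-gain difference amplifier measuring a shunt.
# Assumptions (mine, not from the post): perfectly matched resistors,
# ideal op-amp, Rshunt = 1 ohm, Vref = 8 V.

def ideal_da_output(i_load, r_shunt=1.0, v_ref=8.0):
    """Output of an ideal unity-gain difference amplifier across the shunt."""
    v_diff = i_load * r_shunt          # voltage across the shunt
    return v_ref + v_diff              # gain of 1, referred to Vref

for r_load in (100.0, 1000.0):         # the two load cases from the question
    i = 10e-3                          # 10 mA in both cases
    v_cm = i * r_load                  # common-mode voltage at the shunt
    print(f"R_load={r_load:6.0f} ohm  Vcm={v_cm:5.2f} V  Vout={ideal_da_output(i):.3f} V")

# Both cases print Vout = 8.010 V: with perfectly matched resistors the
# common-mode voltage drops out and only the 10 mV shunt voltage remains.
```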
But this is not the case: different common-mode voltages (CMVs) result in drastically different output voltages (that is, the difference across the shunt is not multiplied by one for all CMVs but varies heavily).

Why is that?

I suspected that the common-mode rejection of the op-amp was not good, but: using a TL082 we have 80 dB CMRR, i.e. a factor of 10^4 = 10,000, which should mean that the common-mode voltage added to the difference signal has an effect of only 2 mV for a CMV swing of 20 V, right?
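That estimate checks out on paper, assuming the datasheet CMRR figure applies over the whole swing (in practice CMRR degrades with frequency and near the supply rails):

```python
# Back-of-envelope check of the CMRR estimate from the question.
cmrr_db = 80.0
cmrr_ratio = 10 ** (cmrr_db / 20)        # 80 dB -> factor of 10,000
v_cm_swing = 20.0                        # common-mode swing from the question

error_rti = v_cm_swing / cmrr_ratio      # error referred to the input
print(f"CMRR ratio: {cmrr_ratio:.0f}, input-referred error: {error_rti * 1e3:.1f} mV")
# -> 2.0 mV over a 20 V common-mode swing, matching the 2 mV in the question.
```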

However, the error at the output is far larger than that (uselessly large).
Does anyone have an explanation for this?

When I used an INA122 instrumentation amplifier, the problem was completely solved. Still, I don't understand the cause.

Best Answer

Consider the effect of a small mismatch between the resistor values (actually the mismatch between two resistor ratios is what is important).

If you used 1% resistors, the non-inverting input can be 1% off worst case, and the feedback likewise, so you could see an error of as much as 2% of the common mode voltage. At 20V, that's 400mV, or 40x your entire full-scale signal. Chances are it would be somewhat less in practice, but probably more than 1000% error.

While you could buy very expensive 0.01% resistor networks, you'd still see on the order of 4 mV of error (a large fraction of your full-scale signal), and even a tiny shift with temperature would cause a huge error at the output. That's why that kind of differential amplifier is generally not used when the signal is small compared to the common-mode voltage: the sensitivity to component tolerances is far too great.
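A quick brute-force sketch of the mismatch effect (assuming an ideal op-amp, the 8 V reference from the question, and each 10 kΩ resistor pushed to a worst-case extreme of its tolerance) puts numbers on both cases:

```python
# Worst-case common-mode error of the four-resistor subtractor due to
# resistor tolerance, by brute force over all tolerance-extreme combinations.
# Assumptions (mine): ideal op-amp, Vref = 8 V, pure common-mode input.
from itertools import product

def subtractor_out(v1, v2, r1, r2, r3, r4, v_ref=8.0):
    """Output of the four-resistor difference amplifier (ideal op-amp)."""
    v_plus = v_ref + (v2 - v_ref) * r4 / (r3 + r4)   # divider referred to Vref
    return v_plus * (1 + r2 / r1) - v1 * (r2 / r1)

v_cm = 20.0                               # worst-case common-mode voltage
for tol in (0.01, 0.0001):                # 1 % and 0.01 % resistors
    worst = 0.0
    for signs in product((-1, 1), repeat=4):
        r1, r2, r3, r4 = (10e3 * (1 + s * tol) for s in signs)
        # pure common-mode input: the ideal output would be exactly Vref = 8 V
        err = abs(subtractor_out(v_cm, v_cm, r1, r2, r3, r4) - 8.0)
        worst = max(worst, err)
    print(f"{tol * 100:5.2f} % resistors: worst-case CM error ~ {worst * 1e3:6.1f} mV")

# Prints roughly 240 mV for 1 % parts and 2.4 mV for 0.01 % parts here; the
# 400 mV back-of-envelope figure above assumes a ground reference, whereas
# with Vref = 8 V the mismatch error rides on (Vcm - Vref) = 12 V. Either way
# it swamps or badly corrupts the 10 mV full-scale shunt signal.
```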

You could use the classic three-amplifier in-amp configuration and avoid buying a commercial part; the key is to have enough amplification in the first stage (set by R1/Rgain) that the resistor ratios in the output amplifier are not too critical. Also, watch out for saturation.

(schematic: classic three op-amp instrumentation amplifier)
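A rough error budget, assuming the first stage has differential gain G1 = 1 + 2·R1/Rgain but a common-mode gain of 1, and the output subtractor is unity-gain with 1% resistors, shows why that first-stage gain helps:

```python
# Why first-stage gain relaxes the output-stage matching requirement in the
# three op-amp in-amp. Assumptions (mine): first stage amplifies only the
# differential signal (common-mode gain of 1); output subtractor is unity
# gain with a worst-case common-mode leakage of ~2 * tolerance.

v_diff = 10e-3          # 10 mV shunt signal
v_cm = 20.0             # worst-case common-mode voltage
tol = 0.01              # 1 % resistors in the output subtractor

for g1 in (1, 10, 100):                  # first-stage differential gain
    signal = g1 * v_diff                 # only the differential part is amplified
    cm_error = 2 * tol * v_cm            # output-stage CM leakage, unchanged by G1
    print(f"G1={g1:3d}: signal {signal * 1e3:7.1f} mV, CM error {cm_error * 1e3:5.1f} mV, "
          f"relative error {cm_error / signal * 100:7.1f} %")

# Relative error drops from 4000 % at G1 = 1 to 40 % at G1 = 100: every factor
# of first-stage gain relaxes the output-stage matching requirement by the
# same factor. Mind the headroom, though: the first-stage outputs sit near
# Vcm + G1 * v_diff / 2, which is where saturation bites.
```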

Also, this is not a huge source of error, but note that the amplifier inputs draw some current. If you put the inverting input on the low side of the shunt, the load looks like about 20 kΩ to ground (1 mA at 20 V), so the shunt will measure that current as well (but not the similar current the other input draws, since that one doesn't flow through the shunt). Instrumentation amplifiers have high-impedance inputs, so that error current will likely be some 5 to 7 orders of magnitude smaller.
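To put a rough number on that, here is a small sketch; the 20 kΩ figure is the simplification used above, and the instrumentation-amplifier input resistance is an assumed order-of-magnitude value, not a datasheet spec:

```python
# Size of the input-loading error: current the amplifier input path pulls
# through the shunt, compared with the 10 mA being measured.
# Assumptions (mine): subtractor input path ~20 kohm to ground as stated;
# in-amp input taken as ~10 Gohm (assumed order of magnitude, not INA122 data).

v_cm = 20.0        # common-mode voltage at the shunt
i_load = 10e-3     # the 10 mA actually being measured

for name, r_in in (("four-resistor subtractor", 20e3),
                   ("instrumentation amplifier", 10e9)):
    i_extra = v_cm / r_in              # input current that also flows through the shunt
    print(f"{name:26s}: extra {i_extra * 1e3:10.6f} mA "
          f"({i_extra / i_load * 100:10.6f} % of the measured current)")

# -> about 1 mA for the plain subtractor versus about 2 nA for the in-amp,
#    several orders of magnitude less, as noted above.
```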