Electronic – How to calculate the input offset current's effect on a measured voltage

Tags: current, input offset

Here is the datasheet: http://www.mccdaq.com/pdfs/manuals/PCI-DAS6034-35-36.pdf

A data-acquisition board's differential amplifier measures a constant voltage across a resistor R1, as shown in the figure below:

[Figure: differential amplifier measuring the voltage across sense resistor R1]

Ib+ and Ib- are the input bias currents; their value, +/-200 pA, is given on page 30 of the datasheet.

I'm trying to quantify the error introduced by the input offset current in terms of voltage.

Put another way: if the true potential difference across R1 is V1, what error does the input offset current introduce, expressed in terms of V1?

Best Answer

Input offset current is an input referred term that develops a voltage across the input resistors, and is usually calculated for a no-input condition (so a static analysis).

The input offset current is precisely what it says: the difference in input current between the two inputs. In this case, that difference in input currents will develop a voltage across the input resistance.

In your case, assuming Rdiff is very large compared with the sense resistor, the worst-case error voltage is 200 pA * R1.

With R1 = 250 ohms, that yields a +/-50 nV voltage difference at the inputs of the amplifier. This will be multiplied by the gain of the amplifier to yield an output offset error.
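The arithmetic above can be sketched as follows. The 250-ohm sense resistor and gain of 100 are assumptions for illustration (250 ohms is the value implied by the 50 nV figure, and is typical for a 4-20 mA loop); the +/-200 pA spec comes from the datasheet.

```python
# Worst-case input-referred error voltage from the input offset current.
# Assumed values: R1 = 250 ohm sense resistor, G = 100 amplifier gain.
I_OS = 200e-12   # worst-case input offset current, amps (datasheet, p. 30)
R1 = 250.0       # sense resistor, ohms (assumed)
G = 100          # amplifier gain (assumed, for illustration)

v_offset = I_OS * R1          # error voltage developed at the inputs
print(f"{v_offset * 1e9:.0f} nV")       # -> 50 nV

# Referred to the output, the error is multiplied by the gain:
print(f"{v_offset * G * 1e6:.0f} uV")   # -> 5 uV
```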

See this article for a thorough discussion (search for offset).

Update: made notes about the effect in this circuit.

Your circuit is measuring a 4 - 20mA loop. Even at minimum current of 4mA through the sense resistor, the error due to input offset current is -146dB (or 0.05 parts per million, if you prefer). With such a small error relative to the current you intend to sense, it is a non-issue here.
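A quick check of the -146 dB / 0.05 ppm figures, under the same assumptions as before (R1 = 250 ohms, minimum loop current 4 mA):

```python
import math

# Offset-current error relative to the minimum loop signal.
R1 = 250.0       # sense resistor, ohms (assumed)
I_min = 4e-3     # minimum loop current, amps
I_OS = 200e-12   # worst-case input offset current, amps

v_signal = I_min * R1   # 1.0 V at the amplifier inputs at 4 mA
v_error = I_OS * R1     # 50 nV worst-case error

ratio = v_error / v_signal
print(f"{ratio * 1e6:.2f} ppm")            # -> 0.05 ppm
print(f"{20 * math.log10(ratio):.0f} dB")  # -> -146 dB
```

Note that the resistor value cancels in the ratio: the relative error is just I_OS / I_min, so it holds for any sense-resistor choice.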

You are doing the right thing by checking, though: input offset voltage, input offset current and input bias current are all sources of error, and they can cause real difficulty in high-gain circuits.