DC Bias and Op Amp

Tags: dc, operational-amplifier, voltage-divider

[Schematic created using CircuitLab]

I want to add a DC offset of 1.25 V to a sine wave signal. I am using a DC bias circuit built from a voltage divider (R1 = R2 = 100 kΩ) connected between a 2.5 V source and ground. That gives me the 1.25 V DC offset I need for my application.

However, I would like to lower the output impedance, so I am buffering the divider with an op amp voltage follower. I am using the LT6202, which is specified as rail-to-rail and unity-gain stable. When I connect the voltage follower to the DC bias circuit described above, the DC offset rises from 1.25 V to 1.32 V. I have run a simulation and get the same result.

When I use R1 = R2 = 10 kΩ instead of 100 kΩ, the DC offset becomes 1.258 V, which is closer to the desired 1.25 V.

I am not sure what is causing that. Any help?

Best Answer

Non-ideal op amps have input bias current - that is, they source or sink a small current at their inputs. If you look at the LT6202 datasheet, you will see that with the common-mode voltage near half the supply (roughly the case here), the input bias current is specified between −7 µA and −1.3 µA. The op amp sources this current into your resistor divider, which raises the node voltage and shifts the setpoint; the effect is equivalent to having a smaller resistor in the top half of your divider.
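As a rough sanity check, the voltage error is just the bias current times the Thevenin resistance of the divider (R1 || R2). Here is a minimal sketch of that arithmetic, assuming a bias current of about 1.4 µA flowing out of the input (a value inside the datasheet range, chosen to match your measurements):

```python
# Sketch: estimate the DC offset shift caused by op amp input bias current.
# Assumptions: Vs = 2.5 V supply, R1 = R2, and Ib ~ 1.4 uA sourced out of the
# op amp input into the divider node (within the LT6202's specified range).

def biased_divider_output(vs, r1, r2, ib):
    """Divider output with a current ib injected into the midpoint node."""
    v_ideal = vs * r2 / (r1 + r2)       # unloaded divider voltage
    r_thevenin = r1 * r2 / (r1 + r2)    # source resistance seen by the op amp input
    return v_ideal + ib * r_thevenin    # injected current raises the node voltage

ib = 1.4e-6  # assumed bias current in amps, sourced out of the input

for r in (100e3, 10e3):
    v = biased_divider_output(2.5, r, r, ib)
    print(f"R1 = R2 = {r/1e3:g} kOhm -> node voltage ~ {v:.3f} V")

# Prints roughly:
#   R1 = R2 = 100 kOhm -> node voltage ~ 1.320 V
#   R1 = R2 = 10 kOhm -> node voltage ~ 1.257 V
```

Those two numbers line up with the 1.32 V and 1.258 V you are seeing, which is a good sign the bias current is the culprit.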

You can use lower-value resistors to reduce the error - at the cost of more current wasted in the divider - or choose an op amp with a lower input bias current.
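To put rough numbers on that tradeoff, here is a quick sketch using the same assumed ~1.4 µA bias current as above (both columns are approximations):

```python
# Sketch: error vs. wasted current for an equal-resistor divider on 2.5 V,
# assuming ~1.4 uA of bias current sourced into the divider node.

VS, IB = 2.5, 1.4e-6

for r in (100e3, 10e3, 1e3):
    divider_current = VS / (2 * r)   # quiescent current through R1 + R2
    offset_error = IB * (r / 2)      # Ib times the Thevenin resistance R/2
    print(f"R = {r/1e3:g} kOhm: divider draws {divider_current*1e6:.1f} uA, "
          f"offset error ~ {offset_error*1e3:.1f} mV")
```

So dropping from 100 kΩ to 10 kΩ cuts the error by a factor of ten (70 mV down to 7 mV) while raising the divider's quiescent current from 12.5 µA to 125 µA; pick whichever resistor value keeps both within your budget, or switch to a CMOS-input op amp whose bias current is in the picoamp range.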