Electronic – Why is it a bad idea to divide a reference voltage with only resistors instead of using an op-amp buffer?

reference-voltage

I'm planning on shifting the output of an AD8226 instrumentation amplifier by about 0.5 V using a 2.5 V precision reference and a voltage divider. According to the datasheet this is a bad idea and a buffer must be used:

[AD8226 datasheet excerpt on driving the REF terminal]

I understand that if I don't use a buffer I'll probably end up with a slightly different voltage shift and a slight increase in gain. I'm guessing these can be calibrated out in software without a problem. If so, why should I use a buffer? Also, there is a note in the last line about CMRR degradation. I'm using the device to read a 0–10 V (very slowly changing) single-ended signal in an industrial environment. Will the CMRR degradation be significant?

P.S. The voltage divider uses two resistors: 20 kΩ and 4.7 kΩ.
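
For reference, a quick numeric check of that divider (a sketch using the values above; the variable names are mine):

```python
# Quick check of the unbuffered divider described above:
# 2.5 V reference into a 20 kOhm / 4.7 kOhm divider.

V_REF = 2.5          # precision reference voltage, volts
R_TOP = 20_000.0     # upper divider resistor, ohms
R_BOTTOM = 4_700.0   # lower divider resistor, ohms

# Unloaded divider output (the intended ~0.5 V shift)
v_out = V_REF * R_BOTTOM / (R_TOP + R_BOTTOM)

# Thevenin (source) impedance seen by the REF pin: the two resistors in parallel
r_thevenin = R_TOP * R_BOTTOM / (R_TOP + R_BOTTOM)

print(f"Divider output:     {v_out:.3f} V")         # ~0.476 V
print(f"Thevenin impedance: {r_thevenin:.0f} ohm")  # ~3800 ohm
```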

Best Answer

Because "for best performance, the source impedance to the REF terminal should be kept below 2 Ω".

If you made a resistive divider that met that requirement while still dividing the 2.5 V reference down to about 0.5 V, its Thevenin impedance (the two resistors in parallel) would have to be at most 2 Ω. That works out to roughly 10 Ω on top and 2.5 Ω on the bottom, about 12.5 Ω across the reference, which would draw around 200 mA (0.5 W) from it, far more power than you'd want to spend on this function. Your 20 kΩ/4.7 kΩ divider, by contrast, has a Thevenin impedance of about 3.8 kΩ, roughly 2000 times too high.
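
Here's a rough sketch of that arithmetic, assuming the ~0.5 V shift from the 2.5 V reference described in the question (the numbers are illustrative, not a component recommendation):

```python
# Sketch: what a plain divider would have to look like to meet the 2 ohm
# source-impedance recommendation while still producing ~0.5 V from 2.5 V.

V_REF = 2.5     # reference voltage, volts
V_OUT = 0.5     # desired shift at the REF pin, volts
R_TH_MAX = 2.0  # maximum source impedance per the datasheet, ohms

ratio = V_OUT / V_REF                      # divider ratio, 0.2
# Thevenin impedance of a divider: R_top || R_bottom = (1 - ratio) * R_bottom
r_bottom = R_TH_MAX / (1.0 - ratio)        # 2.5 ohm
r_top = r_bottom * (1.0 - ratio) / ratio   # 10 ohm
r_total = r_top + r_bottom                 # 12.5 ohm across the reference

current = V_REF / r_total                  # ~200 mA drawn from the reference
power = V_REF * current                    # ~0.5 W burned in the divider

print(f"R_top = {r_top:.1f} ohm, R_bottom = {r_bottom:.1f} ohm")
print(f"Reference load: {current * 1000:.0f} mA, {power:.2f} W")
```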

Why does the impedance need to be below 2 Ω?

It's explained in the text you posted:

[Same AD8226 datasheet excerpt as posted in the question]