Electronics – Voltage divider problem

impedance-matching, measurement, voltage-divider, voltage-measurement

I have a 900 kΩ–100 kΩ resistor voltage divider, which works fine: it scales the input voltage down by a factor of 10.
Now I want to offset the attenuated value by 1.65 V, i.e. I apply a bias voltage at the bottom of the voltage divider.
The schematic is simple:

[Schematic: Vin → 900 kΩ → Vout → 100 kΩ → 220 Ω → 1.65 V bias source. Created in CircuitLab.]

Now if I apply 10 V to Vin I expect to see 2.65 V at the output, but in practice I measure 2.45 V. Why is that?

At first I thought I might have my probe set to ×1, but no: on ×10 I read the same voltage. I have no idea where my 200 mV are lost.

My other thought was that the 220 Ω resistor is effectively part of the divider chain, making it a three-resistor divider of 900 kΩ, 100 kΩ and 220 Ω. But I checked: since it is orders of magnitude smaller than the others, it should drop about 0.002 V, not 0.2 V.
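To put a number on that, here is a minimal Python sketch (assuming the topology above, with an ideal 1.65 V bias source) that solves the output node by superposition:

```python
# Sketch: how much does the 220 ohm resistor really shift the output?
# Assumed topology: Vin --R1 (900k)-- Vout --R2 (100k)--R3 (220)-- 1.65 V (ideal source)
R1, R2 = 900e3, 100e3
VIN, VBIAS = 10.0, 1.65

def vout(r3):
    """Voltage at the R1/R2 junction, by superposition of the two sources."""
    total = R1 + R2 + r3
    return VIN * (R2 + r3) / total + VBIAS * R1 / total

shift = vout(220.0) - vout(0.0)
print(f"shift due to the 220 ohm resistor: {shift * 1e3:.2f} mV")  # ~1.7 mV, nowhere near 200 mV
```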

The question is: where are the 200 mV lost, and how do I make it "right", i.e. get an output of 2.65 V?
Just to clarify: without the offset I get exactly 0.9999999 V from the divider with the ×10 probe. I have checked the resistors multiple times, and all of them are practically within spec, <0.5 % off.

Best Answer

You have forgotten that you are no longer dividing to ground. You are dividing to 1.65 V.

\$ V_O = \frac {(V_I - 1.65)}{10} + 1.65 \$.

At \$ V_I = 10~V \$ you get \$ V_O = \frac {(10 - 1.65)}{10} + 1.65 = 2.485~V \$.
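As a quick numeric check, here is a minimal Python sketch of that transfer function (the 220 Ω is ignored, since it only shifts the result by a couple of millivolts):

```python
# 10:1 divider whose bottom leg returns to a 1.65 V bias instead of ground.
def vout(vin, vbias=1.65, ratio=10.0):
    """V_O = (V_I - vbias) / ratio + vbias"""
    return (vin - vbias) / ratio + vbias

print(vout(10.0))  # 2.485 -> close to the 2.45 V measured, far from the expected 2.65 V
```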

For a simple (in)sanity check, think about what happens when \$ V_I = 1.65~V \$ and then when \$ V_I = 0~V \$.
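Running those through the `vout()` sketch above: `vout(1.65)` returns 1.65 V (both ends of the divider sit at the same potential, so no current flows and the output equals the bias), while `vout(0.0)` returns 1.485 V rather than 0 V, which makes it plain the output is referenced to 1.65 V, not to ground.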