# Electronics – Why does the multimeter show a wrong voltage across a large resistor?

diodes

I am having a hard time answering one particular question about our experiment. In the experiment, R1 and R2 were set to 1 Meg each and later to 10k. I understand the need for R1 and R2 to some extent: without them, the voltage sharing wouldn't be exactly 50-50 between D1 and D2 because no two diodes are completely identical. D1 and D2 carry the same leakage current (without R1 and R2) since they are in series, but they probably have non-identical I-V curves, so that shared leakage current results in V@D1 ≠ V@D2.

The question I am struggling with is: why is V@R1 + V@R2 ≠ 10 V when R1 = R2 = 1 Meg? On the other hand, those two voltages do add up to 10 V when R1 = R2 = 10k. I included the 60 ohm source resistance in my diagram for completeness. However, as far as I can see, both D1 and D2 are reverse biased and thus present a very large reverse resistance, which should be much greater than 60 ohms. Even the parallel combination of 1 Meg and D1's reverse resistance should still be much greater than 60 ohms. I tried thinking of an answer in terms of RD1reverse//R1 = Req1 and RD2reverse//R2 = Req2. Req1 + Req2 (in series) should still be much more than 60 ohms, so I expected the full 10 V to show up at the D1 cathode node. Yet in our experiment, V@R1 + V@R2 < 10 V.
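As a quick numeric sanity check of the reasoning above (the 100 MOhm reverse resistance is an assumed, hypothetical value, not measured in the experiment):

```python
# Sanity check: with R1 = R2 = 1 Meg and a large assumed diode reverse
# resistance, the 60 ohm source resistance really is negligible, so the
# full 10 V should appear across the Req1 + Req2 chain.
def parallel(a, b):
    return a * b / (a + b)

R_source = 60.0    # source resistance from the diagram, ohms
R1 = R2 = 1e6      # the 1 Meg case
R_rev = 100e6      # assumed diode reverse resistance (hypothetical)

Req1 = parallel(R_rev, R1)
Req2 = parallel(R_rev, R2)

# Voltage that reaches the D1 cathode node:
v_chain = 10.0 * (Req1 + Req2) / (Req1 + Req2 + R_source)
print(v_chain)  # essentially the full 10 V
```

So the source resistance alone cannot explain the missing voltage, which is what makes the measurement puzzling.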

Can anyone point out where my thinking goes wrong? Any tips or a first-step hint would be much appreciated.

Edit: question answered, thanks to @CL. Assuming D1 and D2 are open during reverse bias (for simplicity) and noting that Rmultimeter = 10 Meg, the meter reads V@R2 = 10 V × (1Meg//10Meg) / ((1Meg//10Meg) + 1Meg + 60) ≈ 4.76 V, matching the measurement.
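The accepted explanation (meter loading) can be checked numerically; this sketch assumes the 10 Meg meter input resistance stated above and treats the diodes as open:

```python
# Meter loading: the 10 Meg input resistance of the multimeter sits in
# parallel with R2, forming a divider with R1 and the 60 ohm source.
def parallel(a, b):
    return a * b / (a + b)

R_source = 60.0
R_meter = 10e6

# 1 Meg case: the meter noticeably loads R2.
R1 = R2 = 1e6
v_meas = 10.0 * parallel(R2, R_meter) / (parallel(R2, R_meter) + R1 + R_source)
print(round(v_meas, 2))  # about 4.76 V, not 5 V

# 10k case: loading is negligible, so the reading is close to 5 V.
R1 = R2 = 10e3
v_meas_10k = 10.0 * parallel(R2, R_meter) / (parallel(R2, R_meter) + R1 + R_source)
print(round(v_meas_10k, 2))
```

This reproduces why V@R1 + V@R2 fell short of 10 V only in the 1 Meg case: each reading is taken with the meter loading the resistor under test.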