Electronic – Compensating input impedance

diodes | input-impedance | leakage-current | oscilloscope

I'm trying to measure the reverse leakage current of some diodes with the method proposed HERE: the diode is connected in reverse bias to 5 V through a 1 MΩ resistor, the voltage across the resistor is measured, and the leakage current is calculated from that voltage. However, I need to do this as accurately as I can, and as I understand it, the input impedance of the measurement device will distort my results significantly because it is comparable to the resistor's value. Instead of a voltmeter I'm using an oscilloscope (1 MΩ input impedance, 10x attenuation probe). My questions:

  1. How would one compensate for the change in measured voltage caused by the voltmeter's input impedance?
  2. Is there any difference when doing this with an oscilloscope instead?

If I'm doing something wrong here, it would be nice to know that as well. Any input is greatly appreciated!
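
For concreteness, this is a minimal sketch of the uncorrected calculation I'm doing (the 0.8 V reading is an illustrative placeholder, not a real measurement):

```python
# Naive leakage calculation, ignoring the loading of the measurement device.
R_SENSE = 1e6     # ohm, series sense resistor

v_measured = 0.8  # V, example voltage read across the sense resistor

# Ohm's law: all of the diode's leakage current is assumed to flow through R_SENSE.
i_leak_naive = v_measured / R_SENSE
print(f"naive leakage: {i_leak_naive * 1e9:.0f} nA")  # -> 800 nA
```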


EDIT: As I suspected, the way to go here is to treat the voltmeter/oscilloscope as a resistor in parallel with the load being measured, as Andy suggested. You should also check EM Fields' answer for measurement methods. HERE'S a well-written paper on the effect of measurement-device input impedance on your circuit.
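
A minimal sketch of that correction, assuming the scope presents 10 MΩ at the probe tip (a 10x probe into the 1 MΩ input); the 0.8 V reading is again just an illustrative placeholder:

```python
# Compensated leakage calculation: model the scope (10x probe into the 1 Mohm
# input, ~10 Mohm seen at the tip) as a resistor in parallel with the sense
# resistor.
R_SENSE = 1e6     # ohm, series sense resistor
R_METER = 10e6    # ohm, assumed resistance seen at the probe tip

v_measured = 0.8  # V, example voltage read across the sense resistor

# The diode's leakage current actually flows through R_SENSE || R_METER.
r_eff = R_SENSE * R_METER / (R_SENSE + R_METER)  # ~0.909 Mohm

i_leak = v_measured / r_eff
print(f"effective sense resistance: {r_eff / 1e6:.3f} Mohm")  # -> 0.909 Mohm
print(f"compensated leakage: {i_leak * 1e9:.0f} nA")          # -> 880 nA
```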

Best Answer

If the input impedance of the measurement device is (say) 10 MΩ, then you can treat that 10 MΩ as being in parallel with the 1 MΩ circuit resistance. This effectively lowers the 1 MΩ to about 0.909 MΩ (10 × 1 / (10 + 1) MΩ).
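
As a quick check of that figure, two resistors in parallel combine as R1·R2/(R1 + R2):

```python
def parallel(r1: float, r2: float) -> float:
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

# 10 Mohm meter input impedance across the 1 Mohm circuit resistance:
print(parallel(10e6, 1e6) / 1e6)  # -> 0.9090909... Mohm
```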

Try researching resistors in parallel.