Electronic – Why is there a difference in the voltage drop across the same resistor when using an oscilloscope and a multimeter

multimeter oscilloscope resistors voltage-drop

I am measuring the voltage drop across a 1 ohm resistor within a circuit with an oscilloscope and a multimeter.

The impedance of the oscilloscope is 1M ohm and the impedance of the multimeter is 10M ohm.

I am reading approx 5mV when using the multimeter and approx 2mV with the oscilloscope.

Which of these two values, if either, is the more accurate? And if neither is, how would I go about ensuring that the voltage drop measured across the resistor is accurate?

Edit –

Accuracy of equipment:

  • Oscilloscope

    • [3% × (|reading| + |offset|) + 1% × |offset| + 0.2 div + 2 mV], ≤ 100 mV/div
  • Multimeter

    • 1.0% + 3 counts

Settings used:

  • Oscilloscope
    • 10 mV/div
  • Multimeter
    • mV setting with resolution of 0.1mV

Best Answer

I would normally trust a DVM over an oscilloscope for accuracy, but given the published specifications we have:

DVM: reading 5 mV, mV-range resolution 0.1 mV, so 3 counts is 0.3 mV.

The specification states that the reading lies between:

\$ 0.99 \times \text{actual} - 0.3\text{ mV} \$ and \$ 1.01 \times \text{actual} + 0.3\text{ mV} \$

Rearranging this, we find the actual value lies between:

\$ \dfrac{\text{reading} - 0.3\text{ mV}}{1.01} \$ and \$ \dfrac{\text{reading} + 0.3\text{ mV}}{0.99} \$.

Making it between \$ 4.6535 \text{ mV} \$ and \$ 5.3535 \text{ mV} \$.
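For anyone who wants to check the arithmetic, here is a small Python sketch of the same calculation. The spec figures are taken from the question; the variable names are just illustrative.

```python
# DVM accuracy: 1.0% of reading + 3 counts, on the mV range (0.1 mV resolution)
reading_mV = 5.0                      # displayed reading
resolution_mV = 0.1                   # 1 count = 0.1 mV on this range
counts_error_mV = 3 * resolution_mV   # 3 counts = 0.3 mV
gain_error = 0.01                     # 1.0 % of reading

# The reading lies between 0.99*actual - 0.3 mV and 1.01*actual + 0.3 mV,
# so rearranging gives bounds on the actual voltage:
actual_low = (reading_mV - counts_error_mV) / (1 + gain_error)
actual_high = (reading_mV + counts_error_mV) / (1 - gain_error)
print(f"DVM: actual between {actual_low:.4f} mV and {actual_high:.4f} mV")
# -> approximately 4.6535 mV and 5.3535 mV
```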

Now for the oscilloscope, we have an offset, which is where 0 V sits on the screen. Let's assume it is in the middle, making \$ \text{offset} = 0 \text{ mV} \$.

We have a reading of 2 mV at 10 mV per division, so the 0.2-division term works out to 2 mV.

This gives us an actual voltage between:

\$ 0.97 \times \text{reading} - (2+2)\text{ mV} \$ and \$ 1.03 \times \text{reading} + (2+2)\text{ mV} \$.

That is between \$ -2.06 \text{ mV} \$ and \$ 6.06 \text{ mV} \$, so the scope reading is consistent with the DVM's 5 mV, but the DVM's bounds are far tighter and its reading is the one to trust here.
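And the corresponding sketch for the oscilloscope, again using the settings assumed above (10 mV/div, offset centred at 0 V):

```python
# Scope accuracy: 3%*(|reading| + |offset|) + 1%*|offset| + 0.2 div + 2 mV
reading_mV = 2.0                      # on-screen measurement
offset_mV = 0.0                       # assume 0 V is centred on screen
mV_per_div = 10.0                     # vertical scale in use

error_mV = (0.03 * (abs(reading_mV) + abs(offset_mV))
            + 0.01 * abs(offset_mV)
            + 0.2 * mV_per_div
            + 2.0)
actual_low = reading_mV - error_mV
actual_high = reading_mV + error_mV
print(f"Scope: actual between {actual_low:.2f} mV and {actual_high:.2f} mV")
# -> approximately -2.06 mV and 6.06 mV
```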