Measuring voltage across a 1 ohm resistor

Tags: current-measurement, vacuum-tube, voltage-measurement

This might be a very stupid question, but I can't seem to figure this out and I'm going nuts thinking about it.

Assume you are trying to adjust the fixed bias of a tetrode push-pull pair in a tube amp. In many amps, for convenience, the cathode is connected to ground through a 1 ohm resistor, so you can easily use a multimeter to measure the voltage across it and thus the current flowing through the tetrode.
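(To spell out why the 1 ohm value is convenient, Ohm's law gives

\$\$ I_\text{cathode} = \frac{V_\text{measured}}{1\,\Omega}, \$\$

so the millivolt reading is numerically equal to the cathode current in milliamps, e.g. 35 mV means 35 mA.)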

My confusion arises when trying to comprehend how the resistance of the multimeter probes themselves would contribute to this measurement.

I realize the probe leads are probably not thin or long enough to make any difference, but assume the resistance of each probe lead is also 1 ohm for this scenario.

Would it make any difference here?

Best Answer

If you were using an analog multimeter with, say, 20,000 \$\Omega\$/V sensitivity on a 0-150 mV full-scale range, the meter looks like approximately a 3 k\$\Omega\$ resistor (20,000 \$\Omega\$/V \$\times\$ 0.150 V = 3,000 \$\Omega\$).

Thus the voltage across the 1 \$\Omega\$ resistor will be a bit less than the unloaded value, since it is shunted by about 3 k\$\Omega\$ (3,002 \$\Omega\$ if you count the leads). The voltage making it to the 3,000 \$\Omega\$ meter movement is a bit less again, 3000/3002 of the voltage across the 1 \$\Omega\$ resistor, because 1/3002 of that voltage is dropped across each lead.

(Schematic: the 1 \$\Omega\$ cathode resistor with the ~3 k\$\Omega\$ meter movement connected across it through two 1 \$\Omega\$ leads. Original created in CircuitLab.)

In total then, in this example, the reading at the meter is about 0.1% lower than ideal, which is totally insignificant compared with the tolerance of the resistors and the accuracy of the meter.
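As a sanity check on that 0.1% figure, here is a minimal numeric sketch (the variable names are my own placeholders) that assumes the cathode current behaves like a stiff current source driving the 1 \$\Omega\$ sense resistor in parallel with the lead + meter + lead branch:

```python
R_sense = 1.0                  # cathode sense resistor, ohms
R_lead  = 1.0                  # each test lead, ohms (the deliberately pessimistic value)
R_meter = 20_000 * 0.150       # 20 kΩ/V * 150 mV full scale = 3000 ohms

R_branch = R_lead + R_meter + R_lead                    # 3002 ohms hung across the sense resistor
R_loaded = R_sense * R_branch / (R_sense + R_branch)    # 1 ohm shunted by 3002 ohms

# Fraction of the ideal (unloaded) voltage that the meter movement actually sees:
# the shunting effect times the divider formed by the leads and the movement.
fraction = (R_loaded / R_sense) * (R_meter / R_branch)

print(f"meter reads {fraction:.5f} of ideal -> {100 * (1 - fraction):.3f}% low")
# meter reads 0.99900 of ideal -> 0.100% low
```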

A typical digital multimeter has an input resistance of something like 10 M\$\Omega\$ on its 199 mV range, so the effect is several thousand times smaller; even with a very accurate resistor and meter it's negligible. Not to mention that real leads are probably closer to 0.2 \$\Omega\$ than 1 \$\Omega\$.
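Plugging typical DMM numbers into the same sketch (again just an estimate, assuming a 10 M\$\Omega\$ input resistance and 0.2 \$\Omega\$ leads) shows how far below anything measurable the error falls:

```python
R_sense, R_lead, R_meter = 1.0, 0.2, 10e6   # sense resistor, per-lead resistance, DMM input, ohms

R_branch = R_lead + R_meter + R_lead
R_loaded = R_sense * R_branch / (R_sense + R_branch)
fraction = (R_loaded / R_sense) * (R_meter / R_branch)

print(f"reading is {100 * (1 - fraction):.6f}% low")
# reading is 0.000014% low -- several thousand times less than the analog case
```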