Voltage drop when using voltmeter


It's a theoretical question. Is there any voltage drop when measuring a 12 V battery with a multimeter (voltmeter mode)? If we get a 12 V reading near the battery, does this change if we extend the multimeter's leads to 1 km?

Best Answer

A typical DMM has a very high (but not infinite) input impedance, typically ~10 MΩ or higher.

Now suppose you have very long leads. These will also have some finite resistance, forming a voltage divider with the meter's input impedance. (I'm lumping all of the lead resistance on one side of the multimeter; mathematically this is equivalent to having half the length on each side.)

\begin{equation} V_{out} = \frac{R_{DMM}}{R_{DMM} + R_{leads}} V_{in} \end{equation}

Computing the equivalent lead resistance for 1 km of 24 AWG wire on each side (2 km total), we get \$R_{leads} \approx 166\ \Omega\$.

Then with a 10 MΩ DMM input impedance, \begin{equation} \frac{V_{out}}{V_{in}} = 0.9999834 \end{equation} or an error of 0.00166%. You'd be hard pressed to even measure this error with most multimeters, and errors from other sources will swamp any error due to the voltage drop in the leads.
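As a quick sanity check, the divider arithmetic above can be reproduced in a few lines of Python (using the values from this answer):

```python
# Voltage divider error for long DMM leads, using the values above
R_dmm = 10e6      # DMM input impedance, 10 Mohm
R_leads = 166.0   # ~1 km of 24 AWG wire on each side (2 km total)
V_in = 12.0       # battery voltage

ratio = R_dmm / (R_dmm + R_leads)   # V_out / V_in
V_out = ratio * V_in
error_pct = (1 - ratio) * 100

print(f"ratio     = {ratio:.7f}")      # ~0.9999834
print(f"V_out     = {V_out:.6f} V")    # ~11.999801 V
print(f"error     = {error_pct:.5f} %")
```

So the meter would read about 11.9998 V instead of 12 V, a difference far below the resolution of a typical handheld DMM.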

There is slightly more error if you include the battery's internal resistance, but it is still not significant.