An ideal voltmeter (which does not exist in real life) has infinite resistance, so no current flows when a measurement is made. In that case there would be no current in the series resistor, hence no voltage drop, and the meter would still read the full battery voltage.

A real voltmeter has finite resistance. A cheap meter may have a 1 megohm resistance, perhaps even 100 kohm, but there is no good reason for a digital meter to be under 1 megohm. When connected, current flows through the meter. This has minimal effect on the battery voltage, but the current causes a voltage drop across the series resistor, and the meter reads low by that amount. Since the same current flows through the meter and the series resistor, and since V = I × R, the voltages across the resistor and the meter are in proportion to their resistances.

Say Rtotal = Rmeter + Rseries.

All the voltage will drop across Rtotal (of course).

So the proportion across the resistor will be Vbattery x Rseries/Rtotal.

And the proportion across the meter will be Vbattery x Rmeter/Rtotal.

If Rmeter = 1,000,000 ohms and Rseries = 1,000 ohms then

Rmeter/Rtotal = Rmeter / (Rseries + Rmeter)

= 1,000,000 / (1,000 + 1,000,000) = 1,000,000/1,001,000
= 0.999000999 ~= 0.9990 of original

i.e. for Rseries small compared with Rmeter, the fractional drop in the reading will be approximately Rseries/Rmeter.

So a 100 kohm resistor with a 1 megohm meter will cause about a 10% drop.
A 20 kohm resistor will cause a ~2% drop, etc.
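The worked numbers above can be checked with a short script (a sketch; the function name `meter_fraction` is just a label chosen here):

```python
# Loading error of a real voltmeter, modeled as a voltage divider:
# the meter reads Rmeter / (Rmeter + Rseries) of the battery voltage.

def meter_fraction(r_meter, r_series):
    """Fraction of the battery voltage the meter actually reads."""
    return r_meter / (r_meter + r_series)

R_METER = 1_000_000  # 1 megohm meter input resistance, as in the text

for r_series in (1_000, 20_000, 100_000):
    frac = meter_fraction(R_METER, r_series)
    print(f"Rseries = {r_series:>7} ohm -> reads {frac:.4%} "
          f"of battery voltage ({1 - frac:.2%} low)")
```

For 1 kohm this prints the 0.9990 figure from the example; the 20 kohm and 100 kohm cases come out near 2% and 10% low, matching the rule of thumb above.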

So, in most cases, things like contact resistance and wiring resistance will have minimal effects on voltage measurements as long as there are no other currents flowing to affect the reading.

V1, R1, and LAMP1 are connected in parallel in your circuit. That means the voltage across R1 equals V1, or 12 volts, and likewise the voltage across LAMP1 equals V1, or 12 volts. Thus the current through R1 is 12 volts / 100 ohms, or 120 milliamperes, and the current through LAMP1 is also 12 volts / 100 ohms, or 120 milliamperes. Note that the current through R1 is the same as the current through LAMP1 because their resistances are the same, not because they are connected in parallel. In a practical circuit, the interconnecting wires would have some small resistance (small compared to 100 ohms), so the voltage across R1 would be slightly less than V1, and the voltage across LAMP1 would be slightly less than the voltage across R1 due to the drop in the wires between them.
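The currents in that circuit can be sketched as follows (component names V1, R1, and LAMP1 are from the question; the lamp is treated here as a fixed 100 ohm resistance, which is an idealization):

```python
# Parallel circuit from the question: V1, R1, and LAMP1 all share
# the same two nodes, so each sees the full supply voltage.

V1 = 12.0        # supply voltage, volts
R1 = 100.0       # resistor, ohms
R_LAMP1 = 100.0  # lamp modeled as a plain 100-ohm resistance (assumption)

# Ohm's law applied to each parallel branch independently.
i_r1 = V1 / R1
i_lamp1 = V1 / R_LAMP1

print(f"I(R1)    = {i_r1 * 1000:.0f} mA")
print(f"I(LAMP1) = {i_lamp1 * 1000:.0f} mA")
```

Both branches carry 120 mA, and as noted above that is because the two resistances happen to be equal, not because the branches are in parallel.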

## Best Answer

A typical DMM has a very high (but not infinite) input impedance, typically 10 MΩ or more.

Now suppose you have very long leads. These will also have some finite resistance, forming a voltage divider with the meter's input impedance. (I am lumping all of the lead resistance on one side of the multimeter; mathematically this is equivalent to having two leads of half that length, one on each side.)

\begin{equation} V_{out} = \frac{R_{DMM}}{R_{DMM} + R_{leads}} V_{in} \end{equation}

Computing the equivalent lead resistance for 1 km of 24 AWG wire on each side, we get \$R_{leads} = 166 \Omega\$.

Then with a 10 MΩ DMM resistance, \begin{equation} \frac{V_{out}}{V_{in}} = 0.9999834 \end{equation} or an error of 0.00166%. You would be hard pressed to measure this error even with most multimeters, and errors from other sources will swamp any error due to the voltage drop in the leads.
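Plugging the numbers from the text into the divider equation above (a sketch; the 166 Ω total lead resistance is taken as given):

```python
# Lead-resistance error for a DMM measurement, using the divider
# V_out / V_in = R_DMM / (R_DMM + R_leads) derived above.

R_DMM = 10e6     # 10 Mohm DMM input resistance
R_LEADS = 166.0  # total lead resistance from the text (1 km of 24 AWG per side)

ratio = R_DMM / (R_DMM + R_LEADS)
error_pct = (1 - ratio) * 100

print(f"Vout/Vin = {ratio:.7f}")
print(f"error    = {error_pct:.5f} %")
```

This reproduces the 0.9999834 ratio and the 0.00166% error quoted above.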

There is slightly more error if you add in the battery's internal resistance, but it is still not significant.