Voltage Drop Between Battery and Load


Why is the total length of wire in a circuit used to find the voltage drop between a battery and a load? Suppose the load is a distance \$l\$ away from the battery. Wouldn't the voltage drop between the battery and the load be due to a length \$l\$ of wire and not \$2l\$? I understand that the current flows through a total distance of \$2l\$ of wire, but if the load is only a distance \$l\$ away, why would it "see" a voltage drop due to \$2l\$ of wire? Is it possible to explain this with Ohm's law and a simple circuit diagram?

Best Answer

Let's say we have a 12 volt source and a 12 ohm load located 100 feet away from each other.

Let's further state that the wire connecting them has a resistance of 10 milliohms per foot, so - since each of the wires is 100 feet long - each wire will have a resistance of 1 ohm.

That's a total of 2 ohms of wire resistance which, being in series with the load's 12 ohms, gives a total circuit resistance of 14 ohms.
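Since the question asks for a simple circuit diagram, here is a rough plain-text sketch (not from the original answer, values taken from the example above). It shows why the load "sees" both lengths of wire: the outgoing and return conductors sit in the same series loop as the load, so every ampere through the load also flows through both wire resistances.

```
           R_wire = 1 ohm
     +------/\/\/\------+
     |                  |
   12 V               12 ohm
  source               load
     |                  |
     +------/\/\/\------+
           R_wire = 1 ohm
```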

Then, from Ohm's law we have:

$$ I = \frac{E}{R} = \frac{12\ \text{V}}{14\ \Omega} = 0.857\ \text{amperes} $$

and, since: $$ E = I R$$

the voltage drop in **each** wire will be:

$$ E = I R = 0.857\ \text{A} \times 1\ \Omega = 0.857\ \text{volts} $$

Then, since there are two wires, each 100 feet long, connecting the supply to the load, the total drop in the wires will be twice that, or about 1.7 volts, leaving roughly 10.3 volts across the load.
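If it helps to check the arithmetic, here is a minimal Python sketch of the same calculation, using the example's values (the variable names are just for illustration, not from the original answer):

```python
# Worked example: 12 V source, 12 ohm load, two 100 ft wires at 10 milliohms/ft.
V_SOURCE = 12.0          # volts
R_LOAD = 12.0            # ohms
WIRE_LENGTH_FT = 100.0   # one-way distance, feet
R_PER_FT = 0.010         # ohms per foot of wire

r_one_wire = WIRE_LENGTH_FT * R_PER_FT   # 1 ohm per conductor
r_total = R_LOAD + 2 * r_one_wire        # both conductors are in series: 14 ohms

current = V_SOURCE / r_total             # Ohm's law: I = E / R  ->  ~0.857 A
drop_per_wire = current * r_one_wire     # ~0.857 V in each conductor
drop_total = 2 * drop_per_wire           # ~1.714 V lost in the wiring
v_at_load = V_SOURCE - drop_total        # ~10.29 V actually across the load

print(f"Current:           {current:.3f} A")
print(f"Drop per wire:     {drop_per_wire:.3f} V")
print(f"Total wiring drop: {drop_total:.3f} V")
print(f"Voltage at load:   {v_at_load:.3f} V")
```

Running it prints roughly 0.857 A, 0.857 V per wire, 1.714 V of total wiring drop, and about 10.3 V remaining across the load.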
