Electronic – Why do we use low-resistance cables to minimize power losses


Generally, one would like to minimize power losses due to Joule heating. Intuitively, using low-resistance cables (for example, made of a low-resistivity material like copper) should do a better job than highly resistive cables.

However, this is not what I get mathematically. Here's my reasoning:

Let's assume that Ohm's law holds, so \$V = RI\$, and that our power source is a voltage source, so \$V\$ is constant and fixed. Let's consider two cases, one with a resistance \$R\$ and the other with a resistance \$R/2\$, where "resistance" means the total resistance of the circuit, i.e. cables + load.
In the first case, the power dissipated is \$I^2R=V^2/R\$. In the second case, it is \$I^2(R/2)=2V^2/R\$, i.e. twice as much power is dissipated if we halve the total resistance of the circuit. This seemed counterintuitive to me at first, but when we halve the total resistance, we double the current, and since the power losses go like \$I^2R\$, the increase in current more than offsets the decrease in resistance, so the overall dissipated power increases. So I am led to wonder: why would one want to use less resistive cables? Wouldn't that mean a higher current, and thus a higher dissipated power in the circuit?
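
For concreteness, here is a minimal Python sketch of this calculation (the 10 V source and 4 Ω total resistance are arbitrary example values):

```python
# Fixed-voltage argument from above: same source, total resistance R then R/2.
V = 10.0                            # source voltage (V), held constant
for R_total in (4.0, 2.0):          # total resistance R, then R/2
    I = V / R_total                 # Ohm's law: I = V/R
    P = I**2 * R_total              # dissipated power, equal to V**2/R
    print(f"R = {R_total} ohm -> I = {I} A, P = {P} W")
# Halving R doubles I and doubles the total dissipated power,
# exactly as derived above.
```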

The only reason I have found so far is that the voltage available to the load is greater when the cables have a low resistance. I wonder whether that is the real reason we use low-resistance cables, because the circuit still seems to dissipate more power overall than if we had picked a material like iron instead of copper. So the premise of the question may be wrong, i.e. in reality we might not seek to minimize total power losses.
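
To see this split quantitatively, here is a small Python sketch separating the total resistance into cable and load (the specific resistances are made-up example values):

```python
# Voltage-divider view: the same source feeding a fixed load
# through cables of different resistance.
V = 10.0            # source voltage (V)
R_load = 4.0        # load resistance (ohm)
for R_cable in (2.0, 0.2):               # "iron-like" vs "copper-like" cable
    I = V / (R_cable + R_load)           # series circuit current
    V_load = I * R_load                  # voltage left for the load
    P_cable = I**2 * R_cable             # power wasted in the cable
    P_load = I**2 * R_load               # power delivered to the load
    print(f"R_cable = {R_cable}: V_load = {V_load:.2f} V, "
          f"P_load = {P_load:.2f} W, P_cable = {P_cable:.2f} W")
# The low-resistance cable wastes less power relative to what the load
# receives, even though total dissipation in the circuit is higher.
```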

Best Answer

You are treating the cables and the load as one "unit", but that does not make sense here.

Typically, your load requires a specific power, and for that it draws a specific current (related to the voltage by Ohm's law). This current then creates additional losses in the cables, and those losses, \$I^2 R_\text{cable}\$, are lower when the cable resistance is smaller.
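
A minimal Python sketch of that statement (the 10 A load current and the cable resistances are arbitrary example values):

```python
# Fixed-load-current case: the load draws the same current regardless
# of the cable, so only the cable resistance varies.
I_load = 10.0                         # current required by the load (A)
for R_cable in (0.5, 0.05):           # resistive vs low-resistance cable
    loss = I_load**2 * R_cable        # Joule loss in the cable only
    print(f"R_cable = {R_cable} ohm -> cable loss = {loss:.1f} W")
# With the current fixed by the load, a 10x smaller cable resistance
# gives a 10x smaller cable loss.
```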

Take, for example, a 500 W computer power supply. When the computer requires 500 W, the PSU will deliver this power independent of the input voltage, at least over some specified input voltage range. But resistance in the cable creates a voltage drop, so the power supply draws a higher current to still deliver the 500 W, and you get even higher losses in the cable.
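
Here is a Python sketch of this constant-power case (the 120 V source and the cable resistances are assumed example numbers). The load always draws \$P = I(V - I R_\text{cable})\$, so the current is the physical (smaller) root of \$I^2 R_\text{cable} - IV + P = 0\$:

```python
import math

V = 120.0       # source voltage (V)
P = 500.0       # power the PSU must deliver (W)

for R_cable in (1.0, 0.1):
    # Solve I**2 * R_cable - I * V + P = 0, taking the smaller-current root.
    disc = V**2 - 4 * R_cable * P
    I = (V - math.sqrt(disc)) / (2 * R_cable)   # current drawn from source
    loss = I**2 * R_cable                       # power wasted in the cable
    print(f"R_cable = {R_cable} ohm -> I = {I:.2f} A, "
          f"cable loss = {loss:.2f} W")
# Here lowering the cable resistance reduces both the current and the
# cable loss, because the delivered power is fixed rather than the
# total resistance: this is the realistic situation the question misses.
```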