Does a given size wire have a maximum power or a maximum current that it can transmit?

current, power

I've lost my 'intuitive feel' for current, voltage and power in a conductor (like an 18-gauge copper wire)…

The wire seems to be rated by maximum current rather than maximum power… Does that mean I can get a 'free ride' by using a high voltage in order to keep the wire size low when transmitting power?

Simple example: let's say I'm going to run an 18-gauge wire around my house to supply power to a bunch of 12 V devices… Can I actually support a maximum of 4x more devices if I supply 48 V in the wire system and then step it down at each destination?
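(As a quick sanity check of that arithmetic, assuming the wire's current rating is the only binding constraint: $$ P = VI \quad\Rightarrow\quad \frac{P_{48\,\mathrm{V}}}{P_{12\,\mathrm{V}}} = \frac{48 \times I_{max}}{12 \times I_{max}} = 4 $$ i.e. four times the power, and hence up to 4x the devices, at the same wire current.)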

Best Answer

As a general rule of thumb, the current determines the thickness of the wire, and the voltage determines the thickness (and/or material) of the insulation.

The power grid does pretty much what you propose - use much higher voltages to reduce cable size. The reduced current also means reduced line losses, which is very important over long distances.

For instance, if you had 1kW of power you wanted to move from A to B, you could use, say, 100A at 10V, or 10A at 100V.

For 100A power transmission you'd need 1 AWG wire. That has a diameter of 7.34822mm and a resistance of 0.406392Ω per km. So over a 1km distance you'd lose 0.406392 × 100 = 40.64V. Ouch. That would just plain not work: although the cable could physically cope with that current, over that distance you'd lose far more voltage than the 10V you started with. So that would be a no-go.

Try at 100V, 10A.

10A can go through 11AWG cable. That's 2.30378mm thick, with a resistance of 4.1328Ω/km. Much higher resistance, but much lighter cable. How much voltage would we lose over 1km? 41.328V. Factor in the return path, which doubles the distance, and you end up losing 82.656V, leaving 17.344V for the load. Getting there. Still not workable, but getting there. That equates to 173.44W.

How about if we pump it right up to 1000V, at just 1A? At 1A we can use 21AWG wire, at 0.7239mm thick. 41.984Ω/km, which would be 41.984V lost there, and 41.984V lost back. So 83.968V lost from your 1000, leaving 916.032V. That's 916.032W coming out.
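Here's a minimal Python sketch of the three cases above, counting the return path in all of them (the per-km resistances are the AWG figures quoted above; the variable names are just for illustration — note the first case already fails on the outbound leg alone):

```python
# Voltage drop and delivered power for the three 1 kW scenarios above,
# over a 1 km run with the return path included (2 km of cable total).
scenarios = [
    # (source volts, amps, wire, ohms per km)
    (10,   100, "1 AWG",  0.406392),
    (100,  10,  "11 AWG", 4.1328),
    (1000, 1,   "21 AWG", 41.984),
]

LOOP_KM = 2.0  # 1 km out + 1 km back

for volts, amps, wire, ohms_per_km in scenarios:
    drop = ohms_per_km * LOOP_KM * amps      # V = I * R
    delivered_v = volts - drop               # voltage left at the load
    delivered_w = delivered_v * amps         # P = V * I
    if delivered_v <= 0:
        print(f"{volts} V @ {amps} A ({wire}): drop {drop:.2f} V -- no-go")
    else:
        print(f"{volts} V @ {amps} A ({wire}): drop {drop:.2f} V, "
              f"{delivered_w:.2f} W delivered")
```

Running it reproduces the 173.44W and 916.032W figures worked out above.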

Now suppose you wanted to transmit 100A over a 1km distance and limit the voltage drop to, say, no more than 1V. What thickness of cable would you need for that? Well, for a 1V drop at 100A you would need a resistance of 1/100 = 0.01Ω. So your cable must have no more than 0.01Ω/km. The table I use doesn't go that low, so we'd need to do some calculations.

If we use copper wire, that has a resistivity (\$\rho\$) of \$1.68\times10^{-8}\ \Omega{\cdot}\mathrm{m}\$ at 20°C. Resistance follows from resistivity with the formula: $$ R=\frac{\rho L}{A} $$ where L is the length (1km), R is the resistance (0.01Ω) and A is the cross-sectional area.

So we can re-arrange that for A: $$ A=\frac{\rho L}{R} $$ and substitute our values: $$ A=\frac{1.68 \times 10^{-8} \times 1000}{0.01} $$ $$ A = 0.00168m^2 $$ For a round conductor, that equates to a wire diameter of \$2\sqrt{A/\pi}\approx\$ 4.6cm.
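As a cross-check, here's the same calculation as a short Python sketch (the variable names are mine):

```python
import math

rho = 1.68e-8       # resistivity of copper at 20 °C, in ohm·m
length_m = 1000.0   # 1 km run
r_max = 0.01        # max resistance for a 1 V drop at 100 A (R = V / I)

area = rho * length_m / r_max             # A = rho * L / R, in m^2
diameter = 2 * math.sqrt(area / math.pi)  # solid round conductor

print(f"area = {area * 1e6:.0f} mm^2")        # -> 1680 mm^2
print(f"diameter = {diameter * 100:.1f} cm")  # -> 4.6 cm
```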

Is a 5cm thick cable practical for that? Not if you can increase the voltage to decrease the current, no.

So for power transmission, raising the voltage does make it possible to reduce the cable size and cut the losses over very long distances.

Over shorter distances the losses are considerably smaller, but they can still be a problem at higher currents. Is it worth stepping the voltage up and down, though, or better to just use fatter cable?

You also have to factor in:

  • The efficiency of the power conversion - step up / step down.
  • The cost of better insulation if you have very high voltages.
  • The reduction in cost by using thinner cable.
  • Safety issues - higher voltages are dangerous.

So is it a "free" ride? No. There will always be losses and caveats to look out for. It can, though, get around the problems of transmitting power over longer distances.
