Electronic – How to determine the best diameter for a cable

amperage, cables, voltage, watts

As I understand it, a cable with too large a diameter can lead to power dissipation, and one with too small a diameter can lead to the same result.

Apparently the second case mostly comes up with high-frequency phenomena, while the first case, if I understood correctly, is mainly caused by the cable's material, because there will be a higher impedance.

What I don't really get:

  • Why can this happen? Is what I've understood true?
  • How can I calculate the correct diameter?
  • Are there differences between AC and DC in this scenario?

I have X volts and have to supply a maximum of Y watts. Where do I start in picking the best cable for the job?

Best Answer

You start by calculating the current: Y watts / X volts. The voltage is relevant for the cable's insulation, but not for the diameter. (That's not entirely true: if you work at really low voltages, the voltage drop due to the cable's resistance and a possibly high current may become significant. Usually not at mains voltages and higher, though.)
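
A minimal sketch of that first step in Python (the example numbers are illustrative, not from the question):

    def load_current(power_w, voltage_v):
        """Current drawn by a load of power_w watts at voltage_v volts (I = P / V)."""
        return power_w / voltage_v

    # Example: a 2400 W load on 230 V mains draws roughly 10.4 A.
    print(load_current(2400, 230))  # -> 10.43...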

Thicker cables have less resistance, so less power dissipation. I don't know where you read otherwise. This page has a calculator for the cable's required diameter. The same site also has tables for different kinds of cables.
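
Since the linked calculator isn't reproduced here, the following is a rough sketch of the voltage-drop math behind such calculators, assuming a copper conductor at room temperature; the resistivity value and the example limits are assumptions, not values from the linked site:

    import math

    RHO_COPPER = 1.68e-8  # resistivity of copper in ohm-metre (assumed, ~20 degrees C)

    def min_diameter_mm(current_a, length_m, max_drop_v):
        # Smallest copper conductor diameter (in mm) that keeps the
        # round-trip voltage drop below max_drop_v over a one-way run
        # of length_m metres, from R = rho * L / A.
        r_max = max_drop_v / current_a            # maximum allowed resistance, ohm
        area = RHO_COPPER * 2 * length_m / r_max  # factor 2: out and back
        return 2 * math.sqrt(area / math.pi) * 1000

    # Example: 10 A over 10 m, allowing a 2 % drop on 230 V (4.6 V):
    print(min_diameter_mm(10, 10, 0.02 * 230))  # ~0.96 mm; round up to a standard size

Note this only bounds the voltage drop; real cable tables also limit current density so the cable doesn't overheat, which is why they often prescribe a thicker wire than the drop alone would require.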

There's indeed a difference between AC and DC. AC is subject to the skin effect, where the current flows mostly near the outside of the conductor. That "skin" gets thinner as frequency increases, but it already exists to a small extent at 50/60 Hz. So an AC cable may need a somewhat larger diameter, though this skin depth calculator gives a skin depth of more than 9 mm for 50 Hz in copper, so that won't be a problem for most cables.
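
As a sanity check of that figure, here is a small sketch of the usual skin-depth formula, assuming copper and a non-magnetic conductor (relative permeability of 1):

    import math

    def skin_depth_mm(freq_hz, resistivity=1.68e-8, mu_r=1.0):
        # delta = sqrt(rho / (pi * f * mu0 * mu_r))
        mu0 = 4 * math.pi * 1e-7  # permeability of free space, H/m
        return math.sqrt(resistivity / (math.pi * freq_hz * mu0 * mu_r)) * 1000

    print(skin_depth_mm(50))   # ~9.2 mm for copper at 50 Hz
    print(skin_depth_mm(1e6))  # ~0.065 mm at 1 MHz, where skin effect dominates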