Electronic – Determining minimum safe wire width at high amps in a DC 12V system

12v, awg, current, dc, voltage

In a 12V DC system, given a particular maximum current and length of wire, what is the best method to determine the minimum width (gauge) of that wire? In particular, what method is best for high current levels and short lengths of wire?

The typical way to do this seems to be to calculate the wire size needed to keep the voltage drop below some figure, e.g. 2%, perhaps using an online calculator. However, if I understand correctly, the % voltage drop is not what actually matters for fire safety; what matters is the heat dissipated per unit length of wire as a result of that drop. Running 200A over 1 foot of 10 AWG wire produces roughly the same % voltage drop as running 20A over 10 feet of the same wire, but in the first case far more heat is generated per foot of wire, so presumably there is a greater risk of melted insulation and fire.
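A quick numerical check of that comparison, as a minimal sketch. It assumes a nominal resistance of about 1 mΩ per foot for 10 AWG copper (a round figure, not taken from the question):

```python
# Sanity check: same voltage drop, very different heat per foot.
# Assumes ~1.0 mOhm/ft for 10 AWG copper (a nominal figure).
R_PER_FT = 0.001  # ohms per foot of 10 AWG copper (approximate)

def drop_and_heat(current_a, length_ft, supply_v=12.0):
    v_drop = current_a * R_PER_FT * length_ft
    pct_drop = 100.0 * v_drop / supply_v
    w_per_ft = current_a**2 * R_PER_FT      # I^2 * R, per foot of wire
    return pct_drop, w_per_ft

for amps, feet in [(200, 1), (20, 10)]:
    pct, w = drop_and_heat(amps, feet)
    print(f"{amps} A over {feet} ft: {pct:.1f}% drop, {w:.1f} W per foot")
```

Both cases show a 1.7% drop, but 40 W per foot versus 0.4 W per foot, a hundredfold difference in heating.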

Is it safe to assume that a 2% voltage drop will not create a fire hazard even for high levels of current over short lengths of wire?

Best Answer

Wire has two limits: voltage drop and heating. Voltage drop depends on the length of the run; heating per unit length does not. Therefore you cannot translate a voltage-drop limit into a heating limit. A 2% voltage drop will be safe on wires longer than some critical length, and unsafe on wires shorter than that.
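That critical length falls out of the arithmetic: if the wire is sized so the total drop is a fixed fraction of the supply, then total dissipation is I² × R = I × (drop × V), so dissipation per unit length is drop × V × I / L, which grows without bound as the run gets shorter. A minimal sketch, assuming a safe budget of 1 W per foot (an assumed figure, not from the answer):

```python
# A wire sized purely for a fixed fractional voltage drop dissipates
# drop * V * I / L watts per unit length, so short runs run hotter.
# p_max_per_ft = 1.0 W/ft is an assumed safety budget for illustration.
def critical_length_ft(current_a, supply_v=12.0, drop=0.02, p_max_per_ft=1.0):
    """Length below which a wire sized for `drop` exceeds p_max_per_ft."""
    return drop * supply_v * current_a / p_max_per_ft

for amps in (20, 100, 200):
    print(f"{amps} A: a 2%-drop wire runs too hot below ~"
          f"{critical_length_ft(amps):.0f} ft")
```

At 200 A, under that budget, a 2%-drop wire would need to be roughly 48 ft long before the drop criterion alone keeps it cool enough.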

What damages a wire is temperature rise. So it's not just the current the wire is carrying that matters, but also whether the wire runs alone or bundled with others that are also getting hot, and the ambient temperature. The class of insulation also affects how hot you'd want to run the wire.

As a rule of thumb, we generally reckon on about 10 A/mm² for single wires, up to a few mm². For bigger sizes, the sort you'd need for 200 A, you have to reduce the current density, that is, increase the wire area more than proportionally, because the wire's ability to dissipate heat only grows as fast as its surface area.
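The surface-area argument fixes the shape of that derating: heat generated per unit length goes as I²/A, while heat shed goes roughly as the perimeter, i.e. √A for a round wire, so a constant temperature rise implies A ∝ I^(4/3). A sketch anchored at the 10 A/mm² rule of thumb (the 4 mm² anchor point is my assumption for where the rule stops holding; this shows the trend, not a substitute for real tables):

```python
# Heat generated/length ~ I^2/A; heat shed/length ~ sqrt(A) for a round
# wire, so constant temperature rise implies A ~ I^(4/3).
# Anchor at 4 mm^2 carrying 10 A/mm^2 (an assumed anchor point).
A0_MM2, I0_A = 4.0, 40.0

def area_for_current_mm2(current_a):
    return A0_MM2 * (current_a / I0_A) ** (4 / 3)

for amps in (40, 100, 200):
    a = area_for_current_mm2(amps)
    print(f"{amps} A -> ~{a:.0f} mm^2 ({amps / a:.1f} A/mm^2)")
```

Note how the allowable current density falls from 10 A/mm² to roughly 6 A/mm² by 200 A; real ampacity tables, which also fold in insulation ratings and ambient, demand still larger sizes.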

There are many easy-to-find tables of 'ampacity'; there's one on this page on Wikipedia, for instance. Those tables show that for 200 A you'd need 107 mm² cable if the insulation is good for only 60 °C, but could get away with 67 mm² if it's good to 90 °C. That's for a single, unbundled wire at 20 °C ambient. Higher ambient temperature, or bundling with other wires, would require thicker wire.

Unless, of course, the duty cycle is very short. It takes time for a wire to reach a dangerous temperature, so some applications, spot welding or cranking an engine for instance, can get away with thinner wire.
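For those short-duty cases you can bound the temperature rise adiabatically, i.e. ignore heat shed to the surroundings and just warm the copper. A minimal sketch; the material constants are standard for copper, but the 200 A, 10 AWG, 5-second cranking scenario is an illustrative assumption:

```python
# Adiabatic estimate for short bursts: all I^2*R heat goes into warming
# the copper itself. Scenario numbers below are illustrative assumptions.
RHO = 1.72e-8      # resistivity of copper, ohm*m (near 20 C)
DENSITY = 8960.0   # kg/m^3, copper
C_P = 385.0        # J/(kg*K), copper specific heat

def temp_rise_k(current_a, area_mm2, seconds):
    a = area_mm2 * 1e-6                      # cross-section in m^2
    p_per_m = current_a**2 * RHO / a         # watts per metre of wire
    heat_cap_per_m = DENSITY * a * C_P       # joules per kelvin per metre
    return p_per_m * seconds / heat_cap_per_m

# 200 A through 10 AWG (5.26 mm^2) for a 5-second crank:
print(f"{temp_rise_k(200, 5.26, 5):.0f} K rise")
```

That comes out around 36 K for the 5-second burst: warm, but nowhere near dangerous, which is why a starter cable can be much thinner than steady-state ampacity tables would demand.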

Generally, in a 12 V system, voltage drop is your concern, unless the wires are very short. Calculate the size needed for voltage drop and the size needed for ampacity, and choose the thicker of the two.
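That closing procedure as a calculation, as a minimal sketch: the voltage-drop area comes straight from copper's resistivity, the heating area uses the 10 A/mm² rule of thumb from above (fine for small wires), and the 20 A load over 5 m of total conductor (out and back) is an assumed example:

```python
# Size for voltage drop AND for heating, then take the larger area.
RHO_MM = 1.72e-8 * 1e6   # copper resistivity in ohm*mm^2/m

def area_for_drop_mm2(current_a, conductor_m, supply_v=12.0, drop=0.02):
    r_max = drop * supply_v / current_a      # total allowed resistance, ohms
    return RHO_MM * conductor_m / r_max      # required cross-section, mm^2

def area_for_heat_mm2(current_a, a_per_mm2=10.0):
    return current_a / a_per_mm2             # rule of thumb, small wires only

amps, metres = 20.0, 5.0                      # assumed load and round-trip length
need = max(area_for_drop_mm2(amps, metres), area_for_heat_mm2(amps))
print(f"need >= {need:.1f} mm^2")
```

Here the drop requirement (about 7.2 mm²) dominates the heating requirement (2 mm²), which is the usual situation in a 12 V system unless the run is very short.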