Calculate the temperature rise in a wire due to current

Tags: current, heat, wire

If I pass a current through an insulated wire at a given voltage, what will the temperature rise be?

Below is what I think I understand so far …

I'm assuming the insulated wire is a 5 m long single conductor in free air, with ambient air at 20°C and no active cooling.
A typical 20 AWG stranded wire has a cross-sectional area (CSA) of 0.62 mm² and a resistance of 32.4 Ω/km.

A device draws a constant 10 amps at 12 V through the wire.
There would be a voltage drop across the 5 m wire:

10 A × (5 m × 0.0324 Ω/m) = 1.62 V

Total power delivered through the wire is P = IV = 12 V × 10 A = 120 W,
and the power lost in the wire is 1.62 V × 10 A = 16.2 W.
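The arithmetic above can be checked in a few lines (a sketch using the figures from the question; the 32.4 Ω/km value is the assumed datasheet resistance for 20 AWG stranded wire):

```python
# Voltage drop and resistive power loss in the 5 m wire (values from the question).
R_PER_M = 32.4 / 1000    # ohm per metre, typical 20 AWG stranded copper
LENGTH_M = 5.0
CURRENT_A = 10.0
SUPPLY_V = 12.0

r_wire = R_PER_M * LENGTH_M        # total wire resistance: 0.162 ohm
v_drop = CURRENT_A * r_wire        # voltage drop across the wire
p_loss = CURRENT_A * v_drop        # power dissipated as heat in the wire
p_total = CURRENT_A * SUPPLY_V     # total power drawn from the supply

print(round(v_drop, 3), round(p_loss, 3), round(p_total, 1))  # 1.62 16.2 120.0
```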

This power 'loss' heats the wire, while natural convection around the wire tends to cool it back towards ambient.
The balance between these two effects should set the steady-state temperature rise of the wire.

But how would I calculate the temperature rise in degrees C?
I am aware that the insulation has an absolute upper operational limit of 105°C.
I appreciate that the insulation will have a big effect on this; are there ballpark values for PVC, or other ways to estimate this value?

Best Answer

Calculating the temperature rise of a wire is hard (it is much easier and less time-consuming to verify experimentally). Why? Because you need to know the thermal conductivity of that specific type of PVC, and also how much heating there is per unit area.

When calculations get hard, engineers typically try to 'bound the problem' with limits. A better question to ask would be: "can I safely run 10 A through this wire?"

A chart like this (or another ampacity chart) can show the maximum current the cable can handle, though note that this one doesn't cite its references. The chart I have shows 14 A. The best thing to do would be to check the datasheet or contact the manufacturer for this information.

[Ampacity chart: current-carrying capacity of copper conductors by AWG size]
Source: https://www.multicable.com/resources/reference-data/current-carrying-capacity-of-copper-conductors/
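If you still want a ballpark number rather than just a bound, one very rough approach is Newton's law of cooling for natural convection: ΔT ≈ P / (h·A), where A is the outer surface area of the insulation and h is a convection coefficient (roughly 5–15 W/m²·K in still air). The diameter and h values below are assumptions, not measurements, and the estimate ignores radiation and conduction along the wire:

```python
import math

# Rough steady-state temperature-rise bound: delta_T = P / (h * A).
# Assumed values -- check the actual cable datasheet before relying on these.
P_LOSS_W = 16.2            # resistive loss from the question (10 A over 5 m of 20 AWG)
OUTER_DIA_M = 1.8e-3       # assumed overall diameter including PVC insulation
LENGTH_M = 5.0
H_LOW, H_HIGH = 5.0, 15.0  # W/(m^2*K), typical natural-convection range in still air

area = math.pi * OUTER_DIA_M * LENGTH_M   # side area of a cylinder, ends ignored
dt_worst = P_LOSS_W / (H_LOW * area)      # pessimistic rise (poor cooling)
dt_best = P_LOSS_W / (H_HIGH * area)      # optimistic rise (good cooling)
print(f"estimated rise: {dt_best:.0f}-{dt_worst:.0f} degC above ambient")
```

With these assumed numbers the estimate spans roughly 40–115°C above a 20°C ambient, which illustrates why 10 A through 20 AWG sits uncomfortably close to a 105°C insulation limit, consistent with the chart's 14 A being a maximum rather than a comfortable rating.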