This is based on a problem that came up today. During the course of this problem I realized that I wasn't so sure I understood the relationship between wattage and heat produced.

In the past we did a test in the lab using 0.305 Ω/ft wire. The jacket is rated for 150 °C. We were able to get about 6.5 A (at 3.66 V) out of it at 24 °C ambient without exceeding the jacket rating. I want to estimate the ampacity of 0.027 Ω/ft wire. I'm wondering whether I did this correctly, because the amperage seems a little high for the wire to handle; then again, most copper wire is only rated at 90 °C.

The math I did was this:

0.305 Ω/ft * 2 ft = 0.61 Ω

**(6 A)**^2 * 0.61 Ω = 21.96 W (I^2 * R = P)

21.96 W / 2 ft = 10.98 W/ft
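The calculation above can be checked with a couple of lines of Python (values taken from the lab test; the 6 A figure is the rounded test current used in the math):

```python
# Power dissipated per foot in the tested wire, from I^2 * R.
r_per_ft = 0.305   # ohm/ft, tested wire
length = 2.0       # ft of wire in the test
current = 6.0      # A (rounded from the ~6.5 A measured)

r_total = r_per_ft * length        # 0.61 ohm
power = current**2 * r_total       # I^2 * R = 21.96 W
power_per_ft = power / length      # 10.98 W/ft

print(f"{r_total:.2f} ohm, {power:.2f} W, {power_per_ft:.2f} W/ft")
```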

So would it be safe to assume that if I did the same with a 0.027 Ω/ft wire with the same jacket, I would arrive at this amperage?

If you start with

11 W/ft * 2 ft = 22 W

0.027 Ω/ft * 2 ft = 0.054 Ω

sqrt(22 W / 0.054 Ω) = **20.18 A**
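The same estimate in Python, assuming (as above) that the new wire's jacket can shed the same ~11 W/ft that the tested wire handled:

```python
import math

# Rearranging P = I^2 * R to I = sqrt(P / R) for the heavier wire.
power_per_ft = 11.0    # W/ft, assumed same allowable dissipation as the test wire
length = 2.0           # ft
r_per_ft = 0.027       # ohm/ft, heavier wire

power = power_per_ft * length          # 22 W total
r_total = r_per_ft * length            # 0.054 ohm
current = math.sqrt(power / r_total)   # ~20.18 A
print(f"{current:.2f} A")
```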

ETA: we are planning on testing this tomorrow when we get some wire in, so we shall find out.

## Best Answer

I don't quite follow your calculations. But if you are computing the power as I^2 * R, and assuming the same maximum power per foot for each wire, then that's what I would have done.

However, I went here, and it looks like I * R is about constant. (?)

(I had to plot it.) It still looks linear.

Maybe someone can tell us both why.

Edit: on the I * R dependence. (Thanks Spehro, it was suddenly obvious on the drive home.) No matter the thermal loss mechanism (convection, radiation, ...), it will go as the surface area of the wire, 2 * pi * r * l (r = radius, l = length), so a bigger wire will need more heat to reach a given temperature. (more later)
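That surface-area argument suggests the OP's estimate is conservative. A rough sketch of it in Python (my own extrapolation, not a tested result): for round wire of one material, resistance per foot goes as 1/r^2, so radius scales as 1/sqrt(R'); if the allowable heat per foot scales with circumference (proportional to r), the heavier wire can shed correspondingly more power.

```python
import math

# Sketch of the surface-area scaling, using the two wires from the question.
# Assumption: allowable W/ft scales with circumference, i.e. with radius.
r1 = 0.305   # ohm/ft, tested wire
r2 = 0.027   # ohm/ft, heavier wire
p1 = 10.98   # W/ft the tested jacket tolerated

radius_ratio = math.sqrt(r1 / r2)   # ~3.4x larger radius (R' ~ 1/r^2)
p2 = p1 * radius_ratio              # allowable W/ft for the bigger wire
i2 = math.sqrt(p2 / r2)             # ampacity under this scaling

print(f"radius ratio {radius_ratio:.2f}, {p2:.1f} W/ft, {i2:.1f} A")
```

Under this (untested) scaling the heavier wire would take noticeably more than the 20 A estimated from equal W/ft, which is consistent with "bigger wire will need more heat to get to a given temperature."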