Electrical – How to calculate heat in wire


I'm fairly new to electronics, but I understand the relationship between current, resistance, and voltage. I want to make a heating element for a project of mine. I have a spool of 14 AWG bare copper wire, but I don't know how many amps I need. I'd like the heat output of the wire to be 60-70 watts. If anyone could tell me a formula relating current and/or resistance to wattage, that would be great.

Thanks for your time

Best Answer

If your heater is to be directly mains-powered, you might be better off buying a commercial heater, as it will be properly designed and insulated. Popular types are band, strip, cartridge, and Calrod.

You can design the heater to take whatever current you happen to have available using a given resistance of wire; however, the length of wire will then be fixed at something that is probably inconvenient, and the required voltage may not match what you have available.

For example, if you design it for 1 A you will need 65 ohms (and 65 V), which is 65 \$\Omega\$ / 8.286 m\$\Omega\$/m = 7.8 km of AWG14 wire. Not very practical.

If you design it for 12 V power, you would need 5.4 A (about 2.2 \$\Omega\$), which is more like 260 m of wire (still a lot).

At 3.3 V you would have 20 A (about 0.165 \$\Omega\$), but you'd still need 20 m of wire. That's about the lowest-voltage supply cheaply available (within the range of a 500 W ATX supply).
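The three scenarios above all follow from the same two relations, P = V·I and R = V/I, plus the wire's resistance per metre. Here is a minimal sketch that reproduces them, assuming the 8.286 m\$\Omega\$/m figure for AWG14 copper used above (an approximate room-temperature value); the function name and structure are just for illustration:

```python
# Approximate resistance of AWG14 copper wire, ohms per metre (room temp).
R_PER_M = 8.286e-3

def heater_design(power_w, current_a=None, voltage_v=None):
    """Return (voltage, current, resistance, length_m) for a resistive heater.

    Give either the design current or the supply voltage; the other follows
    from P = V * I, then R = V / I (Ohm's law), and the wire length is
    R divided by the resistance per metre.
    """
    if current_a is not None:
        voltage_v = power_w / current_a
    elif voltage_v is not None:
        current_a = power_w / voltage_v
    else:
        raise ValueError("give current_a or voltage_v")
    resistance = voltage_v / current_a
    length_m = resistance / R_PER_M
    return voltage_v, current_a, resistance, length_m

# The three 65 W scenarios from the answer:
for kwargs in ({"current_a": 1.0}, {"voltage_v": 12.0}, {"voltage_v": 3.3}):
    v, i, r, length = heater_design(65, **kwargs)
    print(f"{v:6.1f} V  {i:5.1f} A  {r:8.3f} ohm  {length:8.1f} m")
```

Running it shows the same trend as the worked examples: the wire length only becomes manageable at low voltage and high current, which is exactly why plain copper is a poor choice here.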

That is one reason why heaters are normally made with nichrome, Kanthal, or other high-resistance (and refractory) alloys.
