Electronic – Is higher or lower resistance wire able to heat up more? Are there other factors?

heat, power, wire

Why is it that low-resistance wire in vaporizers is considered to produce more heat, yet for my engineering project thin, high-resistance chromium wire seems to heat up the most?

According to P = IV and V = IR, it does seem that lower-resistance wire should create a higher wattage and therefore a higher heat output, yet in the process of designing a road heating system it seems that the thinnest, highest-resistance wire creates the most heat. Am I missing some major principle of thermodynamics? Or am I pulling some kind of black magic?

Best Answer

Part of the problem is "what do you mean by 'heat up more'?" - higher temperature, or more heat? They are not the same thing. An incandescent bulb filament is much hotter than an electric stove element, but the electric stove element delivers a lot more heat (and power), unless the bulb in question is something huge and theatrical/industrial in nature.

Regardless, R is irrelevant unless and until you have a fixed or limited V or I.

For a fixed voltage with unlimited current, the lowest possible R will give the most POWER, since P = V²/R: as R falls toward zero, current (and power) rises toward infinity. Of course, that does not happen with real supplies, so in practice the lowest R that still allows full voltage to be supplied at the maximum available current will give the most power.
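A quick numeric sketch of the fixed-voltage case (the 12 V supply and the resistance values below are arbitrary, purely for illustration):

```python
# Fixed-voltage supply: P = V^2 / R, so lower R means more power
# (until the supply can no longer source the required current).
V = 12.0                      # assumed supply voltage, volts
for R in (1.0, 10.0, 100.0):  # arbitrary element resistances, ohms
    I = V / R                 # Ohm's law
    P = V * I                 # same as V**2 / R
    print(f"R = {R:6.1f} ohm -> I = {I:6.2f} A, P = {P:7.2f} W")
```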

Now, if you want the highest wire temperature, rather than the most power/heat, something very skinny and preferably made of tungsten, stashed in an inert gas or a vacuum, will do nicely. If it needs to be in air, Kanthal or similar. If all you are doing is melting ice, wire choice is not terribly challenging: nearly any common heating wire will happily boil water, so ice-melting temperatures are not hard to reach.

For a fixed current with unlimited voltage, the highest resistance will get you the most power, since P = I²R. But again, practical supplies don't generally DO "unlimited voltage", and all sorts of unpleasant life-safety and plasma-discharge issues come up fairly soon as you crank up the voltage.
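And the mirror-image sketch for the fixed-current case (again, the 2 A figure and the resistances are made-up illustration values):

```python
# Fixed-current supply: P = I^2 * R, so higher R means more power
# (until the supply runs out of compliance voltage).
I = 2.0                       # assumed forced current, amps
for R in (1.0, 10.0, 100.0):  # arbitrary element resistances, ohms
    V = I * R                 # voltage the supply must develop
    P = I * I * R
    print(f"R = {R:6.1f} ohm -> V = {V:6.1f} V, P = {P:7.1f} W")
```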

For practical devices, you generally have a supply limited in both voltage and current, and you choose your heating-element resistance to make the most effective use of those limits to get your job done. Or you determine that you can't, and get the design changed so you can use (say) 480V 3-phase rather than 240V single-phase if you need more power than you can reasonably expect to get from 240V single-phase. Or you add a lot of thermal insulation so you can do the job with less power, or whatever. It's design; you solve it.
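As a sketch of that trade-off (the 240 V / 30 A limits below are invented example numbers, not a recommendation), the element resistance that pulls the most power from a voltage- and current-limited supply is simply R = Vmax/Imax:

```python
# Sketch: how much power a V- and I-limited supply can push into
# an element of resistance R.  Vmax and Imax are assumed limits.
Vmax, Imax = 240.0, 30.0      # example supply limits (volts, amps)

def delivered_power(R):
    """Power actually dissipated in R, respecting both supply limits."""
    I = min(Vmax / R, Imax)   # current clips at the supply's limit
    return I * I * R

R_best = Vmax / Imax          # 8 ohm: hits both limits at once
for R in (2.0, 4.0, R_best, 16.0, 32.0):
    print(f"R = {R:5.1f} ohm -> P = {delivered_power(R):7.0f} W")
# Peak power Vmax * Imax = 7200 W occurs at R = Vmax / Imax.
```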

Likewise, you need to not burn out whatever your heating element is - when elements melt, they tend to stop working (which is why I called out tungsten and Kanthal - look up their melting points, as well as your nichrome's.) You can dump 10 kilowatts into a 24 gauge nichrome wire, but not for long.
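A back-of-envelope illustration of why "not for long" (the resistivity, gauge dimensions and 1 m length below are rough assumed figures):

```python
# Can 1 m of 24 AWG nichrome really dissipate 10 kW?  Rough numbers only.
import math

rho = 1.1e-6                   # nichrome resistivity, ohm*m (approx.)
d   = 0.511e-3                 # 24 AWG diameter, m
L   = 1.0                      # assumed wire length, m
R   = rho * L / (math.pi * (d / 2) ** 2)   # about 5.4 ohm
I_req = math.sqrt(10_000 / R)              # current for 10 kW, ~43 A

print(f"R ≈ {R:.1f} ohm, current needed ≈ {I_req:.0f} A")
# Wire this thin already glows red at a few amps in free air,
# so forty-odd amps destroys it almost immediately.
```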

If I infer correctly that your actual application is melting ice on road surfaces (a major waste of electric power that would be far better spent pumping a fluid to pick up geothermal heat, or even burned-fuel heat, to do the same job - but that's veering off-topic), then you need to be concerned with power delivery - each pound of ice requires a certain amount of heat to melt it - rather than particularly high temperature. You need an adequate SURFACE temperature for whatever rate of ice-melt you require, but it's not going to be 500°C...
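For a sense of scale, a rough energy budget for the melting itself (standard handbook figures for ice, plus an assumed starting temperature of -5°C and a made-up melt rate of 1 kg per minute; all losses ignored):

```python
# Melting ice is a heat-delivery problem, not a high-temperature problem.
L_fusion = 334e3        # latent heat of fusion of ice, J/kg
c_ice    = 2.1e3        # specific heat of ice, J/(kg*K)
T_start  = -5.0         # assumed starting ice temperature, deg C

melt_rate = 1.0 / 60.0  # assumed melt rate: 1 kg per minute, in kg/s
energy_per_kg = c_ice * (0.0 - T_start) + L_fusion   # warm it, then melt it
power = melt_rate * energy_per_kg

print(f"≈ {energy_per_kg / 1e3:.0f} kJ per kg of ice, "
      f"≈ {power / 1e3:.1f} kW continuous (losses ignored)")
```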