Why do ohmic losses increase with resistance?

ohms-law, power, theory

Electrical newbie here. I'm trying to understand how resistance is involved in power dissipation (ohmic losses, e.g. heating). Primarily I'm looking at an electromagnet coil; the references say the losses are \$P=I^2R\$, so reducing the resistance R reduces the heating power loss P, which sounds reasonable, as it comes from (I assume):

\$P=UI\$ and Ohm's law \$I=\frac{U}{R}\$ (and thus \$U=IR\$), so by substituting U we get \$P=UI=(IR)I=I^2R\$

But if one substitutes for I instead, one gets \$P=UI=U\left(\frac{U}{R}\right)=\frac{U^2}{R}\$, which seems to tell me that with a stable voltage source U, one would get lower power loss (heating) with increased resistance R!

Which unfortunately also makes sense to me from an empirical point of view: if I connect a very high value resistor (or its equivalent, a very long wire) across a 230 V line, it only heats up a little, whereas if I put a very low value resistor across the 230 V line, it heats up so much it burns (which I guess is what fuses do for a living). (Replace 230 V AC with a 9 V DC battery if the AC/DC distinction matters here.)
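To put rough numbers on it (my own illustrative values, using \$P=\frac{U^2}{R}\$ at a fixed 230 V): a 10 kΩ resistor dissipates \$\frac{230^2}{10\,000}\approx 5.3\$ W, while a 10 Ω resistor dissipates \$\frac{230^2}{10}\approx 5.3\$ kW.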

So I guess I'm missing something basic: does increasing resistance reduce or increase power losses? Or does the wire in the "plug it into a 230 V socket" example behave completely differently from the wire in the electromagnet example (and if so, why)?

Best Answer

The magnetic field strength is proportional to the ampere-turns, so holding the field constant with the same number of turns means holding the current constant. If you then reduce the resistance, you must also reduce the voltage, since \$U=IR\$.

The power loss for a given field strength and number of turns is therefore proportional to the resistance: with I fixed, \$P=I^2R\$ scales directly with R.
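As a concrete example (with made-up numbers): suppose the desired field needs 2 A through the coil. A 4 Ω winding requires \$U=IR=8\$ V and dissipates \$P=I^2R=16\$ W; rewound with thicker wire at 2 Ω, the same coil needs only 4 V and dissipates 8 W for the same field.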

If R goes to 0, as in a superconducting magnet, no power at all is consumed in the magnet in steady state.
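
Here is a minimal numeric sketch (illustrative values only, not from the original post) contrasting the two situations: a fixed-voltage load, where the current adjusts and \$P=\frac{U^2}{R}\$ grows as R shrinks, versus a fixed-current coil (constant ampere-turns), where the voltage adjusts and \$P=I^2R\$ shrinks as R shrinks:

```python
# Illustrative comparison of the two regimes (hypothetical values):
#  - fixed-voltage load: the current adjusts, so P = U^2 / R
#  - fixed-current coil (constant ampere-turns): the voltage adjusts, so P = I^2 * R

U_FIXED = 230.0  # volts, as in the mains-socket example
I_FIXED = 2.0    # amperes, an arbitrary coil current chosen for a given field

for R in (1.0, 10.0, 100.0, 1000.0):
    p_fixed_voltage = U_FIXED**2 / R   # loss rises as R falls
    p_fixed_current = I_FIXED**2 * R   # loss falls as R falls
    print(f"R = {R:7.1f} ohm | fixed 230 V load: {p_fixed_voltage:9.1f} W | "
          f"fixed 2 A coil: {p_fixed_current:7.1f} W")
```

In the first column the loss grows as R drops (the short-circuit/fuse case); in the second it goes to zero, which is exactly the superconducting-magnet limit above.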