Electrical – why reduced resistance and increased current result in an increased amount of heat

current

Would a heating element have a very high resistance, or a very low resistance? (All comparisons in this post assume the voltage is the same in each situation.) I would have thought that a higher resistance would result in more heat loss, but I've been taught that the higher the current, the more energy is lost to heat. By that reasoning, a lower resistance would release more heat.

Which one is right? Thanks for any help.

I find it difficult to visualize why reduced resistance and increased current generate more heat. Could somebody try to explain it to me without much math?

Best Answer

All of this relates to two things:

Ohm's Law: \$ R = \frac{V}{I}\$

Joule Heating \$ P_\text{Heat} = V \cdot I\$

The first tells us that if we keep the voltage \$V\$ constant, the current will increase when the resistance decreases. This makes sense, since resistance is a measure of how hard it is for current to flow from one node to another: if the element resists less, more current can flow.
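To make that concrete, here is a minimal sketch in Python (the 230 V supply and the resistance values are purely illustrative, not part of the original question) showing how the current grows as the resistance drops while the voltage stays fixed:

```python
# Illustrative sketch: Ohm's law at a fixed voltage.
# Halving the resistance doubles the current.
V = 230.0  # supply voltage in volts (assumed value for illustration)

for R in (100.0, 50.0, 10.0):   # resistances in ohms (assumed values)
    I = V / R                   # Ohm's law: I = V / R
    print(f"R = {R:6.1f} ohm -> I = {I:5.2f} A")
```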

The second tells us that the power increases with current and with voltage. If we keep the voltage constant but increase the current, the power will increase.

In a resistor, all of this power is turned into heat. Combining the two equations gives \$ P_\text{Heat} = \frac{V^2}{R}\$, so at a fixed voltage the dissipated power grows as the resistance shrinks. Thus a resistor with a lower resistance dissipates more power and gets warmer, provided the voltage across its terminals stays the same.
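As a quick numerical check, here is a small sketch (again with assumed values: a 230 V supply and two hypothetical elements of 1000 Ω and 50 Ω) comparing the power dissipated by a high-resistance and a low-resistance element across the same voltage:

```python
# Illustrative sketch: Joule heating of two elements on the same supply.
V = 230.0  # volts (assumed value for illustration)

for R in (1000.0, 50.0):        # ohms: "high" vs "low" resistance (assumed)
    I = V / R                   # Ohm's law
    P = V * I                   # Joule heating, equivalently V**2 / R
    print(f"R = {R:7.1f} ohm -> I = {I:5.2f} A, P = {P:7.1f} W")
```

With these numbers the 50 Ω element dissipates about twenty times more power than the 1000 Ω one, which is exactly the point of the answer: at a fixed voltage, the lower resistance runs hotter.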