Electronics – Why does the resistor get warm?

heat, led, resistors

I am currently using 150R series resistors to limit the current to each segment of a 7-segment display. I picked 150R because all the online "LED resistor calculators" suggest this value. The power supply is 5 V / 2.5 A.

The datasheet for these displays (Kingbright SC52-11EWA) says the LEDs have a forward voltage of 2.0 V (2.5 V max).

For testing, I've currently just got one segment wired up with a resistor. For some reason, after a few minutes the resistor gets quite warm/hot. The displays themselves do not heat up.

When I use a 330R resistor, it still gets slightly warm, though a bit cooler than with the 150R. However, the display is then noticeably dimmer, especially when viewed during the day.

I've never had problems with resistors heating up when lighting LEDs.

  1. What am I doing wrong?
  2. What value resistors should I be using?

I've attached a picture of the breadboard, if that's relevant. (The display is much brighter than this; the flash just makes it look dim.)

[photo of the breadboard test setup]

Best Answer

This can be easily calculated.

The power supply is 5 V and the LED drops 2 V. That leaves 3 V across the resistor. 3 V / 150 Ω = 20 mA, which is a typical maximum current for small LEDs. That means the LED is driven correctly.
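For reference, here is the same arithmetic as a short Python sketch, using the values from the question (the constant names are just for illustration):

```python
# Minimal sketch: 5 V supply, ~2 V forward drop from the datasheet, 150 ohm resistor.
V_SUPPLY = 5.0    # supply voltage, V
V_LED = 2.0       # typical forward voltage of one segment, V
R_SERIES = 150.0  # series resistor, ohms

v_resistor = V_SUPPLY - V_LED   # voltage left across the resistor
i_led = v_resistor / R_SERIES   # Ohm's law: I = V / R
print(f"{v_resistor:.1f} V across the resistor, {i_led * 1000:.0f} mA through the LED")
# -> 3.0 V across the resistor, 20 mA through the LED
```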

Now look at the power dissipation. 20 mA × 3 V = 60 mW. That's well within the capability of what looks to be a "1/4 W" resistor in your picture. Again, everything is fine.
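Continuing the sketch (the 20 mA and 3 V carry over from the step above; again the names are only illustrative):

```python
# Power dissipated in the series resistor: P = I * V.
I_LED = 0.020      # current through the resistor, A (20 mA from the previous step)
V_RESISTOR = 3.0   # voltage across the resistor, V

p_resistor = I_LED * V_RESISTOR
print(f"Resistor dissipation: {p_resistor * 1000:.0f} mW (rating is about 250 mW)")
# -> Resistor dissipation: 60 mW (rating is about 250 mW)
```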

Dig out the datasheet for the resistor and see how hot it is expected to get if you actually had it dissipate 1/4 W. That would probably be in the 150–200 °C range. Even assuming 150 °C at 250 mW and 20 °C ambient, the temperature rise works out to (150 °C − 20 °C) / 250 mW = 520 °C/W of thermal resistance. Dissipating 60 mW would therefore heat the resistor by about 31 °C, which you can definitely feel. Starting from 20 °C ambient, the resistor would sit at around 51 °C, or 124 °F. So it makes perfect sense that it feels "warm" or almost "hot" to you.
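And the thermal estimate, with the 150 °C-at-rated-power and 20 °C ambient figures treated as assumptions to be checked against the actual resistor datasheet:

```python
# Rough temperature estimate for the resistor surface. The 150 °C figure at
# rated power is an assumption from the reasoning above, not a datasheet value.
P_RATED = 0.250      # rated dissipation, W
T_AT_RATED = 150.0   # assumed surface temperature at rated power, °C
T_AMBIENT = 20.0     # assumed ambient temperature, °C
P_ACTUAL = 0.060     # actual dissipation in this circuit, W

theta = (T_AT_RATED - T_AMBIENT) / P_RATED  # thermal resistance, °C per watt
t_rise = theta * P_ACTUAL                   # temperature rise at 60 mW
print(f"{theta:.0f} °C/W -> {t_rise:.0f} °C rise -> about {T_AMBIENT + t_rise:.0f} °C surface temperature")
# -> 520 °C/W -> 31 °C rise -> about 51 °C surface temperature
```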