LED resistor – choosing the resistor power rating

Tags: led, power, resistors

I want to ask: when choosing a resistor for an LED, how do I choose the power rating of the resistor (for example, in SMD applications)?

For example:

  • 1 LED, 2.0 V forward voltage, 20 mA
  • Resistor: 65 Ω
  • Power supply: 3.3 V
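
As a check, the current follows from the voltage left across the resistor:

$$ I = \frac{3.3\,\text{V} - 2.0\,\text{V}}{65\,\Omega} = 20\,\text{mA} $$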

With this configuration I have 20 mA through the resistor, and the LED dissipates 0.04 W. How can I correctly choose the resistor's power rating?

Should I refer to the LED power of 0.04 W and choose a resistor rated for at least 1/20 W? Or should I refer to the total power in the line (3.3 V × 0.02 A = 0.066 W) and choose a resistor rated for at least 1/10 W?

Should I apply a safety factor?

Best Answer

You know the current through your resistor and the value of your resistor, so you can calculate the power dissipated by the resistor (Ohm's law strikes again):

$$ P = I^2 R $$
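
Substituting your values:

$$ P = (0.02\,\text{A})^2 \times 65\,\Omega $$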

So in your case that would be 0.026 W or 26 mW.

However, when choosing the power rating I would not consider just the normal operating point but also the worst case.

The worst case would be your power source failing with a higher output voltage and the LED failing as a short circuit; in that case the resistor dissipates the most power.

Typically you don't consider both happening at the same time, but you can run that calculation as well.

If the LED fails as a short, the current will increase and the resistor will dissipate (Ohm's law yet again):

$$ P = \frac{U^2}{R} $$

With your numbers: 167 mW.
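
Here U is the full supply voltage, because the shorted LED no longer drops anything:

$$ P = \frac{(3.3\,\text{V})^2}{65\,\Omega} \approx 0.167\,\text{W} $$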

The next thing to consider is what is actually meant when a manufacturer tells you "this resistor can handle 250 mW".

This value tells you that the resistor will heat up by a certain amount if you make it dissipate 250 mW. Say 60 K (I'm making that number up right now). So your resistor will heat up quite substantially if the LED fails. Is your device able to handle that safely? I don't know.
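
Using that made-up 60 K figure and assuming the temperature rise scales roughly linearly with dissipated power, the shorted-LED case would give something like:

$$ \Delta T \approx \frac{167\,\text{mW}}{250\,\text{mW}} \times 60\,\text{K} \approx 40\,\text{K} $$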

The power rating is often also temperature de-rated, so at elevated ambient temperatures your resistor is only allowed to dissipate less power. Does this limit your application? I don't know.
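
To tie the steps together, here is a minimal Python sketch of the sizing check. The component values are the ones from the question; the derating breakpoints and the thermal figures are placeholder assumptions, not values from any datasheet:

```python
# Rough power-rating check for an LED series resistor.
# Component values are from the example above; the derating curve
# below is a placeholder assumption, not a datasheet curve.

V_SUPPLY = 3.3      # V, nominal supply
V_LED = 2.0         # V, LED forward voltage
R = 65.0            # ohm, series resistor

def nominal_power(v_supply=V_SUPPLY, v_led=V_LED, r=R):
    """Dissipation in the resistor during normal operation (P = I^2 * R)."""
    i = (v_supply - v_led) / r
    return i * i * r

def shorted_led_power(v_supply=V_SUPPLY, r=R):
    """Worst-case dissipation if the LED fails as a short (P = U^2 / R)."""
    return v_supply ** 2 / r

def derated_limit(p_rated, t_ambient, t_full=70.0, t_zero=155.0):
    """Allowed dissipation after temperature derating.

    Placeholder curve: full rating up to t_full, falling linearly to
    zero at t_zero. Check the real datasheet curve for an actual part.
    """
    if t_ambient <= t_full:
        return p_rated
    if t_ambient >= t_zero:
        return 0.0
    return p_rated * (t_zero - t_ambient) / (t_zero - t_full)

if __name__ == "__main__":
    p_nom = nominal_power()
    p_worst = shorted_led_power()
    print(f"Nominal dissipation:     {p_nom * 1000:.1f} mW")    # ~26 mW
    print(f"Shorted-LED dissipation: {p_worst * 1000:.1f} mW")  # ~167 mW

    # Compare against a 250 mW (1/4 W) part at an assumed 85 degC ambient.
    p_rated = 0.250
    limit = derated_limit(p_rated, t_ambient=85.0)
    print(f"Derated limit of 250 mW part at 85 degC: {limit * 1000:.1f} mW")
    print("Covers shorted-LED case at 85 degC:", p_worst <= limit)
```

The point of the sketch is only to make the comparison explicit: the rating you pick has to cover the worst-case dissipation at your worst-case ambient temperature, not just the 26 mW of normal operation.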