Electronic – LED resistor calculation and resistor wattage

led, ohms-law, resistors

I know this has been asked time and time again, but bear with me, as I am struggling to understand.
So, say I have this simple circuit here:
[schematic: an LED with a series resistor across the 120 V supply]

with 120 V (DC for simplicity) as the input voltage,
and I have an LED that runs on 3.4 V at a maximum of 20 mA.
Ohm's law states I = V/R, so R = V/I, giving R = 120 V / 0.02 A = 6000 Ω, or 6 k.
Is this correct? How do I know the voltage drop across the resistor is correct,
and what about the power dissipated by the resistor?
Also, is there an easy way to calculate everything at once?

Best Answer

If you are running an LED from the mains you do not want to run it at 20 mA. Any modern LED will be painfully bright at 20 mA, and your resistor will waste too much power. Try something more like 2 mA (you can use the same calculations; just substitute the current you want).

R = E/I = (120 - 3.4)/0.002 ≈ 120/0.002 = 60 kΩ, so you can use the standard 62 kΩ value.

Power is \$\frac{E^2}{R} = \frac{(120-3.4)^2}{62000} \approx 0.22\$ W. So you could use a 1/4 W resistor, though 1/3 W or 1/2 W would be better; it needs an adequate voltage rating as well as power rating.
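
Since the question also asks for a way to calculate everything at once, here is a minimal Python sketch that runs through the same steps: subtract the LED forward voltage, apply Ohm's law, then compute the dissipation in the chosen part. The variable names and the hand-picked 62 kΩ value are mine; the numbers mirror the calculation above.

```python
# Rough series-resistor calculation for an LED on a 120 V DC supply.
# Values match the answer above; adjust for your own LED and target current.

V_SUPPLY = 120.0   # supply voltage, volts (treated as DC)
V_LED    = 3.4     # LED forward voltage drop, volts
I_LED    = 0.002   # target LED current, amps (2 mA as suggested above)

# Voltage the resistor must drop, and the ideal resistance from Ohm's law
v_resistor = V_SUPPLY - V_LED
r_ideal = v_resistor / I_LED

# Nearest standard value, chosen by hand here rather than from an E-series table
r_standard = 62_000.0

# Power dissipated in the chosen resistor, and the actual LED current
p_resistor = v_resistor ** 2 / r_standard
i_actual = v_resistor / r_standard

print(f"Ideal resistance : {r_ideal:.0f} ohms")
print(f"Chosen resistor  : {r_standard:.0f} ohms")
print(f"Actual current   : {i_actual * 1000:.2f} mA")
print(f"Resistor power   : {p_resistor:.2f} W")
```

Plugging in 0.02 A instead shows the problem with the original 6 k idea: the ideal resistance drops to about 5.8 kΩ and the resistor would have to dissipate over 2 W.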

Incidentally, a modern LED bulb with roughly the light output of a 60 W incandescent draws only 6-7 W, so if you ran your single LED at 20 mA you would be drawing 120 V × 20 mA = 2.4 W, almost half of that, for (relatively speaking) very little light. That's how inefficient a dropping resistor is.


If you are using the AC mains with a high-voltage diode in series (and preferably another diode across the LED), you'll need about double the current to get the same brightness, so the resistor will be about half the value, say 30 kΩ, but the power dissipation will stay about the same. I'm ignoring the roughly 11% difference between RMS and average for the purposes of a rough calculation.
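
As a back-of-the-envelope sketch of that half-wave case, under the same simplifications (the LED conducts for roughly half of each cycle, and the RMS/average difference is ignored), the duty-cycle factor and names below are my own shorthand, not from the answer:

```python
# Rough half-wave estimate: the LED conducts ~50% of the time, so to keep the
# same average brightness the on-time current is doubled and the resistor halved.

V_SUPPLY = 120.0   # nominal mains voltage, treated as DC for this rough estimate
V_LED    = 3.4     # LED forward voltage drop, volts
I_AVG    = 0.002   # desired average LED current, amps
DUTY     = 0.5     # fraction of the cycle the LED conducts (series diode, half-wave)

i_on = I_AVG / DUTY                     # current while conducting: ~4 mA
r = (V_SUPPLY - V_LED) / i_on           # ~29 kΩ, i.e. about half the DC-case value
p = (V_SUPPLY - V_LED) ** 2 / r * DUTY  # dissipation averaged over the whole cycle

print(f"Resistor : {r:.0f} ohms")
print(f"Power    : {p:.2f} W")
```

This prints roughly 29 kΩ and 0.23 W, consistent with the "about 30 kΩ, about the same dissipation" estimate above.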