Calculate series resistor value for LED on 220 Vac RMS supply

Tags: ohms-law, resistors

The traditional formula for finding the resistor value, as commonly given on the internet, is:

R = (Input Voltage - Forward Voltage) / Current

In my case, the input voltage is 220 Vac, the forward voltage of a red LED is 2 V, and the current is 0.02 A.

That means (220 - 2) / 0.02 = 10900, so I should be able to use a 10900 ohm series resistor to run my LED on 220 Vac.
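The arithmetic above can be sketched as a small helper (values are the ones from the question; the function name is just for illustration):

```python
def series_resistor(v_supply, v_forward, i_led):
    """Classic series-resistor formula: R = (V_supply - V_f) / I."""
    return (v_supply - v_forward) / i_led

# 220 Vac supply, 2 V red LED, 20 mA target current
r = series_resistor(220, 2, 0.02)
print(r)  # 10900.0 ohms
```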

So that's what I did, I put my LED in this configuration:


[modified schematic, with the diodes connected in anti-parallel removed]

Rather than a single 10.9 kΩ resistor, I used a 10 kΩ and a 1 kΩ resistor in series.

When I put that on 220 Vac, the whole thing just burns. When I search Google for the proper resistor value, some say to use 47 kΩ, some say anything above 100 kΩ, and some say 200 kΩ. I have in fact used 47 kΩ throughout my life, and it works flawlessly. So it got me curious: why doesn't it obey the simple formula that gives 10.9 kΩ?
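The suggested values can be compared numerically. This is a rough sketch that ignores the ~2 V LED drop (negligible next to 220 V) and just computes RMS current and resistor dissipation for each value:

```python
V_RMS = 220.0  # mains RMS voltage, LED drop neglected

for r in (10_900, 47_000, 100_000, 200_000):
    i = V_RMS / r           # approximate RMS current through the LED, in A
    p = V_RMS ** 2 / r      # power dissipated in the resistor, in W
    print(f"{r:>7} ohm: {i * 1000:5.1f} mA, {p:5.2f} W")
```

The larger recommended values trade brightness for a dissipation that an ordinary resistor can actually survive.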

Best Answer

There will be about 220 V across the resistor while 20 mA flows through it.

It will dissipate about 4.4 W as heat, which is quite a lot; you would need a resistor rated to handle that, or it will burn. That is why a plain series resistor is really not practical in this case.
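That 4.4 W figure is just P = V × I with the numbers above:

```python
v_drop = 220   # roughly the full mains RMS voltage appears across the resistor
i = 0.02       # 20 mA design current
p = v_drop * i # power dissipated in the resistor
print(p)  # 4.4 W
```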

The other thing that might cause burning is that LEDs are generally specified to handle only about 5 V in reverse, and you are connecting it to 220 VAC, so the LED will see reverse peaks of about 310 V on the negative half-cycles.
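The peak value follows from the RMS-to-peak relation for a sine wave, V_peak = V_RMS × √2:

```python
import math

# Peak of a 220 V RMS sine wave -- what the LED sees in reverse
# on each negative half-cycle without a protection diode.
v_peak = 220 * math.sqrt(2)
print(round(v_peak))  # ~311 V
```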

Do note that mains voltage is dangerous and can be lethal; take care if you are inexperienced with mains voltage circuits.