How are LEDs considered efficient?

Tags: efficiency, led, power-dissipation

I've always found circuits containing LEDs hard to understand, so please bear with me. I know most people find them easy, but I'm confused by them, so some of my assumptions might not be correct; please correct me if that's the case.

So on to the question: since LEDs are, after all, diodes, they essentially act as conductors once their forward voltage is reached, right? Which is why we need a series (current-limiting) resistor to regulate the current that flows through the circuit.

For example, let's say we have an LED with a Vf of 2 V and an operating current of 20 mA. (I think those numbers are reasonable, right? Again, if not, please let me know.) And our power supply is a constant 4 V. This means we need the resistor to drop the remaining 2 V at 20 mA, so it would be a 100 Ω resistor, dissipating 40 mW. That's a tiny amount of power, but it means half of the power supplied is wasted as heat. So in this case, isn't the best-case efficiency 50%? That isn't really efficient in terms of DC power supplies, I would have thought.
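The arithmetic above can be sketched in a few lines (a minimal sketch, assuming the 2 V / 20 mA / 4 V figures from the question and an ideal LED):

```python
# Single LED with a series resistor on a constant-voltage supply.
V_SUPPLY = 4.0   # supply voltage (V)
V_F = 2.0        # LED forward voltage (V)
I_F = 0.020      # desired LED current (A)

v_resistor = V_SUPPLY - V_F      # voltage the resistor must drop
r = v_resistor / I_F             # Ohm's law: 100 ohms
p_resistor = v_resistor * I_F    # power wasted as heat in the resistor
p_led = V_F * I_F                # power delivered to the LED
efficiency = p_led / (p_led + p_resistor)

print(f"R = {r:.0f} ohm, resistor loss = {p_resistor * 1000:.0f} mW, "
      f"circuit efficiency = {efficiency:.0%}")
# R = 100 ohm, resistor loss = 40 mW, circuit efficiency = 50%
```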

So when people refer to LEDs' high efficiency, are they referring to the fact that the LEDs themselves convert the power they use into light efficiently, or are they considered efficient even after accounting for the 50% maximum wall-plug efficiency?

Or is it just that I've given an example that happens to be a horrible circuit design that would never be found in production applications?

You seem to be confusing the efficiency of the LED with the efficiency of the circuit that drives it.

In terms of light output per unit of energy consumed, LEDs are an efficient way to generate light. In absolute terms they aren't great, at around 10%[1], but that is still far better than the roughly 1–2% of a conventional incandescent bulb.

But what about the power wasted in the resistor? A series resistor is the simplest way to drive an LED, but it is far from the only way to do so.

Even sticking with a resistor, what if we put 20 of your 2 V LEDs in series and supply the string with 45 V? Now you are using 45 V × 0.02 A = 900 mW, of which 800 mW goes into the LEDs and only 100 mW (about 11%) is wasted in the series resistor.
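A quick sketch of how efficiency scales with string length (assuming the same idealised 2 V, 20 mA LEDs; the helper name is mine, not from the answer):

```python
def string_efficiency(n_leds, v_supply, v_f=2.0, i_f=0.020):
    """Fraction of supply power delivered to a series string of LEDs,
    with the remainder dropped across the current-setting resistor."""
    v_string = n_leds * v_f
    assert v_supply > v_string, "supply must exceed the string's forward voltage"
    p_total = v_supply * i_f     # total power drawn from the supply
    p_leds = v_string * i_f      # power delivered to the LEDs
    return p_leds / p_total

print(string_efficiency(1, 4.0))    # the original single-LED case: 50%
print(string_efficiency(20, 45.0))  # the 20-LED string: ~89%
```

The longer the string relative to the resistor's voltage drop, the closer the circuit gets to putting all its power into the LEDs.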

But we can make it even more efficient. The reason for the resistor is that LEDs need a constant current, while most electronics are designed to supply a constant voltage; the easiest way to convert one to the other (assuming a roughly constant load) is to throw in a series resistor.

You can also get constant-current power supplies. If you use one of those to drive your LEDs, the resistor can be eliminated entirely, and well over 90% of the total system power can go into the LEDs.

For a home project or a simple indicator light, a resistor is a lot cheaper and simpler. But if you are driving a lot of LEDs, the logical choice is to pay a bit more, accept a slightly more complex circuit, and use a dedicated constant-current LED driver IC.

1. As noted in the comments, 10% is a good ballpark for current household lighting and probably also about right for cheap commodity LEDs made on older processes. Newer single-colour parts can achieve significantly higher efficiency.