Is a resistor on LEDs necessary after voltage drop?

Tags: current-limiting, led, resistors, voltage

Let's say we have a perfect 5 V DC 0.1 A power supply. Obviously for one LED we would need a resistor (I'm well aware of Ohm's law), BUT what if we connect a bunch of LEDs in series, and the current draw is well above what the power supply can deliver, so the voltage across those LEDs drops to 2.5-3 V. Would you still need to add a resistor if the LEDs' rated voltage is about 3.2 V? It might be a silly question, but I thought it'd be cool to not have a resistor in such a case (because the LEDs would be brighter that way), and better to ask and be sure than be sorry, right!? Thank you in advance!

Best Answer

Short:

GIVEN:

  • A Power Supply (psu)
    Providing Vcv (V constant voltage)
    with maximum current = Icc,
    and providing Icc at reduced voltage when Rload < Vcv/Icc

    Define: Vpsu = voltage at psu terminals

(1) Placing N LEDs in parallel across the psu with no resistors will not damage any LED if the current in the highest-current LED is <= I_LED_max.

(2) Adding resistors can increase light output:

  • For a load of many LEDs in parallel with no resistors,

  • Then if Vpsu = Vload < Vcv (so Iload = Icc)

    adding a series resistor per LED, sized so that Vload is still < Vcv, will increase total light output.

    How much depends on several factors - see below.


Let's say we have a perfect 5 V DC 0.1 A power supply.

Assume that means it provides Vout = 5.000... V for Rload >= 50 Ohm, i.e. Iout <= 100 mA. Also assume it provides 100 mA at whatever voltage suits for Rload < 50 Ohm.
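To make that concrete, here is a minimal Python sketch of that assumed supply behaviour (the 50 Ohm / 100 mA crossover comes straight from Vcv/Icc; the function name is just for illustration):

```python
# A minimal sketch of the assumed supply behaviour: constant 5 V until the
# load would demand more than 100 mA, then constant 100 mA at reduced voltage.
V_CV = 5.0    # constant-voltage setting, V
I_CC = 0.100  # current limit, A

def supply_output(r_load_ohms):
    """Return (V_out, I_out) for a resistive load on this idealised supply."""
    if r_load_ohms >= V_CV / I_CC:      # Rload >= 50 ohm: constant-voltage region
        return V_CV, V_CV / r_load_ohms
    return I_CC * r_load_ohms, I_CC     # Rload < 50 ohm: current limit takes over

print(supply_output(100))  # (5.0, 0.05) -> CV region
print(supply_output(25))   # (2.5, 0.1)  -> CC region, voltage folds back
```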

Obviously for one LED we would need a resistor

No. That depends on the LED. If the LED's Vf is < 5 V at 100 mA, it will draw 100 mA at < 5 V. If the LED is specified to operate OK at 100 mA, it will be OK.
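For instance, a quick sanity check with an assumed LED (Vf ≈ 3.2 V at 100 mA and a 100 mA maximum rating; both figures are illustrative assumptions, not from the question):

```python
# Hypothetical single LED straight across the current-limited 5 V / 100 mA supply.
I_CC = 0.100       # supply current limit, A
VF_AT_ICC = 3.2    # assumed LED forward voltage at 100 mA, V
I_LED_MAX = 0.100  # assumed LED absolute-maximum current, A

i_led = I_CC               # supply folds back to Vf, so the LED takes the full 100 mA
p_led = VF_AT_ICC * i_led  # ~0.32 W dissipated in the LED
print(f"I = {i_led * 1000:.0f} mA, P = {p_led:.2f} W, within rating: {i_led <= I_LED_MAX}")
```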

BUT what if we connect a bunch of LEDs in series.

The lowest common Vf for visible LEDs is ~2 V, for red. (It varies widely in special cases.)
One LED = 2 V
Two in series = 4 V
Three or more in series = too many for a 5 V supply
So "bunch" <= 2 for proper operation.

And the current draw is well above what the power supply can deliver, so the voltage across those LEDs drops to 2.5-3 V.

I assume this means Itotal >> 100 mA if all LEDs operated in "normal" Vf range.

Let's put the bunch of LEDs in parallel to make sense of this.
Let's say 20 x 20 mA, 3.2 V spec LEDs in parallel.

Would you still need to add a resistor if the LEDs' rated voltage is about 3.2 V?

No, from a ratings point of view.
Maybe, from other points of view.

If 20 x LEDs rated at 20 mA at 3.2 V were connected to a 100 mA max power supply then they would draw 100 mA. The average current per LED = 100/20 = 5 mA.
If the LEDs are reasonably modern and from a reputable manufacturer then, while they will not all draw exactly 5 mA, it is close to certain that none will draw over 20 mA when 20 are placed in parallel. So no LED exceeds its rating.
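The arithmetic behind that, as a small sketch:

```python
# Twenty parallel LEDs sharing the supply's 100 mA current limit.
I_CC_MA = 100.0        # supply current limit, mA
N_LEDS = 20
I_LED_RATED_MA = 20.0  # per-LED rating, mA

i_avg = I_CC_MA / N_LEDS  # 5 mA average per LED
print(f"average per LED = {i_avg:.1f} mA")
print(f"a single LED would need {I_LED_RATED_MA / i_avg:.0f}x the average share to reach its rating")
```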

It might be a silly question, but I thought it'd be cool to not have a resistor in such a case (because the LEDs would be brighter that way)

As above, if the LEDs are in parallel and none exceeds its current rating then they will be safe, BUT the light per LED will be uneven and the total light output will probably NOT be maximised.
This is because LED efficiency in terms of lumens/mA rises as mA falls. The result varies with the LED, but as a guide 5 to 20% more light per mA may be obtainable at 10% of rated current than at 100% of rated current.

Then, the brighter LEDs make LESS light per mA than the lower-brightness LEDs, so the ones that "hog" the current make less light per mA yet take a larger share of the current. If the spread of currents is small and/or the efficiency change with current is small, then the light output difference will be small. As the current spread rises and as the efficiency range rises, the loss of light due to current hogging rises.
A simplistic worked example shows the likely order of result.

Example: if you have 20 LEDs, the highest-current one draws 20 mA, and each successively dimmer one draws 80.245% of the prior one, then the total drain will be 100 mA (actually 99.99936 mA),
i.e. with currents rounded to 0.1 mA the LEDs draw 20, 16, 12.8, 10.3, 8.2, 6.6, 5.3, 4.2, 3.4, 2.7, 2.2, 1.7, 1.4, 1.1, 0.9, 0.7, 0.5, 0.4, 0.3, 0.3 mA.
A spreadsheet calculation shows that light loss is only about 3%.
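A rough re-creation of that spreadsheet is sketched below. The current distribution (each LED at 80.245% of the previous one) is from the example above; the efficiency curve is an assumption picked to sit inside the 5 to 20% guideline, so the exact figure depends on the curve chosen, but it lands near the quoted ~3%:

```python
# Current-hogging worked example: 20 LEDs, each drawing 80.245% of the previous one.
RATIO = 0.80245
N = 20
I_MAX_MA = 20.0  # rated current of each LED, mA

currents = [I_MAX_MA * RATIO**k for k in range(N)]  # per-LED currents, hogging case
total_ma = sum(currents)                             # ~99.999 mA

def rel_lumens_per_ma(i_ma):
    """Assumed efficiency curve: 1.0 at the 20 mA rating, ~1.10 at 2 mA, linear in I."""
    return 1.0 + 0.111 * (1.0 - i_ma / I_MAX_MA)

light_hogging = sum(i * rel_lumens_per_ma(i) for i in currents)
light_even = total_ma * rel_lumens_per_ma(total_ma / N)  # same total current shared evenly

print(f"total current: {total_ma:.5f} mA")
print(f"light loss from hogging: {100 * (1 - light_hogging / light_even):.1f} %")  # ~3 %
```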

i.e. the loss of light from current hogging is not liable to be vast.

Worse is that the brightness differences between individual LEDs can be substantial and very visible.
I have seen lights made with a single common series resistor and 6 x parallel LEDs.
At full power the differences were not visible. At lower powers the differences in brightness were immense.

In the case of a CC supply, ADDING per-LED resistors such that Vpsu is still < Vcv will INCREASE the total LED brightness slightly. This is because input power increases as V rises while the total LED current is still I_psu_max; the resistors even out the current sharing, so the LEDs operate at a more efficient point.
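A sketch of the head-room calculation for that case (the forward voltage at ~5 mA is an assumed example figure):

```python
# Per-LED series resistors sized so the current-limited supply stays below its 5 V setting.
V_CV = 5.0       # supply constant-voltage setting, V
I_CC = 0.100     # supply current limit, A
N_LEDS = 20
VF_AT_5MA = 2.9  # assumed LED forward voltage at ~5 mA, V

i_per_led = I_CC / N_LEDS               # ~5 mA each with matched resistors
r_max = (V_CV - VF_AT_5MA) / i_per_led  # largest R that still keeps Vpsu below Vcv
print(f"I per LED = {i_per_led * 1000:.1f} mA, R per LED up to ~{r_max:.0f} ohm")  # ~420 ohm
```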

SO

If operating N LEDs from a current-limited power supply where Isupply is much less than N x I_LED_max_rated, then no LED is liable to exceed its rated maximum current and no damage will be done.