I have an LED with no specs. To measure its forward voltage drop, I connected it to a 5 V power supply through a 325 Ω resistor (value confirmed with a multimeter). I measured a current of 6.38 mA, calculated the voltage drop across the resistor as 2.07 V (\$V = IR\$), and then the drop across the LED as \$5 - 2.07 = 2.93\mathrm{V}\$. So I wrote down 2.93 V as my LED's voltage drop.
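The arithmetic above can be checked with a short script (the values are the ones I measured):

```python
# Reproducing the 5 V measurement arithmetic.
V_SUPPLY = 5.0      # V, supply voltage
R = 325.0           # ohm, measured series resistor
I = 6.38e-3         # A, measured current

v_resistor = I * R              # Ohm's law: V = I*R
v_led = V_SUPPLY - v_resistor   # KVL: supply splits across R and LED

print(f"V across resistor: {v_resistor:.2f} V")  # 2.07 V
print(f"V across LED:      {v_led:.2f} V")       # 2.93 V
```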
The next day I used the same LED in a 3.3 V circuit. I measured its voltage drop again and this time got 2.633 V, and by Kirchhoff's voltage law that changes my current, since the LED is in series with the resistor.
Can somebody explain what is happening? Why does the same LED show different characteristics at different supply voltages?
Best Answer
I assume you used the same 325 Ω in both cases.
5 V with a 325 Ω resistor and Vf ≈ 2.9 V gives ≈ 7 mA
3.3 V with a 325 Ω resistor and Vf ≈ 2.6 V gives ≈ 2 mA
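The two cases can be recomputed directly from the series-resistor formula, using the forward voltages you measured:

```python
def led_current(v_supply, v_f, r):
    """Series-resistor current: I = (Vs - Vf) / R."""
    return (v_supply - v_f) / r

R = 325.0  # ohm

# Forward voltages as measured in the question:
i_5v = led_current(5.0, 2.93, R)     # ~6.4 mA
i_33v = led_current(3.3, 2.633, R)   # ~2.1 mA

print(f"{i_5v * 1e3:.1f} mA, {i_33v * 1e3:.1f} mA")
```

Roughly 3x the current, but the forward voltage only moved by about 0.3 V.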
Plugging your numbers into a resistor calculator:
Source: LED Series Resistor Calculator
Looking at an I-V curve:
2 mA ≈ 2.6 V
7 mA ≈ 2.9 V
Source: OSRAM blue LED
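The shape of that curve is why the voltage barely moves: a diode's forward voltage grows roughly with the logarithm of its current. A sketch using the Shockley diode equation — the saturation current and ideality factor below are illustrative values, not datasheet numbers for this LED:

```python
import math

def forward_voltage(i, i_s=1e-18, n=3.0, v_t=0.02585):
    """Ideal-diode estimate: Vf = n * Vt * ln(I/Is + 1).
    i_s and n are made-up illustrative parameters,
    not measured values for any particular LED."""
    return n * v_t * math.log(i / i_s + 1)

for i in (2e-3, 7e-3):
    print(f"{i * 1e3:.0f} mA -> {forward_voltage(i):.2f} V")
```

Multiplying the current by 3.5 only adds about \$nV_T\ln(3.5) \approx 0.1\$ V to the forward voltage, which matches the small shift you measured.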
This is easier to show with a high power LED.
Let's say we want to make a flashlight with 1000 lumen output.
We select this 900 lumen LED.
This luminous intensity is measured at 400 mA and 85 °C.
This LED's maximum current is 750 mA.
We have to raise the current above 400 mA to get 1000 lumens.
1000/900 = 111%
So we go to the Relative Luminous Intensity graph.
Draw a line across at 111%
Draw a line down from the point where the 111% line intersects with the 85° curve.
We see that 475 mA should give us 1000 lumens.
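The graphical lookup above can also be done numerically by tabulating a few points from the relative-intensity curve. The sample points below are hypothetical read-offs from a typical 85 °C curve, not datasheet data:

```python
# Hypothetical (current in mA, relative intensity) points read off a
# relative-luminous-intensity curve at 85 degC; 1.00 = rated 400 mA.
curve = [(300, 0.80), (400, 1.00), (500, 1.15), (600, 1.30)]

target = 1000 / 900   # ~1.11, i.e. 111 % of the rated 900 lm

def current_for(rel, pts):
    """Linear interpolation between bracketing sample points."""
    for (i0, r0), (i1, r1) in zip(pts, pts[1:]):
        if r0 <= rel <= r1:
            return i0 + (i1 - i0) * (rel - r0) / (r1 - r0)
    raise ValueError("target outside tabulated range")

print(f"~{current_for(target, curve):.0f} mA")  # ~474 mA with these points
```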
We go to the I-V curve and draw a line from 475 mA up to the 85 °C curve.
Then draw a line from where they intersect across to the forward-voltage axis.
The forward voltage for this LED at 475 mA is 17.75 V.
Let's say we are using a supply voltage of 24 V.
We go to the resistor calculator and enter a 24 V supply, 475 mA, and 17.75 V for the forward voltage.
So for 1000 lumens we need a 13.3 Ω, 5 W resistor.
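The calculator is just doing the same series-resistor math as before, plus a power check:

```python
def series_resistor(v_supply, v_f, i):
    """Ballast resistor value and the power it dissipates."""
    r = (v_supply - v_f) / i     # ohms dropped across the resistor
    p = (v_supply - v_f) * i     # watts burned in the resistor
    return r, p

r, p = series_resistor(24.0, 17.75, 0.475)
print(f"R = {r:.1f} ohm, P = {p:.2f} W")  # R = 13.2 ohm, P = 2.97 W
```

About 13.2 Ω dissipating about 3 W, so 13.3 Ω is the nearest standard value and a 5 W rating gives a comfortable safety margin.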