LEDs seem to automatically self-regulate to achieve ideal voltage across terminals

circuit-theory, iv-curve, led, power-supply, resistance

Note: the key question is now in bold at the bottom, for the people who have had trouble figuring out what I'm trying to ask. Sorry about the meandering explanation.

I've been experimenting with resistive droppers lately, and designed the following very simple circuit:

Resistive dropper circuit

To clarify, this is not just theory. I did this in the real world. The simulator screenshot is just to show how the circuit is arranged.

The input is 120 V AC, 60 Hz. I used two generic high-brightness white LEDs (rated for 30 mA, 3 V). I didn't do any rigorous calculations, just ballpark estimates. And yet, somehow, when I measured it, there was exactly 3 V across the LEDs. How on earth does this work so well? Are the LEDs somehow self-regulating their resistance so that there's as close to 3 V across their terminals as they can get? Or was I just really lucky?

I later measured the current flowing through the circuit, and it's around 2.4 mA, which is exactly what you'd predict from Ohm's law (120 V / 50,000 Ω = 0.0024 A = 2.4 mA). (All measurements were done with an okay-ish digital multimeter. I don't have an oscilloscope, sadly.) I've tried running through what I know, and so far have figured out that the LED does not have a linear resistance curve, which is, of course, no surprise. At 30 mA and 3 V, it has 100 Ω of resistance. But when I try to use that figure to predict the voltage drop (a simple voltage divider against the 50 kΩ), I end up with around 0.24 V. I've tried looking up the electrical characteristics of my generic-y LEDs, but the distributor didn't give a part number, let alone a datasheet.

(I later actually tested the current draw at 3 V, and it was 15 mA, which gives a nominal resistance of 200 Ω (3 V / 0.015 A = 200 Ω). That didn't help matters: it just predicts a voltage drop of about 0.48 V. I've also tried plugging everything in at once without rounding, which unsurprisingly doesn't help either.)
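For anyone who wants to reproduce the arithmetic, here is a minimal sketch of those linear-model predictions, assuming a single 50 kΩ series resistance (inferred from the measured 2.4 mA at 120 V) and treating the LED as an ordinary resistor:

```python
# Linear-model sanity check: treat the LED as a plain resistor in a
# voltage divider with the series resistance. The 50 kOhm value is
# inferred from the measured 2.4 mA at 120 V, not from a schematic.
V_SUPPLY = 120.0   # V
R_SERIES = 50_000  # ohms

# Total current predicted by Ohm's law, ignoring the LED entirely:
print(f"I = {V_SUPPLY / R_SERIES * 1000:.2f} mA")  # -> 2.40 mA

# "Resistance" of the LED at the two operating points from the question:
for r_led in (100, 200):  # 3 V / 30 mA = 100 ohm, 3 V / 15 mA = 200 ohm
    v_led = V_SUPPLY * r_led / (R_SERIES + r_led)
    print(f"R_led = {r_led} ohm -> predicted drop = {v_led:.2f} V")
# -> 0.24 V and 0.48 V, nowhere near the measured 3 V.
```

Both linear predictions fall an order of magnitude short of the measured 3 V, which is exactly the puzzle.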

I thought maybe there was some sort of AC black magic here that somehow involved the minimum forward voltage of the LED, but I tested it and it's around 2.5 volts… besides, I see nowhere for voltage to build up.

**So… is this normal? Are the LEDs somehow "self-regulating" their resistance so as to get 3 V across their terminals?**

Best Answer

The answer to your question is "yes", at least to some extent. An LED is a diode, and semiconductors have nonlinear behavior. Those are fancy words meaning "not like a resistor". In particular, the fact that they don't conduct significant current until they reach a certain voltage is most of what you've found. Silicon diodes have around 0.6 to 0.7 V across them when conducting; so does a transistor's base-emitter junction. Germanium does the same thing at around 0.3 V. Zener diodes show this behavior at their rated reverse voltage. LEDs do it at around 3 V (the exact value depends on color, because of materials and doping).
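To see how sharp that turn-on is, here is a quick sketch of the Shockley diode equation with illustrative, made-up parameters for a small silicon diode; the current swings over several decades within a couple tenths of a volt around the knee:

```python
import math

# Shockley diode equation: I = Is * (exp(V / (n * Vt)) - 1)
# Is and n below are illustrative values for a small silicon diode,
# not measured from any real part.
I_S = 1e-12    # saturation current, A (hypothetical)
N = 1.0        # ideality factor (hypothetical)
V_T = 0.02585  # thermal voltage at ~300 K, V

def diode_current(v: float) -> float:
    return I_S * math.expm1(v / (N * V_T))

for v in (0.4, 0.5, 0.6, 0.7):
    print(f"V = {v:.1f} V -> I = {diode_current(v):.3e} A")
# With these parameters the current grows by roughly a factor of 50 for
# every extra 0.1 V, which is why "the" forward voltage looks like a
# fixed threshold.
```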

After the diode junction is conducting, if you try to raise the voltage, the diode will try to conduct more current. That's because once it's conducting, it acts as if it has a fairly low series resistance. In fact, a simple model of a diode is a voltage source in series with a small resistance. If there is any other significant resistance in your circuit (your 50 k is far more than enough), then the increased voltage appears across that resistance, and the diode just draws more current.
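That simple model is enough to show the "self-regulation" numerically. A minimal sketch, assuming a knee voltage of 2.8 V and a dynamic resistance of 10 Ω for the LED (both plausible but made-up values), with the ~50 kΩ series resistance from the question:

```python
# Piecewise-linear diode model: a voltage source V_F in series with a
# small dynamic resistance R_D. The diode parameters are assumptions
# for illustration, not from a datasheet.
V_F = 2.8          # LED "knee" voltage, V (hypothetical)
R_D = 10.0         # LED dynamic resistance, ohms (hypothetical)
R_SERIES = 50_000  # series resistance from the question, ohms

for v_supply in (60.0, 120.0, 180.0):
    i = (v_supply - V_F) / (R_SERIES + R_D)  # series-loop current
    v_led = V_F + i * R_D                    # voltage across the LED
    print(f"Vs = {v_supply:5.0f} V -> I = {i * 1000:5.2f} mA, "
          f"V_led = {v_led:.3f} V")
# Tripling the supply triples the current, but the LED voltage only
# moves by about 20 mV: almost all of the change lands on the 50 kOhm
# resistor, which is the "self-regulation" in action.
```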

I have seen a red LED used as a voltage regulator. It wasn't a particularly good regulator, but it was enough to do the job in that application.