I have a few 'old time' LEDs from the late 80s and early 90s: red and green 5 mm LEDs (amber was a rarity, and blue was 'impossible' back then). Not being very smart, I used to test them with a 9 V battery without any resistor and, surprisingly enough, they always survived the experience.
Fast forward to the third millennium: I bought several dozen transparent 'high-efficiency' (or should I call them 'high-brightness'? hard to tell without a datasheet) LEDs on the Web. They are super bright, but the one time I tried the "Oh, here's a 9 V battery: let's see what color this one is" trick, the LED almost instantly died after a faint flash that told me "I was blue, you #@@#!"
(Like this: https://www.youtube.com/watch?v=7IoyYj6BJlc )
Now, it is clear to me that a resistor is required to limit the current, but my question is about what exactly kills the LED or, put another way: why do 'old technology' LEDs survive?
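(For reference, here is the back-of-the-envelope resistor sizing I should have done. The forward voltages and target currents below are typical textbook values I'm assuming, not datasheet figures for my actual LEDs:)

```python
# Series resistor sizing for an LED on a 9 V battery.
# Vf and target current are assumed typical values, not from a datasheet.

def series_resistor(v_supply, v_forward, i_target):
    """Return the series resistance (ohms) that limits the LED current
    to i_target, given the supply and LED forward voltages."""
    return (v_supply - v_forward) / i_target

# 'Old' red LED: Vf ~ 2.0 V, a conservative 10 mA target
r_red = series_resistor(9.0, 2.0, 0.010)   # 700 ohms (use 680 or 750)

# Modern blue LED: Vf ~ 3.2 V, 20 mA target
r_blue = series_resistor(9.0, 3.2, 0.020)  # 290 ohms (use 330)
```

Without any resistor, the only thing limiting the current is the battery's internal resistance plus whatever resistance the LED itself contributes, which is exactly what my question is about.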
Is it related to the fact that 'old' LEDs exploited recombination between the conduction and valence bands of a sturdy PN junction, while 'new' LEDs are based on more exotic heterostructures that form quantum wells?
Or is it because the 'old' manufacturing process used bigger dies, thicker bonding wires, or materials so lossy that they provided enough series resistance by themselves?
I think I owe an answer to my two dead blue LEDs.
EDIT: I just repeated the experiment with an 'old' red LED: I can leave it on for seconds without a problem, using the same battery that zapped the 'new' LED.
New EDIT: While I can let the old LEDs light up for a few seconds, I managed to blow one up while trying to measure the current. So they are harder to damage, but not immortal after all. I tried three or four more old LEDs and can confirm that they survive at least one second (apparently) unscathed. New LEDs die almost instantly.
I will try later to measure the current in a more controlled setup, possibly with short pulses.
I love the smell of burning GaAs in the morning.
Yes, newer LEDs are also static sensitive. I learned this the hard way when testing a batch of blue SOIC chips with a soldering iron whose ground was (unknown to me) defective and later found to be floating at >30 V. I can assure you that the LEDs did not work after this experience, and it wasn't heat: a single touch to one side of a diode, with the iron at only 100 °C, ruined it. Some started flashing like demented strobes, some just died completely.
Incidentally, newer LEDs based on quantum wells are also highly sensitive to ionizing radiation. I learned this by reading about folks venturing into the ruins at Chernobyl and Fukushima: the white LEDs in their caving lamps would often begin to flicker and eventually fail at radiation doses that are survivable for humans. Silicon carbide LEDs are less sensitive but still eventually fail; rumor has it that Cold War era LED technology is still used today on the ISS Zvezda module and the Progress spacecraft. I also found that some GaN-based blue LEDs can be used as varicaps, in some cases with no loss of brightness. The mechanism can produce capacitance swings of around 100 pF, comparable with an expensive dedicated part.