# The LED is dead… why?

Tags: current, infrared, led, resistors, voltage

I have an OSB4XNE3E1E IR LED (datasheet).

I use a Li-ion 18650 battery rated 1200 mAh, 3.7 V. Its maximum voltage is about 4.2 V when fully charged; when I built the circuit it was not fully charged, and it read 3.99 V.

The datasheet says the LED's maximum forward voltage is 2 V at 700 mA, and the typical is 1.7 V at 700 mA. Now, first: every time I measure the circuit current, it is always under 100 mA.
I couldn't get the LED to draw the 700 mA the datasheet specifies; whenever I connected the meter it read something under 100 mA. Confused, all I did was take a set of resistors and try them from the highest value down to the lowest to find the resistance I needed.
Second: with a resistor of about 1 kΩ the LED glows nicely, but the voltage across it is below the LED's typical forward voltage.
So I lowered the resistance; the LED glowed brighter and its voltage rose, BUT the resistor ran hotter. Finally, with a 10 Ω resistor, the voltage on the LED read maybe 1.5 or 1.6 V, but the 10 Ω resistor got too hot, and very fast. Anyway…
I connected the LED directly to the battery to figure out the problem, and yes, the LED is now dead.
After dying it reads the full battery voltage, 3.99 V, across its terminals (with a very, very faint glow).

I don't understand why, even with a low resistance, the voltage the LED takes from the battery never rises above about 1.5 or 1.6 V (I need 1.7, or 1.8 to 1.9 V).
Why?

And I have a second question: how does the manufacturer drive this LED at 700 mA with only 1.7 V?
I ask about the current because every time I measure it, the meter reads under 100 mA. So how?

And I have a third question, linked to the second one:
does the current a battery supplies change in proportion to the load?

The infrared LED died from too much current: excess current overheated its internal structure. Note this caution on the datasheet:

> Note: Don't drive at rated current more than 5 s without heat sink for Xeon 3 emitter series

I get the impression that too much emphasis is placed on LED voltage, as if the 1.7 V forward drop determined light intensity. While it is true that too little voltage produces no light, too much voltage causes excess current (and heat). It is LED current that determines both heat and light intensity. Since these LEDs are often run hot, temperature affects the forward voltage drop too. Use 1.7 V only as a rough guide to the LED's operating point.
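To make the operating point concrete, here is a minimal sketch of the usual series-resistor sizing, assuming the numbers from the question (3.99 V battery, 1.7 V typical forward drop, 700 mA target); it also shows why the 10 Ω resistor ran hot:

```python
# Sizing a current-limiting resistor for the IR LED.
# Assumed values, taken from the question: battery at 3.99 V,
# typical forward drop 1.7 V at the 700 mA rated current.
V_BAT = 3.99       # battery voltage, volts
V_F = 1.7          # typical LED forward voltage, volts
I_TARGET = 0.700   # desired LED current, amps

# The resistor absorbs whatever voltage the LED does not.
R = (V_BAT - V_F) / I_TARGET       # required series resistance, ohms
P_R = (V_BAT - V_F) * I_TARGET     # power burned in that resistor, watts

print(f"R = {R:.2f} ohm, resistor dissipation = {P_R:.2f} W")

# Why the 10 ohm resistor got hot: even at the reduced current it
# dissipates about half a watt, too much for a common 1/4 W part.
I_10 = (V_BAT - V_F) / 10.0
P_10 = I_10 ** 2 * 10.0
print(f"With 10 ohm: I = {I_10 * 1000:.0f} mA, P = {P_10:.2f} W")
```

Note the resistor for 700 mA would need to handle about 1.6 W, so a power resistor (and a heat sink for the LED) would be required, not a small through-hole part.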

When you use a multimeter as an ammeter to measure LED current, its internal resistance becomes a problem, especially in a low-voltage circuit like this. Assume that on the 200 mA range the ammeter's internal resistance is \$1\ \Omega\$. If you use this ammeter to measure LED current, a small voltage (a little less than 0.2 V) drops across its terminals due to that internal resistance (circuit on the left).
Then you remove the ammeter and replace it with a jumper wire of close-to-zero resistance (circuit on the right). Current jumps higher than you expect. (Schematic created using CircuitLab.)

The QED123 infrared LED is similar to the OP's, and is one of the few that CircuitLab simulates. When the current-limiting resistor (10 Ω in this example) is made smaller in an effort to raise the LED current, this ammeter loading problem gets worse.
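A rough sketch of the loading effect, using a deliberately simplified model (assumption: constant 1.7 V forward drop, whereas a real LED's V-I curve is exponential; the 1 Ω burden resistance is the assumed value from above, not a measured one):

```python
# How an ammeter's burden resistance skews the reading.
# Assumptions: battery 3.99 V, constant LED drop 1.7 V,
# ammeter internal resistance 1 ohm on the 200 mA range.
V_BAT = 3.99
V_F = 1.7
R_METER = 1.0

def led_current(r_series, r_meter=0.0):
    """LED current with an optional ammeter resistance in series."""
    return (V_BAT - V_F) / (r_series + r_meter)

for r in (100.0, 10.0, 3.3):
    i_true = led_current(r)            # jumper wire in place
    i_meas = led_current(r, R_METER)   # ammeter in place
    err = 100.0 * (i_true - i_meas) / i_true
    print(f"R = {r:5.1f} ohm: true {i_true * 1000:6.1f} mA, "
          f"meter reads {i_meas * 1000:6.1f} mA ({err:.0f}% low)")
```

With a 100 Ω resistor the 1 Ω burden barely matters; at 3.3 Ω it makes the reading roughly a quarter low, which is consistent with the OP never seeing the expected current on the meter.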
It is probably better to monitor the voltage across the current-limiting resistor and use Ohm's law to calculate current: \$I = \dfrac{V}{R}\$. It helps to use a current-limiting resistor with tight tolerance; measuring such a small resistor with an ohmmeter is usually inaccurate, since the contact resistance plus the ohmmeter's lead resistance is significant.
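A small sketch of that shunt-style measurement, with illustrative numbers (the 2.29 V reading and 1% tolerance are assumptions, not values from the question); the resistor's tolerance directly bounds the uncertainty in the computed current:

```python
# Inferring LED current from the voltage across the series resistor.
# Assumed values for illustration: 10 ohm 1% resistor, 2.29 V measured.
R_SENSE = 10.0     # nominal series resistance, ohms
TOL = 0.01         # assumed 1% tolerance part
V_MEASURED = 2.29  # voltmeter reading across the resistor, volts

i_nominal = V_MEASURED / R_SENSE
# The tolerance brackets the true current:
i_low = V_MEASURED / (R_SENSE * (1 + TOL))
i_high = V_MEASURED / (R_SENSE * (1 - TOL))

print(f"I = {i_nominal * 1000:.1f} mA "
      f"(between {i_low * 1000:.1f} and {i_high * 1000:.1f} mA)")
```

This avoids the ammeter burden problem entirely, since a voltmeter's high input resistance disturbs the circuit far less than an ammeter's series resistance does.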