Say I have a **9V** supply, and I want to power an LED that requires **3V** and **0.02A**. From what I understand, I need to add a resistor that will drop 6V and reduce the current to 0.02A, and I can calculate the resistance needed using Ohm's law. So, 6/0.02 = 300Ω – simple enough.

But how can this be the correct resistor when there's plenty of other equivalent fractions that would produce the same result? 3/0.01 also gives 300, so surely you could calculate that this resistor will only drop 3V and reduce the current to 0.01A? What am I missing here?

## Best Answer

Your second calculation (3/0.01) tells you what resistor you'd need if you wanted 10 mA and had a supply of \$V_f + 3\ {\rm V}\$. Since that isn't your situation, this result is irrelevant to you, whether or not it happens to produce the same resistance your situation requires.

If you want to drive 30 miles in 30 minutes, you need to drive at 60 miles per hour. Similarly, if you need to drive 120 miles in 120 minutes, you also need to drive 60 miles per hour. Getting the same numerical result for the second problem doesn't invalidate the solution to the first problem.
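To make the point concrete, here's a minimal sketch of the arithmetic in Python (the function name is mine, not from any library). The same 300Ω answer falls out of two *different* circuits:

```python
def resistor_value(v_supply, v_led, i_target):
    """Series resistor for an LED: the resistor must drop the difference
    between supply voltage and LED forward voltage at the target current
    (Ohm's law: R = V / I)."""
    return (v_supply - v_led) / i_target

# Your circuit: 9 V supply, 3 V LED, 20 mA target -> 6 V / 0.02 A
print(resistor_value(9.0, 3.0, 0.020))   # 300.0

# The "other fraction": a 6 V supply (V_f + 3 V) at 10 mA -> 3 V / 0.01 A
# Same number, but it describes a different circuit.
print(resistor_value(6.0, 3.0, 0.010))   # 300.0
```

The inputs (supply voltage, forward voltage, target current) pin down which circuit you're solving; the resistance is just the output.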

In comments you asked what voltage the LED will actually drop. Its drop will be very close to 3V, because the LED's differential resistance will be much lower than 300Ω.

Also, if you're targeting 20 mA, and the datasheet specifies the forward voltage at 20 mA, then the specified forward voltage is the best estimate you have for the forward voltage at 20 mA.

If you had a 3.5 V source and a 3 V LED and tried to control the current with resistive limiting (giving you a 25 ohm calculated resistor value), you'd probably run into significant errors, due to the LED forward voltage changing with temperature and manufacturing process variations.
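A quick sketch of why the low-headroom case is so error-prone (plain Python, assuming a 0.1 V shift in forward voltage from temperature or part-to-part variation):

```python
def led_current(v_supply, v_led, r):
    """Current through a series resistor feeding an LED (Ohm's law)."""
    return (v_supply - v_led) / r

# 9 V supply, 300 ohm resistor: 0.1 V shift in V_f barely matters.
i_nom  = led_current(9.0, 3.0, 300)   # 0.0200 A
i_warm = led_current(9.0, 2.9, 300)   # ~0.0203 A, about a 1.7% change

# 3.5 V supply, 25 ohm resistor: the same 0.1 V shift is large.
i_nom2  = led_current(3.5, 3.0, 25)   # 0.0200 A
i_warm2 = led_current(3.5, 2.9, 25)   # 0.0240 A, a 20% change
```

With 6 V of headroom across the resistor, a 0.1 V uncertainty in \$V_f\$ is noise; with only 0.5 V of headroom, the same uncertainty swamps the design.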