Electronic – How to calculate the resistor value for a simple LED circuit


I have a simple circuit:

[circuit diagram: battery, resistor, and LED in series]

The LED's maximum current rating is 30 mA.

How can I work out what the resistance of the resistor needs to be? Using Ohm's law I found it to be \$3\,\mathrm{V}/0.03\,\mathrm{A} = 100\ \Omega\$. However, using software called Yenka and trial and error, I found the minimum possible resistance to be 36 Ω; with a 35 Ω resistor, the LED breaks. Is the software wrong, or (more likely) am I doing something wrong?

Best Answer

You'll have to check the datasheet (or measure it) to find out how much voltage drops across your LED. Let's say this is 2 V. The voltage across the resistor is then the supply voltage minus the LED voltage: 3 V - 2 V = 1 V. To get 30 mA through the resistor (and thus also through the LED), the resistor has to be 1 V / 30 mA ≈ 33 Ω.

If your LED's forward voltage is lower, the current will be somewhat higher, but the LED shouldn't break!
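The calculation above can be sketched in a few lines of code. The 2 V forward voltage is the answer's assumed example value, not something from your LED's datasheet, so substitute your own figures:

```python
def led_resistor(v_supply, v_led, i_led):
    """Return the series resistance (in ohms) that sets the LED current.

    v_supply: supply voltage in volts
    v_led:    LED forward voltage drop in volts (from the datasheet)
    i_led:    desired LED current in amps
    """
    if v_supply <= v_led:
        raise ValueError("Supply voltage must exceed the LED forward voltage")
    # The resistor sees only the leftover voltage, not the full supply.
    return (v_supply - v_led) / i_led

# Values from the answer: 3 V supply, assumed 2 V LED drop, 30 mA target.
r = led_resistor(3.0, 2.0, 0.030)
print(f"{r:.1f} ohms")  # about 33 ohms, matching the 36 ohm simulation result
```

In practice you would round up to the next standard resistor value (e.g. 33 Ω or 39 Ω), since a slightly lower current is safe while a higher one risks exceeding the LED's rating.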