Electronics – Confusion with the voltage drop equation

Tags: led, resistors, voltage-divider

I had never really thought of resistors as causing a voltage drop; I have only used them to limit current to, say, an LED.

Using something like E = IR, I can take a single resistor as an example:

5 V / 100 Ω = 0.05 A, and 0.05 A × 100 Ω = 5 V

Now I can maybe assume that is a 5 volt drop, leaving 0 V after it. How does this work for an LED that requires, say, a 1.8 V drop, if the resistor leaves 0 V on its other side?

Using a simple circuit simulator:

5 V ------- 220 Ω resistor ----- 1.8 V ----- LED ----- 0 V

Why is the 1.8 V that the LED needs to drop "left over" for it?

It seems as if the resistor just "consumes" the 5 − 1.8 = 3.2 volts magically.

I am confused by this, because you can calculate voltage dividers much more simply. Then again, does a voltage divider "split" some of the voltage to ground? The output branch is in parallel, however, so I am unsure why one branch (the one with the resistor) affects the second branch (the output).
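(For reference, the "simpler" divider calculation I mean is the standard formula, with hypothetical names \$R_1\$ from the supply and \$R_2\$ to ground: \$ V_{out} = V_{in} \dfrac{R_2}{R_1 + R_2} \$.)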

Best Answer

The 1.8V is a given for the LED; if current flows through a LED it will show a more or less constant voltage across it, just like a common silicon diode will drop about 0.7V at low currents. The anode will always be 1.8V higher than the cathode for your type of LED (the actual voltage mainly depends on the LED's color). The fact that this voltage is constant allows you to calculate the appropriate current limiting series resistor:

\$ R = \dfrac{V_+ - V_{LED}}{I_{LED}} \$

or the current for a given resistor value:

\$ I_{LED} = \dfrac{V_+ - V_{LED}}{R} = \dfrac{5\,\text{V} - 1.8\,\text{V}}{220\,\Omega} \approx 15\,\text{mA} \$
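If it helps to see the same arithmetic as code, here is a minimal Python sketch of the two formulas above; the helper names are made up for illustration, and the numbers are the ones from this answer:

    # Minimal sketch of the two formulas above (hypothetical helper names).

    def led_resistor(v_supply, v_led, i_led):
        """Series resistance in ohms: R = (V+ - V_LED) / I_LED."""
        return (v_supply - v_led) / i_led

    def led_current(v_supply, v_led, r):
        """LED current in amps for a given resistor: I = (V+ - V_LED) / R."""
        return (v_supply - v_led) / r

    # 5 V supply, 1.8 V LED drop, 220 ohm resistor.
    print(led_current(5.0, 1.8, 220))     # ~0.0145 A, i.e. about 15 mA
    print(led_resistor(5.0, 1.8, 0.015))  # ~213 ohm; 220 ohm is the nearest standard value

Working the formula the other way also shows where the 220 Ω comes from: for a target of 15 mA you get 3.2 V / 0.015 A ≈ 213 Ω, and 220 Ω is the closest standard resistor value.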