Electronic – Using a resistor to drop voltage for an LED


Hoping someone can help me to understand what I am seeing.

I have the following:

  1. A 5V DC power supply derived from a USB power supply

  2. An LED (LINK) with a forward voltage of 3.2V (maximum forward voltage of 3.5V DC) and a continuous forward current of 20mA (peak forward current of 100mA).

  3. A 91 ohm 1/2 watt flameproof resistor

I used the popular LED Calculator to design this ultra simple circuit.
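For reference, the calculation behind that kind of LED calculator is just Ohm's law applied to whatever voltage is left over after the LED's forward drop. A minimal Python sketch of that arithmetic, using the figures listed above:

```python
# Series resistor sizing for an LED: R = (V_supply - V_forward) / I_forward
V_SUPPLY = 5.0      # USB-derived supply, volts
V_FORWARD = 3.2     # LED forward voltage, volts
I_FORWARD = 0.020   # desired continuous forward current, amps

r_exact = (V_SUPPLY - V_FORWARD) / I_FORWARD
p_resistor = (V_SUPPLY - V_FORWARD) * I_FORWARD  # power dissipated in the resistor

print(f"Exact resistance: {r_exact:.1f} ohms")        # 90.0 ohms
print(f"Resistor power:   {p_resistor*1000:.0f} mW")  # 36 mW, well under 1/2 W
# 91 ohms is the nearest standard (E24) value, hence the resistor in the parts list.
```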

However, before connecting the LED I figured I'd use my multimeter to confirm that the resistor was properly dropping the voltage to 3.2V.

I tested this by putting the resistor in series with the multimeter leads, first on the +5V side and then on the GND side, but I am still seeing 5V on the meter.

I then placed two of the same resistors in series, at both polarities, and I am still seeing 5V on the meter.

Is there a simple reason I am still seeing 5V? Even without a load, I would have expected to see a voltage in the realm of 3.2V.

Thanks in advance.

Best Answer

By putting the resistor in series with the multimeter leads, you will always read 5V: the meter's very high input resistance forms a potential divider with the 91 ohm resistor, and since the meter's resistance dwarfs 91 ohms, essentially all of the supply voltage appears across the meter. This happens no matter which way round you connect it.

What's more, the LED isn't even in the circuit yet. A resistor only drops voltage when current flows through it, and with just the meter as a load, almost no current flows, so almost no voltage is dropped across the resistor.

Connect the full circuit so the LED lights, then put your voltmeter in parallel with each component. You will see about 3.2V across the LED and the remaining 1.8V across the resistor; the two drops must add up to the 5V supply (that's Kirchhoff's voltage law, but I won't get into that here).
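To put numbers on that, here is a small sketch, assuming a typical 10 megohm meter input resistance, comparing what the meter reads with only the resistor in series against what happens once the LED is actually conducting:

```python
# Why the meter reads ~5V with only the resistor in series with it:
# the resistor and the meter's input resistance form a voltage divider.
V_SUPPLY = 5.0
R_SERIES = 91.0   # ohms
R_METER = 10e6    # typical DMM input resistance (assumed), ohms

v_meter = V_SUPPLY * R_METER / (R_SERIES + R_METER)
print(f"Reading with meter as the only load: {v_meter:.5f} V")  # ~4.99995 V

# With the LED connected and conducting, the resistor carries real current
# and drops the difference between the supply and the LED's forward voltage.
V_FORWARD = 3.2
i_led = (V_SUPPLY - V_FORWARD) / R_SERIES
v_resistor = i_led * R_SERIES
print(f"LED current: {i_led*1000:.1f} mA")                # ~19.8 mA
print(f"Drop across resistor: {v_resistor:.2f} V")        # 1.80 V
print(f"Drop across LED: {V_SUPPLY - v_resistor:.2f} V")  # 3.20 V
```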

[Schematic created using CircuitLab]