If the supply's amperage is higher than a component's maximum amperage, do I need a resistor?

amperage, current, resistors

This is a pretty basic question, as I'm still learning the fundamentals of electronics. I understand the analogy where amperage is compared to the quantity of water moving through a hose.

I have a 5V, 2A power supply and I want to power my circuit with it. Each component's pins have different maximum current ratings (250mA for the PIC, 180mA for the LCD, etc.).

Now, following the "water quantity" analogy above, it should be OK for me to connect a component that draws at most 250mA, because the component draws what it needs from the 2A available rather than the power source forcing the full 2A into the pin. That would leave 1.75A of current for the rest of my circuit.

Do I still have to put a resistor between the power supply and the VDD pin of a component? If yes, why?

(This question arose when I learned that there should be an approx. 100 ohm resistor between a microcontroller's output pin and an LED. The LED's current rating was 25mA and the pin's maximum output current was also 25mA, and I didn't understand why we needed a resistor in between.)

Best Answer

To the first order...

You are correct. The load controls the maximum current that may flow, while the source controls the maximum voltage available.

but...

You are not correct about your LED; that's a different problem. Your thinking assumes Ohm's law, which in turn assumes linear (and in-phase) operation.

Diodes (including LEDs) are non-linear devices. A diode presents an (approximately) constant voltage when it is "on", largely independent of the amount of current flowing through it. The LED will be brighter with more current, and it will burn up (be destroyed) if too much current is allowed to flow through it for too long.

Notice how the line to the right of the y-axis in the figure is almost vertical. That implies that the voltage will change very little if the current through the diode changes a lot. V clearly does not equal IR for a diode.

[Figure: typical diode I-V characteristic curve]
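To put rough numbers on that steepness, here is a minimal sketch using the ideal (Shockley) diode equation. The saturation current and ideality factor below are illustrative assumptions chosen so the "LED" draws about 20mA at 2.0V; they are not values for any real part.

```python
import math

# Ideal (Shockley) diode equation: I = Is * (exp(V / (n*Vt)) - 1)
Is = 8.5e-20   # saturation current (A), assumed for illustration
n = 2.0        # ideality factor, assumed
Vt = 0.025     # thermal voltage at room temperature (V)

for v in (1.9, 2.0, 2.1):
    i = Is * (math.exp(v / (n * Vt)) - 1)
    print(f"V = {v:.1f} V  ->  I = {i * 1000:.1f} mA")

# Output is roughly 2.7 mA, 20 mA, 148 mA: a 0.1 V change in voltage swings
# the current by close to an order of magnitude, which is why V = I*R
# thinking does not apply to an LED.
```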

Most discrete LEDs in the microcontroller world drop around 2V at 20mA (this varies with the size, chemistry, and construction of the LED). If your microcontroller drives a 3.3V output on one of its general-purpose pins (GPIO), the current the LED demands will exceed what the microcontroller can safely supply through that pin, and the only thing limiting the current is the internal resistance of the pin's output driver, which is pushed to its maximum.

This will ultimately destroy the output driver of the microcontroller. To prevent this, a series resistor is added to limit the current explicitly to something safe.

You work backwards to size the resistor: (Vcc - Vled) / Iled = R

In most 3.3V microcontroller applications, the value turns out around 100 Ohms.
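As a quick worked example, here is a minimal sketch of that calculation, assuming the 3.3V pin and a ~2V, 20mA LED mentioned above; substitute your own LED's forward voltage and target current.

```python
# Series resistor sizing: R = (Vcc - Vled) / Iled
Vcc = 3.3     # microcontroller pin voltage (V)
Vled = 2.0    # LED forward voltage (V), assumed typical value
Iled = 0.020  # desired LED current (A)

R = (Vcc - Vled) / Iled
print(f"R = {R:.0f} ohms")  # -> 65 ohms

# Rounding up to the next common value (68 or 100 ohms) keeps the LED
# current at or below the 20mA target.
```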