Electronics – current-limiting resistors

current-limiting, led, ttl

I notice that in Ben Eater's project for the 74-series logic chip computer, which my son and I are building, he sometimes uses current-limiting resistors on the LEDs on the pins of some chips, while on others he madly throws caution to the wind and slaps the LED right in there with no apparent resistor.

I assume this means he has some inner knowledge of each chip, and that the outputs of some chips are current-limited by an internal resistor.

How can I determine which chips are safe to put the LEDs on without a resistor, and which ones require the current-limiting resistor?

Best Answer

How can I determine which chips are safe to put the LEDs on without a resistor, and which ones require the current-limiting resistor?

By reading their datasheets. This really should have been obvious.

Unless a chip is specifically intended for driving LEDs or has a controlled current output, you really should be using series resistors.

Most digital outputs are specified to source or sink some minimum current with some maximum voltage rise or drop. For example, a specification might be 15 mA sink at no more than 500 mV for a logic low level.

The naïve, lazy, or just irresponsible might say that 15 mA is less than the 20 mA my LED is rated for, so no problem. It doesn't work that way, because 15 mA is the minimum spec, and there is usually no maximum spec. Any one part may be able to sink 20 mA or more. And that's just at 500 mV of output voltage rise in this example. Let's say you connect a green LED with a 2.1 V forward drop between a 5 V supply and this digital output. The digital output will be forced to well over 2 V. You don't know what current it might sink then. Since that's out of spec for the digital output, you don't even know if it will survive that.
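To see where "well over 2 V" comes from, here is a minimal Python sketch of the no-resistor case, using only the example figures above (5 V supply, 2.1 V green LED, 500 mV low-level spec); the numbers for any real 74-series part come from its datasheet.

    # No-resistor case, using the example figures from above.
    V_SUPPLY = 5.0   # V, supply rail
    V_LED    = 2.1   # V, nominal green LED forward drop
    V_OL_MAX = 0.5   # V, highest low-level output voltage the spec covers (at 15 mA)

    # With the LED wired straight from the supply to the pin, the pin gets dragged
    # up to roughly the supply minus the LED drop -- far outside the 0.5 V region
    # the spec says anything about.
    v_pin = V_SUPPLY - V_LED
    print(f"Output pin forced to about {v_pin:.1f} V; the spec only covers up to {V_OL_MAX} V")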

Let's look at what the correct answer would be in this example with a series resistor. In this case, you have to design for 15 mA, since that's the lesser of what the LED can handle and the digital output can safely sink. You might say that since the LED drops 2.1 V and the digital output is at 500 mV, the resistor drops 2.4 V, but that would be wrong. The digital output is only guaranteed to not exceed 500 mV when sinking 15 mA. It could be less and probably is. You don't know how much less, so you have to assume the worst case of the digital output being at 0 V.

That leaves 2.9 V across the resistor at 15 mA. (2.9 V)/(15 mA) = 193 Ω. In this case you'd probably use the next common size up, which is 200 Ω.
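As a sanity check on that arithmetic, here is a minimal Python sketch of the worst-case sizing, with the example figures hard-coded; they are illustrative, not data for any particular part.

    # Worst-case series resistor sizing from the example above.
    V_SUPPLY  = 5.0    # V, supply rail
    V_LED     = 2.1    # V, LED forward voltage
    V_OUT_LOW = 0.0    # V, assume the output pulls all the way to ground (worst case)
    I_MAX     = 0.015  # A, the lesser of the LED rating and the guaranteed sink current

    r_min = (V_SUPPLY - V_LED - V_OUT_LOW) / I_MAX
    print(f"Minimum series resistance: {r_min:.0f} ohms")  # about 193 ohms, round up to 200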

Let's see what we've got. If the digital output stays at 0 V, the current will be (2.9 V)/(200 Ω) = 14.5 mA. If it rises the maximum amount while still staying within spec, then the current is (2.4 V)/(200 Ω) = 12 mA. You could get anything within the 12 to 14.5 mA range. The range would actually be even wider if you take the variation in LED forward voltage into account. Fortunately, it takes a fairly large current difference to see an obvious brightness difference, even in a side-by-side comparison. A 25% variation is pretty much inconsequential. It takes about a 2:1 current difference to notice the brightness difference when the two are not side by side.
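The same range check in a short Python sketch, again with the example numbers (200 Ω resistor, 5 V supply, 2.1 V LED, output anywhere from 0 V to the 0.5 V limit):

    # LED current range with the chosen 200 ohm resistor.
    R        = 200.0  # ohms, chosen series resistor
    V_SUPPLY = 5.0    # V
    V_LED    = 2.1    # V

    i_max = (V_SUPPLY - V_LED - 0.0) / R  # output hard at 0 V             -> 14.5 mA
    i_min = (V_SUPPLY - V_LED - 0.5) / R  # output at the 0.5 V spec limit -> 12.0 mA
    print(f"LED current: {i_min * 1000:.1f} mA to {i_max * 1000:.1f} mA")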

There are naïve, lazy, and irresponsible circuit designers. A disproportionate fraction write up web pages about their designs, so there is a lot of crap out there. The most likely and simple answer to why some circuits have resistors in series with LEDs and others don't is that the ones that don't are bad designs.