Electronic – Why do different resistors of the same power rating burn out / not burn out

led, multimeter, resistors

I have a circuit board for a small LED light, which comes with an annoying strobe mode. To disable this, I have found that I can wire a resistor from a pin on a chip to ground, and the LED will stay in the always-on mode that I want. I'm trying to find the lowest resistance that the circuit board will tolerate (if it is too low, the LED turns blue and dims, and chips begin to smoke).

I started with a 1K 1/4W resistor and that worked, but the LED was much dimmer than I wanted. I tried a bunch of other resistors (all 1/4W) down to less than 50 ohms.

Some resistors, or combinations of resistors, smoke and turn black (although they keep working), such as:

  • 1x 47 ohms
  • 1x 22 ohms
  • 1x 22 ohms, 1x 10 ohms (wired in series)

However, some resistor combinations work fine, e.g. 2x 22 ohms. I don't understand this: I thought the resistors were burning out because their power rating was too low, but if that were the case, every combination should burn out, since they are all 1/4W.

So I measured the voltage from the pin to ground at 7V (which makes sense, as the power source is 11-12V), and I measured the current at 2A (with my multimeter in 10A mode and my probe in the 10A port). But that comes out to 14 watts, which should instantly blow any resistor I wire in.
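
For completeness, that 14 W figure is just the product of the two readings, \$P = V \times I = 7\ \text{V} \times 2\ \text{A} = 14\ \text{W}\$, assuming both readings applied to the resistor at the same time (which, as noted below, they probably did not).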

It is worth noting that when I measured the current with my multimeter, the circuit board didn't tolerate it (the LED misbehaved as described at the beginning), presumably because the multimeter offers very little resistance in current-measuring mode.

Could someone please clarify what I'm missing here, as my observations don't seem to match my measurements at all? Sorry for any obvious or stupid oversights; I do not yet understand how the different mechanisms of electricity work.

Here is a picture of the board I am working on: click

The resistor in that picture is 47 ohms (though it measures 1.1K when the board is powered). The voltage across the resistor is 7V, giving me a calculated current of about 150mA and a power of roughly 1 watt.
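
Working that arithmetic out explicitly from the measured 7 V and the nominal 47 ohms:

\$\$I = \frac{V}{R} = \frac{7\ \text{V}}{47\ \Omega} \approx 0.15\ \text{A}, \qquad P = \frac{V^2}{R} = \frac{(7\ \text{V})^2}{47\ \Omega} \approx 1\ \text{W}\$\$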

Best Answer

To determine the actual power dissipation, you should measure the voltage across the resistor while the circuit is powered on.

The power dissipation is then \$V^2/R\$. The most voltage a 47 ohm resistor can have across it and still stay within its 1/4-W rating is about 3.4V.
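
For the 47 ohm case, that limit comes from rearranging \$P = V^2/R\$ for the voltage:

\$\$V_{\max} = \sqrt{P_{\max}\, R} = \sqrt{0.25\ \text{W} \times 47\ \Omega} \approx 3.4\ \text{V}\$\$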

It does not matter what the current capability of the supply is (and you should never put an ammeter directly across a voltage source, because you could damage the source or your meter); what matters is the voltage under load and the resistance.

If you have multiple resistors in series or in parallel, each will have its own dissipation, depending on its resistance and the voltage across it.
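
As a rough illustration, assume about 7 V ends up across the combination (in practice the voltage under load will be lower, as noted below). Two 22 ohm resistors in series split that voltage, so each sees roughly 3.5 V and dissipates

\$\$P = \frac{(3.5\ \text{V})^2}{22\ \Omega} \approx 0.56\ \text{W},\$\$

while a single 22 ohm resistor with the full 7 V across it dissipates

\$\$P = \frac{(7\ \text{V})^2}{22\ \Omega} \approx 2.2\ \text{W},\$\$

nearly nine times its 1/4-W rating.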

When you use (or abuse) the LED, the voltage across it changes, so the voltage across the resistor(s) also changes from its open-circuit value, and of course you are also changing the resistance as you swap parts around.
