Electronic – Rules for choosing the power rating of a resistor

power, resistors, tolerance

Let's say we're designing a system, and within it we have some resistor \$R\$ with a voltage difference \$V\$ across its terminals. Then it dissipates power \$P = V^2 / R \$.

Of course the resistor has a certain tolerance, so its value lies between \$R - \Delta R\$ and \$R + \Delta R\$.

The voltage also has a certain "tolerance" (maximum ripple on top of a regulated nominal voltage), so its value lies between \$V - \Delta V\$ and \$V + \Delta V\$.

When deciding what power rating to pick for the resistor, I believe it makes sense to put ourselves in the worst possible situation, so the highest possible dissipated power would be \$ P_{max} = (V + \Delta V)^2 / (R - \Delta R) \$.
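As a quick sanity check, here is a minimal Python sketch of that worst-case calculation; the component values (5 V nominal with 0.25 V of ripple across a 100 Ω, 5 % resistor) are made up purely for illustration.

```python
# Worst-case dissipation sketch -- the numbers below are illustrative
# placeholders, not values from the question.
V_nom = 5.0        # nominal voltage across the resistor, volts
dV = 0.25          # maximum ripple / regulation error, volts
R_nom = 100.0      # nominal resistance, ohms
dR = 0.05 * R_nom  # 5 % tolerance expressed in ohms

P_nominal = V_nom**2 / R_nom
P_max = (V_nom + dV)**2 / (R_nom - dR)  # highest voltage into lowest resistance

print(f"Nominal dissipation:    {P_nominal:.3f} W")
print(f"Worst-case dissipation: {P_max:.3f} W")
```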

If we assume we know the values of \$ \Delta R\$ and \$ \Delta V\$ precisely, how much bigger than \$ P_{max}\$ should the resistor's power rating be? Since we are already assuming the worst possible scenario, should we still give ourselves some extra margin anyway?

Also, is there a more robust method to compute power rating than the one I mentioned?

Best Answer

In engineering for commercial test equipment, we had a default rule that resistors should never be run beyond 50% of their rated power. This rule meant we didn't have to consider their contribution to the overall equipment MTBF, since resistor manufacturers rate dissipation at a specified ambient temperature and for a lifetime of some X0000 hours. Every so often we couldn't meet this generous over-design, and then we'd need to do detailed calculations, including ambient temperature.
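As a minimal sketch of that rule, assuming a hypothetical list of common standard power ratings, the selection could look like this: take the worst-case dissipation and pick the smallest standard rating whose derated (50%) capacity still covers it.

```python
# Sketch of the 50% derating rule. The list of standard ratings is an
# assumption for illustration; use your actual parts library.
STANDARD_RATINGS_W = [0.125, 0.25, 0.5, 1.0, 2.0, 3.0, 5.0]

def pick_power_rating(p_max_w, derating=0.5):
    """Return the smallest standard rating whose derated capacity covers p_max_w."""
    for rating in STANDARD_RATINGS_W:
        if rating * derating >= p_max_w:
            return rating
    raise ValueError("worst-case power exceeds the largest rating in the list")

# Example: a worst-case dissipation of 0.29 W needs at least a 1 W part
# when derated to 50%.
print(pick_power_rating(0.29))  # -> 1.0
```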

For military or auto under-hood electronics, you would derate even further for reliability.