Electronic – What exactly does a resistor’s tolerance rating mean

resistors, theory, tolerance

I thought that I had this pretty much figured out over 30 years ago during my initial schooling, but various answers posted to a support forum of a major semiconductor manufacturer (name withheld to protect the possibly innocent) by staff support engineers (and no countering posts) have caused me to second-guess myself. There are other questions concerning resistor tolerance, but none even mention the problem I describe below.

EDIT: This concerns the statements that the chip maker's support engineers make about why there are differences between measurement runs. The phrasing used makes it sound as if resistors change randomly within their tolerance rating. Since nobody called them on this explanation (which was given over a year ago, which is why I did not post in the thread), and it was given by two different people, I came here asking. The chip in question is a type of ADC. The excitation signal is give or take a 125 kHz, 2 Vpp sine wave with a max current around 30 mA.

I always thought a resistor's tolerance rating was the permissible difference between its actual value and its posted value.
Meaning if we took a group of 100 ohm resistors, they could vary as follows:
A 5% resistor could have any value that was -5 or +5 ohms from 100.
A 1% resistor could have any value that was -1 or +1 ohms from 100.
And a 0.1% resistor could have any value that was -0.1 or +0.1 ohms from 100.
But once it was manufactured, and discounting abuse, its value would not change, due to the tolerance rating. I'm not talking about changes due to temperature; that is covered by its TCR. Nor am I talking about aging: these are measurements taken within an hour (or less) of each other.
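The arithmetic in the list above can be sketched in a few lines of Python. This is just a sanity check of the question's own examples (the `tolerance_bounds` helper name and the 100 ohm nominal are illustrative, not from any datasheet):

```python
def tolerance_bounds(nominal_ohms, tolerance_pct):
    """Return the (min, max) resistance permitted by a tolerance rating."""
    delta = nominal_ohms * tolerance_pct / 100.0
    return nominal_ohms - delta, nominal_ohms + delta

# The three tolerances from the question, applied to a 100 ohm nominal value
for pct in (5, 1, 0.1):
    lo, hi = tolerance_bounds(100, pct)
    print(f"{pct}%: {lo} to {hi} ohms")
```

The point being: the tolerance defines a fixed acceptance window around the nominal value, not a band the part wanders around in.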

What caused this question is that more than once, in answering a question, they stated that the varying values between measurement runs could be due to a resistor's "tolerance". As if a resistor's value changes randomly within its tolerance rating, something I have never heard before. If they had stated it might be due to resistance changes with temperature, I would have thought nothing of it. Or if the poster had been someone other than a tagged support engineer. And there were no posts questioning these conclusions, which messed with my head even more.

Maybe I slept through that part of class and it's such a common thing no one bothers to talk about it, but I'm pretty sure that a resistor's value does not change if there are no outside forces (e.g. temperature) influencing it. Am I correct, or have I somehow managed to misunderstand how a resistor works for a LONG time? Or does tolerance have a meaning I somehow just have not encountered before?
I do have medical problems, but I'm fairly sure that they have not degraded my thought processes that much (if they have, I need to stop talking).

Best Answer

A decent manufacturer will specify pretty clearly in the datasheet what is meant. For example, here's the relevant table from one vendor I've used:

*(image: vendor datasheet table of resistance tolerance and TCR specifications)*

As you expected, the resistor tolerance is the limit on the resistance at 25 °C, and the variation over temperature is covered by a separate TCR specification.

But remember the resistance can also change due to other environmental factors, such as prolonged operation at high or low temperature, operation at high voltage, mechanical stress, etc. If any of these stresses is applied to your device, you could see its value vary from day to day, with the value at 25 °C still remaining within the specified tolerance limits.
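To make the tolerance/TCR separation concrete, here is a minimal sketch assuming the common first-order linear TCR model; the 100 ppm/°C figure and the helper name are illustrative assumptions, not values from the datasheet above:

```python
def resistance_at_temp(r_25_ohms, tcr_ppm_per_c, temp_c):
    """Resistance at temp_c, given the 25 °C value and a linear TCR in ppm/°C."""
    return r_25_ohms * (1 + tcr_ppm_per_c * 1e-6 * (temp_c - 25))

# A 100 ohm part with a 100 ppm/°C TCR, measured at 60 °C
print(resistance_at_temp(100, 100, 60))  # 100.35 ohms
```

Notice the temperature-induced shift here (0.35 ohms over a 35 °C rise) is far smaller than even a 1% tolerance window, which is exactly why tolerance and TCR are specified separately.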