Electronic – Accuracy of a multimeter over a 10-year period

accuracy, multimeter

Multimeter datasheets contain accuracy specifications. One parameter is usually the accuracy over a period of 1 year. I understand this to mean the multimeter can be off by the specified value.

For example, the Keithley 2000 on the 100 mV range:
1-year accuracy = 0.005% (of reading) + 0.0035% (of range)

or the Siglent SDM2055 on the 200 mV range:
1-year accuracy = 0.015% (of reading) + 0.004% (of range)
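(For reference, the two terms combine additively: the worst-case error is reading × %-of-reading plus full range × %-of-range. A minimal sketch of that arithmetic, with an illustrative helper name and the Keithley numbers above:)

```python
def dmm_uncertainty(reading, rng, pct_reading, pct_range):
    """Return the +/- uncertainty, in the same unit as `reading`,
    from a '% of reading + % of range' accuracy spec."""
    return reading * pct_reading / 100 + rng * pct_range / 100

# Keithley 2000, 100 mV range, full-scale 100 mV reading (values in volts):
u = dmm_uncertainty(0.100, 0.100, 0.005, 0.0035)
print(f"+/- {u * 1e6:.1f} uV")  # +/- 8.5 uV, i.e. about 0.0085% of the reading
```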

But the question is: what accuracy do I have to assume over a period of 10 years?
Do I have to multiply the 1-year accuracy by 10? Or is it not nearly that easy (and not that bad)?

No hobbyist is going to calibrate their equipment every year, so it would be useful to know how the accuracy drifts over a longer period of time.

Best Answer

Generally that figure is specified because you are supposed to calibrate your equipment annually.

If you don't, all bets are off.

You cannot extrapolate from one to the other, and aging will not be linear.