What’s the right kind of reference to calibrate a measurement instrument?

Tags: calibration, load, measurement

I have been working on a calibration system for a multimeter I'm developing.

To do this, I'm calibrating the readings against Fluke meters (basically, my instrument has to match the Fluke ones).

However, at college I was told that this is not the right way to do it, because I'm assuming the Fluke meters are correct and have no error margin; instead, I was told to do the calibration using normalized loads.

I don't know what these "normalized loads" are or where to find them, so I came here to ask: what's the right way to calibrate a measurement instrument? Where can I find these kinds of references?

Best Answer

There are two entirely separate issues here: precision and accuracy.

If you adjust one meter to read the same as the other, you are adjusting the second to have the same precision as the first. That is, if A reads 9.781 volts and B reads 9.781 volts, each is providing the same precision.

However, if both are measuring a 5.221 volt source, both are equally inaccurate.
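To make the distinction concrete, here is a minimal sketch in Python using the numbers from the example above; the readings and the "true" value are purely illustrative, not real measurements:

```python
# Sketch of the precision-vs-accuracy distinction, using the example
# readings above (values are illustrative, not measured data).
true_voltage = 5.221   # what the source actually is
meter_a = 9.781        # reading from meter A
meter_b = 9.781        # reading from meter B, adjusted to match A

agreement = abs(meter_a - meter_b)   # how well the meters match each other
error_a = meter_a - true_voltage     # how far each is from the true value
error_b = meter_b - true_voltage

print(f"agreement between meters: {agreement:.3f} V")                 # 0.000 V -- they "agree"
print(f"error of A: {error_a:+.3f} V, error of B: {error_b:+.3f} V")  # both about +4.560 V off
```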

So: are you trying to calibrate your meter to "the Fluke ones" (matching precision), or are you trying for accuracy? If the first, you can use almost any standards you like as long as you are sure that they don't drift with time, temperature, or anything else. You measure the standards with "the Fluke ones", record the readings, then adjust your other meters to match.
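One common way to do the "adjust to match" step is a two-point gain/offset correction fitted from the recorded reference readings. A minimal sketch, assuming two stable standards measured by both meters; all names and numbers here are made up for illustration:

```python
# Sketch of a two-point (gain/offset) transfer calibration: the standards are
# first measured with the trusted reference meter, then with the meter being
# adjusted, and a linear correction is fitted so the latter matches the former.

def fit_gain_offset(ref_readings, dut_readings):
    """Fit corrected = gain * raw + offset from two paired readings."""
    r1, r2 = ref_readings[0], ref_readings[-1]
    d1, d2 = dut_readings[0], dut_readings[-1]
    gain = (r2 - r1) / (d2 - d1)
    offset = r1 - gain * d1
    return gain, offset

def apply_correction(raw, gain, offset):
    return gain * raw + offset

# Example: two stable standards read by the reference meter and by the new meter.
reference  = [1.0002, 9.9987]   # readings from the trusted meter, in volts
under_test = [0.9781, 9.9421]   # raw readings from the meter being calibrated

gain, offset = fit_gain_offset(reference, under_test)
print(f"gain = {gain:.5f}, offset = {offset:+.5f} V")
print(f"corrected mid-range reading: {apply_correction(5.0123, gain, offset):.4f} V")
```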

If you're trying for accuracy, you'll need to do one of two things: either buy or build a good reference, or rent or borrow a very good, high-resolution meter to find the errors in "the Fluke ones". You should be able to find either solution at a test equipment house. There are many online.
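If you go the borrowed-meter route, one way to use the comparison is to record the error of your working meter at a few points across its range and interpolate between them. A minimal sketch with made-up calibration points (an illustration only, not a substitute for a proper uncertainty analysis):

```python
# Sketch of characterizing a working meter against a borrowed high-accuracy
# meter: record paired readings at several points, then interpolate the
# correction for readings in between. Numbers are made up for illustration.
from bisect import bisect_left

# (working-meter reading, high-accuracy-meter reading) pairs, sorted by the first
cal_points = [(1.000, 1.0003), (5.000, 5.0011), (10.000, 10.0025)]

def corrected(raw):
    """Linearly interpolate the high-accuracy value for a raw working-meter reading."""
    xs = [p[0] for p in cal_points]
    ys = [p[1] for p in cal_points]
    if raw <= xs[0]:
        return ys[0] + (raw - xs[0])     # extend with unity slope below the range
    if raw >= xs[-1]:
        return ys[-1] + (raw - xs[-1])   # and above the range
    i = bisect_left(xs, raw)
    t = (raw - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + t * (ys[i] - ys[i - 1])

print(f"{corrected(7.500):.4f} V")  # interpolated between the 5 V and 10 V points
```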