Electrical – Which ammeter to pick for best accuracy and why

current-measurement, measurement

This is merely a theoretical question.
If we want to achieve the best possible accuracy while measuring electrical current, and we have two digital ammeters from two different manufacturers, both of which claim the same order of accuracy, how can we determine which ammeter is more accurate?
This is a question from my Electrical Measurements book. The only way I can think of is to take multiple measurements with both ammeters, calculate the mean value of the measurements for each ammeter, and compare it to the correct value (best estimate). Whichever one has the lower absolute error (\$\Delta I = I_{mean} - I_{correct}\$) is the more accurate one. Am I right, or are there other things to consider?
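As a minimal sketch of the procedure I have in mind (the readings and the reference value below are made up for illustration):

```python
import statistics

# Hypothetical repeated readings (in amperes) from two ammeters at one test point
readings_a = [1.002, 1.004, 0.999, 1.003, 1.001]
readings_b = [0.996, 0.995, 0.998, 0.994, 0.997]
i_correct = 1.000  # assumed reference ("correct") value from a calibration standard

def absolute_error(readings, reference):
    """Mean of the readings minus the reference value, as proposed above."""
    return statistics.mean(readings) - reference

for name, readings in (("A", readings_a), ("B", readings_b)):
    err = absolute_error(readings, i_correct)
    print(f"Ammeter {name}: mean = {statistics.mean(readings):.4f} A, "
          f"absolute error = {err:+.4f} A")
```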

Best Answer

That procedure will tell you which meter happens to be better calibrated for that particular value of current at that moment in time.

The manufacturer's specifications (assuming they aren't lying) cover the full range of measurements, considering things like the absolute accuracy of the internal reference, converter nonlinearities, component matching and how all of this changes over time (aging).
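To illustrate what such a specification guarantees, here is a rough sketch of how a typical "±(% of reading + counts)" figure translates into an error bound across the range. The spec numbers below are invented for the example, not taken from any real meter:

```python
# Convert a hypothetical ±(% of reading + counts) spec into a worst-case error bound.

def spec_uncertainty(reading_a, pct_of_reading, counts, resolution_a):
    """Worst-case error allowed by the spec at a given reading, in amperes."""
    return reading_a * pct_of_reading / 100.0 + counts * resolution_a

# Assumed 2 A range with 0.1 mA resolution and a ±(0.5 % + 3 counts) spec
for reading in (0.1, 0.5, 1.0, 2.0):
    bound = spec_uncertainty(reading, pct_of_reading=0.5, counts=3, resolution_a=0.0001)
    print(f"Reading {reading:.1f} A -> guaranteed within ±{bound * 1000:.2f} mA")
```

The point is that the manufacturer guarantees this bound everywhere on the range and over the meter's calibration interval, which is a much stronger statement than one spot check against a reference.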

You can't assume that any given meter is better than its specifications based on a few measurements.