Electronic – Standard Volt and Ampere

Tags: amperage, calibration, electrical, standard, volts

A recent question on here asking about how to calculate the accuracy of a circuit got me thinking about calibration in general.

In particular, as EEs we routinely use the volt and the ampere as units, and yet both are rather vague and hard things to quantify.

It used to be that a volt was defined by a "standard cell" kept locked up in a vault somewhere, but that changed to the "Josephson voltage standard": a complex system that uses a superconducting integrated circuit operating at 70–96 GHz to generate stable voltages that depend only on an applied frequency and fundamental constants.
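For the curious, the relation behind that standard is remarkably simple: a Josephson junction driven at microwave frequency $f$ develops a voltage in exact quantized steps set only by the frequency and the ratio of two fundamental constants. A sketch of the numbers:

$$ V = \frac{n f}{K_J}, \qquad K_J = \frac{2e}{h} \approx 483\,597.8\ \text{GHz/V} $$

At $f = 70$ GHz a single step ($n = 1$) is only about 145 µV, which is why practical standards put thousands of junctions in series to reach 1 V or 10 V outputs.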

A Josephson standard is not exactly something one could throw together in a basement, or even in the test engineering department of most companies.

The ampere is worse. It is defined in SI as "That constant current which, if maintained in two straight parallel conductors of infinite length, of negligible circular cross-section, and placed one metre apart in vacuum, would produce between these conductors a force equal to 2×10⁻⁷ newtons per metre of length."
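That number is at least not arbitrary: it drops straight out of Ampère's force law for two parallel currents, which under the old definition amounted to fixing $\mu_0$ exactly:

$$ \frac{F}{L} = \frac{\mu_0 I_1 I_2}{2\pi d} = \frac{(4\pi\times10^{-7}\ \text{N/A}^2)(1\ \text{A})^2}{2\pi\,(1\ \text{m})} = 2\times10^{-7}\ \text{N/m} $$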

I have NO IDEA how anyone could measure that.

The ohm used to be defined by a column of mercury of specified length and mass, but that was abandoned in favor of making it a unit derived from 1 V and 1 A.
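The derivation is the trivial part: given independent realizations of the volt and the ampere, Ohm's law closes the triangle:

$$ 1\ \Omega = \frac{1\ \text{V}}{1\ \text{A}} $$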

This all brings me to wonder how much of what we use is calibrated to someone else's meter. And how many of those meters are calibrated to yet someone else's… and on and on. Seems like one big house of cards.

Is there some sort of intermediate standard of measures or equipment you can buy to use as calibrated references for 1 V, 1 A, and 1 Ω? (Obviously you only need any two of those.)

Bonus question: Is there some sort of certification sticker one should look for when buying a meter or other equipment that indicates it is indeed tested to the actual SI values vs tested against, say, a Fluke?

Best Answer

This all brings me to wonder how much of what we use is calibrated to someone else's meter. And how many of those meters are calibrated to yet someone else's… and on and on. Seems like one big house of cards.

and...

Is there some sort of certification sticker one should look for when buying a meter or other equipment that indicates it is indeed tested to the actual SI values vs tested against, say, a Fluke…

You have described precisely what happens. You don't need to have an exotic, expensive "golden standard" in your company's in-house lab, as long as you have certified calibration traceability (if you need it) back to an accredited lab that actually has such a standard.

And yes, the accredited lab actually puts a sticker on your instrument, with the expiration date of the calibration validity on it. I've seen it myself.

In-house, you'll find yourself in one of these situations:

[Figure: Traceability chain of calibrations]
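The chain is less of a house of cards than it looks, because uncertainty is tracked at every link: each calibration level contributes its own standard uncertainty, and independent contributions combine in quadrature (root-sum-square) per the GUM. A minimal Python sketch, with made-up uncertainty figures for each level:

    from math import sqrt

    # Hypothetical standard uncertainties (parts per million) contributed
    # at each level of a traceability chain for a 10 V reference.
    chain = [
        ("NMI Josephson standard",     0.01),  # primary realization
        ("accredited calibration lab", 0.5),   # transfer standard + cal
        ("company in-house lab",       2.0),   # working standard + cal
        ("bench multimeter",           10.0),  # instrument + its cal
    ]

    # Independent contributions combine in quadrature (root-sum-square).
    total_ppm = 0.0
    for level, u_ppm in chain:
        total_ppm = sqrt(total_ppm**2 + u_ppm**2)
        print(f"{level:28s} +/- {u_ppm:5.2f} ppm -> cumulative {total_ppm:5.2f} ppm")

The cumulative figure is dominated by the weakest link (your bench instrument), not by the depth of the chain, which is why traceability works in practice.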

I recall that when I was working in the aerospace industry, we were required to have every measurement instrument calibrated, with its sticker, traceable calibration certificate, and associated documentation. The test procedures ("work standards"), the instruments to be used, and their calibration traceability all had to be exhaustively documented and submitted for customer approval well before any actual delivery testing was performed on the product.

Of course, every industry has its own requirements and quality levels. I don't think anyone can reasonably expect a Chinese manufacturer of cheap multimeters to have such a calibration program in place; it wouldn't make sense to them or to their customers.

Back to the traceability chain: the NMIs in the figure above are the National Metrology Institutes. The NMI for the USA is NIST. A closer look at the "food chain" of metrology is illustrated in the figure below, in case you're wondering:

[Figure: Structure of traceability of calibrations]

Source of images: Calibration and traceability in measuring technology