Electronic – How stable are digital multimeters’ parameters

measurement, multimeter, voltage measurement

I live in a region where the standard mains voltage is 220 volts. Recently I tried to measure it using a digital multimeter. The multimeter's specs say it has a maximum error of 1.2 percent of the displayed value plus 3 units (volts, in my case).

So I measured and got 241 volts, which kind of scared me. According to the specs, the maximum error would be

241 × 1.2 / 100 + 3 = 5.892

which is about 6 volts, so the real value is somewhere in the range 235–247 volts, which scares me even more.
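In code, the same worst-case bound works out like this (a minimal sketch; the function name is mine, and it assumes the "3 units" in the spec each equal 1 V on the range I used):

```python
def dmm_uncertainty(reading, pct=1.2, counts=3, count_value=1.0):
    """Worst-case error for a '±(pct% of reading + counts)' spec.

    Assumes one count equals 1 V on the range in use (an assumption;
    the meter's manual defines the actual value of a digit).
    """
    return reading * pct / 100 + counts * count_value

reading = 241.0
err = dmm_uncertainty(reading)
print(f"±{err:.3f} V -> true value in [{reading - err:.1f}, {reading + err:.1f}] V")
# ±5.892 V -> true value in [235.1, 246.9] V
```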

The spec further says that the maximum error is only guaranteed for the first year of the multimeter's lifetime, which implies that the error might grow as the multimeter ages.

My question is: how much worse will the error become? The multimeter is about 5 years old now, so what measurement error should I expect?

Best Answer

I expect your multimeter is rather more accurate than the grid voltage. Grid voltage can vary widely from the supposed nominal value: assume at least ±10% off the top, and more should not be surprising. 220 V ±10% is 198–242 V, so your reading is within spec. There are places in the world where I would trust the power less than others, and frankly eastern Europe is in the second tier. It's not as bad as places where outages are common, but more than 10% variation shouldn't surprise anyone.
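As a quick check, here is a small sketch of that tolerance-band arithmetic (the function name is just for illustration):

```python
def within_tolerance(reading, nominal=220.0, tol=0.10):
    """Return the allowed band for nominal ±tol and whether reading falls in it."""
    lo, hi = nominal * (1 - tol), nominal * (1 + tol)
    return lo, hi, lo <= reading <= hi

lo, hi, ok = within_tolerance(241.0)
print(f"Allowed band: {lo:.0f}-{hi:.0f} V, reading in spec: {ok}")
# Allowed band: 198-242 V, reading in spec: True
```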

Then there is the whole EU standardization attempt. Various countries had somewhat different mains voltages, like 220 V and 240 V, and the EU wanted to standardize this. But instead of actually changing the voltages, they just re-defined the nominal as 230 V with enough tolerance slop that everyone was already within spec!