Electronic – inconsistent electricity meter readings

Tags: amperage, meter, voltage, watts

I have the electricity meter pictured.
[image: electricity meter]

When I attach various devices, the readings (amps, volts, watts) don't quite agree with each other: the wattage I'd expect from multiplying amps by volts is off from the displayed wattage, sometimes by a few percentage points and sometimes by much more.

For example, an ethernet switch with a pair of servers on standby reads 0.865A, 121.2V, 97.4W versus an expected 104.8W, which represents a 7% difference compared to the midpoint of the two values.

A pair of servers on standby alone gives worse results: 0.291A, 121.9V, 25.6W. That wattage, against an expected 35.5W, represents a 32% difference compared to the midpoint.

A 22 inch LCD monitor is worse yet: readings of 0.478A, 120.9V, and 36.7W. Note that I*E = 57.8W, P/I = 76V, and P/E = 0.3A. The expected wattage represents a 45% difference compared to the midpoint.
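
For reference, here is the arithmetic behind those percentages as a quick Python sketch (the values are the readings quoted above; "expected" is the naive W = V * I):

    readings = [
        # (label, amps, volts, displayed watts)
        ("switch + 2 servers", 0.865, 121.2, 97.4),
        ("2 servers",          0.291, 121.9, 25.6),
        ("22in LCD monitor",   0.478, 120.9, 36.7),
    ]

    for label, amps, volts, watts in readings:
        expected = amps * volts                        # naive W = V * I
        midpoint = (expected + watts) / 2
        pct = abs(expected - watts) / midpoint * 100
        print(f"{label}: expected {expected:.1f} W, read {watts} W, {pct:.0f}% off")

This prints roughly 7%, 32%, and 45% for the three loads.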

The label on the monitor says 100V-240V 50/60Hz 1.5A.

Why would the meter produce such readings?

Best Answer

Your appliances are not using the AC waveform evenly.

A heater or incandescent bulb would draw evenly -- at any given instant, amps would be proportional to volts, and the numbers would be as you expect. Try it on a heater.

But everything you mention is an electronic load, with a switching power supply. They tend to draw their highest current on the shoulders of the AC sinewave, drawing less current in the higher-voltage center of the sinewave since their goal is constant watts.

So if a meter is taking thousands of samples per second, you might have a load that truly draws 30 watts, but peaks at 1A on the shoulders and averages 0.4A (since its ampere draw is close to nil at the center and at the zero-crossings). Which one does your meter report?

VA

It would be neat if the power company/grid only had to generate the 30 watts that you're actually using. But that would be a big mess to distribute (lots of harmonics that transformers aren't tuned for)... so very roughly speaking, the power company must generate and distribute that full 1A at 120V. How do we describe that in material terms? We can't say watts, since that unit is used for the actual draw, 30W. So they created a new unit called VA - Volts x Amps.
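
To make the watts-vs-VA distinction concrete, here's a small Python simulation. The "switcher" current waveform is invented purely for illustration (1A bursts on the shoulders of the sine, nothing at the center or zero-crossings), so its numbers won't match the 30W hypothetical exactly. True watts is the average of instantaneous volts x amps over a cycle; VA is RMS volts x RMS amps.

    import math

    SAMPLES = 10_000              # samples per AC cycle
    V_RMS = 120.0
    V_PEAK = V_RMS * math.sqrt(2)

    def v(t):                     # instantaneous line voltage, t in [0, 1) = one cycle
        return V_PEAK * math.sin(2 * math.pi * t)

    def i_resistive(t):           # heater/incandescent: amps track volts (480 ohm -> 30 W)
        return v(t) / 480.0

    def i_switcher(t):            # invented peaky draw: 1 A only on the wave's shoulders
        s = math.sin(2 * math.pi * t)
        return math.copysign(1.0, s) if 0.3 < abs(s) < 0.8 else 0.0

    for name, i in [("resistive", i_resistive), ("switcher", i_switcher)]:
        ts = [k / SAMPLES for k in range(SAMPLES)]
        watts = sum(v(t) * i(t) for t in ts) / SAMPLES           # true power
        i_rms = math.sqrt(sum(i(t) ** 2 for t in ts) / SAMPLES)  # RMS amps
        i_avg = sum(abs(i(t)) for t in ts) / SAMPLES             # plain average amps
        va = V_RMS * i_rms
        print(f"{name}: {watts:.1f} W, {va:.1f} VA, avg {i_avg:.2f} A, PF {watts/va:.2f}")

The resistive load comes out at 30 W and 30 VA; the peaky load's VA is roughly double its true watts, so a meter that reports VA as "watts" will read very differently from one that averages instantaneous volts x amps.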

A difference between Watts and VA is typical of switching power supplies, especially cheapies. Switchers can be designed to draw evenly like a lightbulb, but it costs more.

Power factor

Power factor is an old measurement designed to reflect how inductive loads' current draws are out-of-phase with voltage. This meaning has been stretched to describe this situation.

Power factor is defined as watts / VA.  

So this hypothetical 30W, 120VA switcher will have a rather terrible 0.25 or 25% power factor. Power factor is unitless - a ratio - since both watts and VA are derived from volts*amps.

A heater or incandescent bulb has a power factor of 1.00 or 100%.
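
Applying that definition back to the readings in the question - assuming the meter's amp and volt figures are RMS, so that amps x volts is VA - gives plausible power factors for electronic loads:

    readings = [
        ("switch + 2 servers", 0.865, 121.2, 97.4),
        ("2 servers",          0.291, 121.9, 25.6),
        ("22in LCD monitor",   0.478, 120.9, 36.7),
    ]
    for label, amps, volts, watts in readings:
        va = amps * volts                       # VA, if amps and volts are RMS
        print(f"{label}: {watts} W / {va:.1f} VA -> PF {watts / va:.2f}")

That works out to power factors of about 0.93, 0.72, and 0.63 - nothing inconsistent, just watts and VA diverging the way they do for switching supplies.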

So - how is this meter measuring? You'd have to consult the manual or the manufacturer to see how it deals with VA and power factor. My name-brand Kill-a-Watt will display watts, VA and PF directly.
