Calculating AC Input Power

Tags: measurement, power, power-supply, testing

I'm building an AC to DC power supply that connects to mains. There's an AC to DC converter, a buck converter, and a smoothing LDO that outputs a DC voltage adjustable with pots.

I'm verifying the supply and have been trying to take efficiency measurements. With my limited equipment I can only measure input Vrms, input Irms, output DC voltage, and output DC current. However, if I just compute \$\frac{V_{dc} \cdot I_{dc}}{V_{rms} \cdot I_{rms}}\$, my efficiency goes above 100% at the higher loads, which is obviously wrong.

I know there's a difference between apparent and real power when driving reactive loads with AC, but when I measure the Vrms and Irms going into my circuit, isn't their product the real power? Applying a power factor would only make things worse: since \$P = V_{rms} I_{rms} \cos\varphi \le V_{rms} I_{rms}\$ for sinusoidal waveforms, dividing the output power by an even smaller input power makes my circuit look even more unphysically efficient.

Is there anything else that I haven't taken into account?

Best Answer

The product of Vrms and Irms is apparent power, not real power.

Imagine the load were a perfect capacitor and the input a pure sine wave.

The RMS voltage is the mains voltage. The RMS current is \$V_{rms}/X_C\$, where \$X_C = \frac{1}{2 \pi f C}\$.

Say 120 V RMS at 60 Hz into 1 \$\mu\$F: then \$X_C \approx 2653\ \Omega\$ and \$I_{rms} \approx 45\ \text{mA}\$, so the naive product \$V_{rms} \times I_{rms}\$ comes out to about 5.4 VA of apparent power.

However, the real power is exactly zero: the capacitor's current leads the voltage by 90°, so over each cycle it returns every joule it takes.
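
To see this explicitly, here is a short derivation using the ideal-capacitor relation \$i = C\,\frac{dv}{dt}\$:

\$\$v(t) = \sqrt{2}\,V_{rms}\sin(\omega t), \qquad i(t) = C\frac{dv}{dt} = \sqrt{2}\,\omega C\,V_{rms}\cos(\omega t)\$\$

\$\$p(t) = v(t)\,i(t) = 2\,\omega C\,V_{rms}^2\sin(\omega t)\cos(\omega t) = \omega C\,V_{rms}^2\sin(2\omega t)\$\$

The instantaneous power \$p(t)\$ is a pure \$\sin(2\omega t)\$ term, whose average over any whole cycle is zero, even though the apparent power \$V_{rms} I_{rms} = \omega C\,V_{rms}^2\$ is not.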

To get the real input power you need the average of the instantaneous product of voltage and current, taken over a whole number of cycles. In other words, you need a wattmeter.
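
As an illustration, here is a minimal Python sketch of what a wattmeter computes. The synthetic arrays stand in for simultaneously sampled \$v(t)\$ and \$i(t)\$ (e.g. from a two-channel scope with a current probe), using the capacitor example above:

```python
import math

# One cycle of 120 V RMS, 60 Hz mains driving an ideal 1 uF capacitor.
F = 60.0        # mains frequency, Hz
C = 1e-6        # capacitance, F
V_RMS = 120.0   # mains RMS voltage, V
N = 10_000      # samples per cycle

w = 2 * math.pi * F
dt = 1.0 / (F * N)
t = [k * dt for k in range(N)]
v = [math.sqrt(2) * V_RMS * math.sin(w * tk) for tk in t]
# Ideal capacitor: i = C * dv/dt, a 90-degree-leading cosine.
i = [math.sqrt(2) * w * C * V_RMS * math.cos(w * tk) for tk in t]

# Real power: average of the instantaneous product (what a wattmeter reads).
p_real = sum(vk * ik for vk, ik in zip(v, i)) / N

# Apparent power: product of the two RMS values (the naive calculation).
v_rms = math.sqrt(sum(vk * vk for vk in v) / N)
i_rms = math.sqrt(sum(ik * ik for ik in i) / N)
s_apparent = v_rms * i_rms

print(f"real power     P = {p_real:8.3f} W")      # ~0.000 W
print(f"apparent power S = {s_apparent:8.3f} VA") # ~5.43 VA
```

Note that the two RMS values alone cannot recover \$P\$: you need samples of \$v\$ and \$i\$ taken at the same instants, so their product preserves the phase information.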