AC Watts to DC Watts Calculation

Tags: ac, conversion, dc, power-supply, watts

Okay, so I'm new to the whole electricity field and I'm trying to get a handle on something, so I apologize for any ignorance displayed here.

I've purchased a new graphics card for my computer and I'm worried that when I'm maxing out its power draw, my PSU (power supply) will not be able to output enough power.

I understand that the AC wattage going into the computer will be higher than the DC wattage coming out, since some power is lost in the conversion, and that the size of that loss depends on the efficiency of my PSU.

So I bought a P3 KILL A WATT 4400.01 to measure the AC wattage going into the PSU. I have a PSU that, in theory, should output up to 500 watts. When I stress test the card and look at the AC wattage (amps x volts x PF) being pulled, it doesn't go over 450 watts. However, the AC volt-amperes being pulled into the system are over 600, so I'm a little confused/worried. I know, in theory, that the DC wattage being used should be lower than the AC wattage drawn, due to the loss in conversion, and that the system draws extra AC wattage to meet the DC wattage needs.
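For reference, here's the back-of-the-envelope math I've been doing with those readings (the 87% efficiency is purely an assumption on my part, since I don't know my PSU's actual rating):

```python
# Rough arithmetic with the Kill A Watt readings above.
ac_real_watts = 450.0      # "Watt" reading (V x A x PF)
ac_volt_amperes = 600.0    # "VA" reading (V x A)
assumed_efficiency = 0.87  # assumed AC-to-DC conversion efficiency, not measured

power_factor = ac_real_watts / ac_volt_amperes
estimated_dc_load = ac_real_watts * assumed_efficiency

print(f"Power factor:      {power_factor:.2f}")        # ~0.75
print(f"Estimated DC load: {estimated_dc_load:.0f} W")  # ~392 W against the 500 W rating
```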

So, my question is: should I be looking at the raw AC volt-amperes going into the PSU, or the actual AC wattage (volts x amps x PF), to judge what the PSU is actually drawing?

Best Answer

In short, you should look at AC watts (V x I x PF), because that is the measure of "real" power being drawn from the mains. The so-called volt-amperes figure is merely the product of RMS voltage and RMS current.

It is called apparent power because, if you measured only load voltage and current (as you might do for a DC load), it would "appear" that the load is drawing power equal to V x I.
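As a rough sketch of how the quantities relate (assuming sinusoidal voltage and current; the numbers are just the readings from the question plugged in):

```python
import math

# For a sinusoidal load: S^2 = P^2 + Q^2
apparent_power_va = 600.0  # S = Vrms x Irms, what the "VA" reading shows
real_power_w = 450.0       # P = Vrms x Irms x PF, what the PSU actually consumes

power_factor = real_power_w / apparent_power_va
reactive_power_var = math.sqrt(apparent_power_va**2 - real_power_w**2)

print(f"PF = {power_factor:.2f}")            # ~0.75
print(f"Q  = {reactive_power_var:.0f} VAR")  # ~397 VAR, exchanged with the mains but not consumed
```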

For more insight, you should read up on the concepts of apparent, real and reactive power.

Addendum: The above assumes that the current is sinusoidal. If your system draws a non-sinusoidal current, then you will have to be careful about how your measurement instrument (the KILL A WATT) handles that.
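To illustrate why that matters, here is a small sketch (using numpy, with 50 Hz mains and a made-up "peaky" current waveform of the sort drawn by a rectifier-capacitor front end without power-factor correction). Real power is the average of the instantaneous product v(t) x i(t), and the power factor can be well below 1 even with no phase shift at all, purely from distortion:

```python
import numpy as np

# One mains cycle, sampled finely. 50 Hz and 230 V RMS are assumed example values.
t = np.linspace(0, 1 / 50, 10_000, endpoint=False)
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t)  # sinusoidal mains voltage

# Made-up peaky current: flows only near the voltage peaks, in phase with the voltage.
i = np.where(np.abs(np.sin(2 * np.pi * 50 * t)) > 0.9,
             8.0 * np.sign(np.sin(2 * np.pi * 50 * t)), 0.0)

real_power = np.mean(v * i)                                       # P = average of v(t) * i(t)
apparent_power = np.sqrt(np.mean(v**2)) * np.sqrt(np.mean(i**2))  # S = Vrms * Irms

print(f"P  = {real_power:.0f} W")
print(f"S  = {apparent_power:.0f} VA")
print(f"PF = {real_power / apparent_power:.2f}")  # < 1 even though there is no phase shift
```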