Switching power supply: why is the primary current rating 1 A off when calculating power?

switch-mode-power-supply

So I bought a couple of switching power supplies for a project of mine, and I am currently calculating my actual power draw from the wall socket to figure out how to distribute power. I have made sure that my power supplies will never exceed 90% of their capacity; normal operation will be around 40-50%.

I tried to calculate the values and compare them with the ratings of my supplies, and they are way more off than I anticipated. Example:

I have a 5 V power supply that can deliver 20 amps of DC output. The AC input rating for this supply says: 230 V (170-264 V) 1.5 A. However, my calculation for 240 V gives 0.416 A, so the theoretical and stated input currents are about 1 amp apart. I figured there would be some waste, but not that much.
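For reference, this is the arithmetic I am doing, sketched out below (it assumes an ideal, lossless supply with unity power factor, and 240 V is just my measured mains voltage):

    # Ideal AC input current, assuming a lossless supply and unity power factor
    v_out = 5.0     # V, DC output voltage
    i_out = 20.0    # A, maximum DC output current
    v_in = 240.0    # V, mains voltage

    p_out = v_out * i_out    # 100 W of DC output power
    i_in = p_out / v_in      # ~0.417 A of AC input current
    print(f"Ideal input current at {v_in:.0f} V: {i_in:.3f} A")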

So:

  • Are my calculations way off?
  • Is the producer of the power supply just being overly careful?
  • Or does the power supply really waste that much energy in the conversion?

    • Does the power supply have a poor power factor?
    • Is the part-load efficiency poor?
Best Answer

this supply says: 230 V (170-264 V) 1.5 A. However, my calculation for 240 V gives 0.416 A.

You are comparing two different things. The supply spec gives you the absolute maximum current it will ever draw under any conditions of output load, input voltage, temperature, phase of the moon, and what you had for breakfast.

You are calculating actual current for a specific case based on output power. You are also doing it at a significantly higher input voltage than the minimum. Since this is a switching supply, the highest input current will occur at the minimum input voltage: at 170 V the supply draws roughly 40% more input current than at 240 V for the same power (240/170 ≈ 1.4).
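As a rough sketch of that effect alone (same 100 W output, still pretending the supply is lossless), the input current scales inversely with the input voltage:

    # Input current scales as 1/V_in for a fixed power draw
    p_out = 100.0                       # W, full output power (5 V x 20 A)
    for v_in in (240.0, 230.0, 170.0):  # your voltage, nominal, and rated minimum
        i_in = p_out / v_in             # ideal (lossless) input current
        print(f"{v_in:5.0f} V -> {i_in:.3f} A")
    # At 170 V the current is 240/170 = 1.41 times the 240 V value,
    # i.e. roughly 40% higher.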

Of course these power supplies aren't 100% efficient. Look at the datasheet to see how much loss there is at full output power and the worst-case input voltage; that typically adds another 15% or so to the input current compared with the ideal calculation. Then there is usually a significant cover-your-butt margin in specifying input current. The manufacturer doesn't want to get caught with the supply ever drawing more than they said it would. Since most customers aren't going to care that the supply might draw 1.5 A instead of 1 A, the manufacturer gives the more conservative worst-case value.
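Putting the pieces together, a back-of-the-envelope worst-case estimate might look like the sketch below. The 85% efficiency, 0.6 power factor, and 30% guard band are illustrative guesses, not numbers from your datasheet, but they show how an ideal 0.42 A calculation can turn into a 1.5 A nameplate rating:

    # Rough worst-case input current estimate (all figures illustrative)
    p_out      = 100.0   # W, full output power (5 V x 20 A)
    v_in_min   = 170.0   # V, minimum rated input voltage
    efficiency = 0.85    # assumed full-load efficiency at low line
    pf         = 0.6     # assumed power factor for a supply without PFC

    p_in  = p_out / efficiency      # ~118 W of real input power
    i_rms = p_in / (v_in_min * pf)  # ~1.15 A RMS input current
    margin = 1.3                    # manufacturer's guard band (a guess)
    print(f"Worst-case estimate: {i_rms:.2f} A; with margin: {i_rms * margin:.2f} A")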