Electronic – Computer power supplies usually have higher efficiency on 230V than on 115V. Why

ac-dc, efficiency, power-supply

Example

As seen here in the example, this particular power supply (and most others too) has higher efficiency when running on 230V. Given that computer power supplies are usually required to output a combination of 12V, 5V, and 3.3V DC, why is it that stepping down from a higher AC voltage is more efficient? It seems counter-intuitive.

Also, is this a result intrinsic to the process of converting AC to DC, or is it a compromise that manufacturers settle for to maintain compatibility? In other words, if someone were to build a power supply that only works on 115V, would it be more difficult to achieve the same efficiency as one built only for 230V?

Best Answer

By the law \$P = U \cdot I\$, to achieve the same power at a lower voltage, you need to increase the current.
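As a quick illustration of that relationship (the 500 W draw is an assumed example figure, not from the question):

```python
# Same power drawn at the two mains voltages => the current roughly doubles
# at 115 V compared to 230 V. P = U * I, so I = P / U.
P = 500.0  # watts drawn from the wall (assumed example value)

for U in (115.0, 230.0):
    I = P / U
    print(f"{U:>3.0f} V -> {I:.2f} A")
# -> 115 V -> 4.35 A
# -> 230 V -> 2.17 A
```

Halving the input voltage doubles the input current for the same power, and that doubled current is what drives up every loss term below.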

In resistive components such as wires, PCB traces, and transformer windings (green), losses increase with the square of the current: \$P_{loss} = R \cdot I^2\$.
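Because the current doubles at half the voltage, these resistive losses quadruple. A minimal sketch, with an assumed series resistance of 0.1 Ω and the same assumed 500 W draw:

```python
# I^2 * R loss at each mains voltage; R and P are assumed example values.
P = 500.0  # watts
R = 0.1    # ohms of total series resistance (assumed)

loss_115 = R * (P / 115.0) ** 2
loss_230 = R * (P / 230.0) ** 2
print(loss_115 / loss_230)  # -> 4.0, i.e. exactly (230 / 115)^2
```

The 4x ratio is independent of the particular R and P chosen; it depends only on the voltage ratio.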

In switching components and other diodes/rectifiers (green), the losses equal \$P_{loss} = V_{drop} \cdot I\$, where \$V_{drop}\$ is a fixed property of the component regardless of the input voltage, roughly 1V for a rectifier.
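Since the drop is fixed, this conduction loss scales linearly with current and is therefore halved at 230V. A sketch under the same assumed 500 W draw and a ~1 V drop:

```python
# Conduction loss across a fixed ~1 V rectifier drop (values assumed).
P = 500.0       # watts drawn (assumed example)
V_drop = 1.0    # volts, fixed by the component

for U in (115.0, 230.0):
    I = P / U
    print(f"{U:.0f} V input: loss ~ {V_drop * I:.2f} W")
# -> 115 V input: loss ~ 4.35 W
# -> 230 V input: loss ~ 2.17 W
```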

Eddy-current losses (red) will also increase in any core as the current (and thus the electromagnetic field) increases.

[Image: chart of the loss contributions, color-coded as above]

Losses related to capacitor leakage are negligible.