Electronics – How to dimension a power supply transformer

power-supply, transformer

I always hesitate when dimensioning a transformer for a DC power supply, and I guess I sometimes overdimension them. My confusion stems from the difference between the transformer's rated AC voltage and the DC output voltage.
If I need 10V DC @ 1A, will a 10VA transformer do? My gut feeling says I need more, so I use a bigger transformer, but how much bigger does it have to be?

Best Answer

First, 10VA is not enough if you ultimately want 10V DC at 1A out. Theoretically the transformer is capable of putting out 10W, but only at a power factor of 1. If you only needed to heat a resistor with this transformer, then its spec would be just on the edge. A rectifier feeding a storage cap draws its current in short peaks, so its power factor is well below 1 and 10VA won't cut it. Then any good engineer will add some margin anyway.
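To put rough numbers on that, a common rule of thumb for capacitor-input rectifiers is to rate the transformer at roughly 1.6 to 1.8 times the DC output power. The sketch below just applies that factor; treat it as a rough estimate, since the real factor depends on the storage cap and the transformer's series impedance.

```python
# Rough transformer VA estimate for a capacitor-input rectifier.
# The 1.8x factor is a common rule of thumb, not an exact figure.
dc_volts = 10.0                 # desired DC output voltage
dc_amps = 1.0                   # full load current
dc_power = dc_volts * dc_amps   # 10 W delivered to the load

va_factor = 1.8                 # assumed rule-of-thumb multiplier
min_va = dc_power * va_factor

print(f"DC output power:            {dc_power:.0f} W")
print(f"Rough minimum transformer:  {min_va:.0f} VA")
```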

Second, even 15VA or 20VA is not enough to know the transformer can do what you want. You need a specific output voltage that can deliver the 10W, not just any combination of voltage and current that multiplies out to 10W.

Since you seem to be asking about a power line transformer, I'm guessing that you intend to only put a full wave bridge, cap, and maybe a linear regulator on the output. You need the peaks of the AC waveform after the full wave bridge to be a few volts above the target output voltage. This gives room for droop at high load and for the linear regulator to do its job. Figure the full wave bridge will drop 1.5 volts under the full 1A load and maybe 2V for the linear regulator. From this alone the AC peaks need to be at least 10V + 1.5V + 2V = 13.5V.

Accounting for droop under high load is more tricky. In theory, the transformer output voltage rating is under full load, but it is often not specified for the worst case line voltage input. This is where you have to look at the transformer datasheet carefully. Then there will be a voltage drop between peaks as the current is drawn from the storage cap instead of directly from the transformer. So far we need a minimum of 13.5V / sqrt(2) = 9.5V AC sine out before accounting for the drop due to low line voltage and droop between line cycles. It sounds like a 12V transformer is probably the minimum, assuming a reasonably sized storage cap.
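Here is that headroom budget as a small Python sketch. The 1.5V bridge drop and 2V regulator dropout are the figures assumed above, not universal values:

```python
from math import sqrt

v_out = 10.0       # regulated DC output, V
v_bridge = 1.5     # full wave bridge drop at 1 A (two diodes conducting)
v_reg = 2.0        # assumed linear regulator dropout, V

# The rectified peaks must clear the output plus both drops.
v_peak_min = v_out + v_bridge + v_reg        # 13.5 V

# Equivalent RMS sine the transformer must put out, before allowing
# for low line voltage and droop between peaks.
v_rms_min = v_peak_min / sqrt(2)             # about 9.5 V

print(f"Minimum rectified peak:  {v_peak_min:.1f} V")
print(f"Minimum transformer RMS: {v_rms_min:.1f} V")
```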

For 60 Hz power line frequency, the storage cap will be charged up at a 120 Hz rate, or every 8.3 ms. Let's say we've budgeted for 2V droop at the full output current of 1A. That means the minimum storage cap is 1A * 8.3ms / 2V = 4.2mF. That's quite a lot, but doable. You can go with that or start with a higher voltage to allow more droop, which would allow for a smaller cap.
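In other words, between charging peaks the load runs entirely off the cap, so the minimum capacitance is C = I × Δt / ΔV. Here is that calculation with the numbers above; the 2V droop budget is an assumption you can trade against transformer voltage:

```python
i_load = 1.0              # full load current, A
t_peaks = 1 / (2 * 60)    # time between charging peaks: 8.3 ms at 60 Hz
v_droop = 2.0             # allowed sag between peaks, V

c_min = i_load * t_peaks / v_droop                     # farads
print(f"Minimum storage cap: {c_min * 1000:.1f} mF")   # about 4.2 mF
```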

So to make a concrete recommendation, something like a 12V 1.5A transformer will likely do it with a big enough storage cap. Keep in mind this kind of power supply will be rather inefficient. The full wave bridge alone will dissipate about 1.5W, and the linear regulator more.
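To see roughly where the power goes with a 12V transformer, here is an illustrative sketch. The post-bridge peak and average cap voltage are rough assumptions that depend on the actual transformer regulation and cap size:

```python
from math import sqrt

i_load = 1.0            # full load current, A
v_out = 10.0            # regulated output, V
v_xfmr_rms = 12.0       # assumed transformer rating, V RMS
v_bridge = 1.5          # bridge drop at full load, V
v_droop = 2.0           # allowed droop between peaks, V

v_peak = v_xfmr_rms * sqrt(2) - v_bridge   # roughly 15.5 V after the bridge
v_cap_avg = v_peak - v_droop / 2           # roughly 14.5 V average on the cap

p_load = v_out * i_load                    # 10 W useful output
p_bridge = v_bridge * i_load               # about 1.5 W in the bridge
p_reg = (v_cap_avg - v_out) * i_load       # about 4.5 W in the linear regulator

eff = p_load / (p_load + p_bridge + p_reg)
print(f"Regulator dissipation: {p_reg:.1f} W")
print(f"Efficiency (excluding transformer losses): {eff:.0%}")
```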

The above tradeoffs are good reasons you don't see direct power line transformers with "dumb" rectifier and linear regulator supplies much anymore. Even in North America, the power line is only 60 Hz, so the transformer will be big, heavy, and expensive, and the result rather inefficient. Nowadays you put the full wave bridge directly on the AC line, then chop that at high frequency through a much smaller transformer to make the low voltage on the isolated side. Opto feedback to the chopper can allow the final output voltage to be regulated. This is much more efficient and can use a smaller, cheaper, and lighter transformer because it will be operating at hundreds of kHz. This is exactly what switching wall-wart type power supplies do.