Electronic – theoretical limit to power supply transformation efficiency

current-limiting, efficiency, power supply, theory, voltage-regulator

As I'm looking at some resistive-load applications, I'm wondering whether there are any rules that put an upper limit on supply-matching efficiency. If a load isn't matched to the power supply in terms of voltage, current, and waveform (typically AC vs. DC), then we have to add all sorts of components, and depending on how mismatched the supply and the load are, we may end up cooking off a lot of waste power to bring them into alignment. To what extent is this just for design simplicity, and to what extent is it unavoidable?

  1. Is there some minimum waste-power cost involved in stepping voltage up or down? My understanding is that in the AC realm it is negligible even in practice with transformers, as it is in the DC realm with multipliers. But are these limited to cases where the input and output are simple multiples, and does it get more expensive in terms of waste power to hit a specific voltage that isn't an integer (AC) or binary (DC) multiple of the input?

  2. Current seems to be an efficiency killer. I.e., once the current in a circuit becomes significant, it seems like there's no way to control it without generating a lot of waste. What theoretical constraints are there on current-throttling efficiency, and on what variables do they depend?

Best Answer

There is no theoretical limit other than the laws of thermodynamics. Efficiency cannot be 100%.

Switching supplies designed to a reasonable cost typically have an efficiency of >80% over a range of outputs and inputs. Efficiency usually falls as the output load is reduced, because there is some fixed overhead in the supply, and it may fall a bit at the highest rated current as I^2R and other losses mount.
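
As a rough illustration of that shape, here is a minimal sketch that models a switching supply as a fixed overhead loss plus an I^2R conduction loss. The specific numbers (5 V output, 0.3 W overhead, 50 mΩ lumped resistance) are made up for the example, not taken from any particular supply:

```python
# Minimal sketch (hypothetical numbers): switching-supply efficiency modeled as a
# fixed overhead loss plus an I^2*R conduction loss.

def switcher_efficiency(p_out_w, v_out=5.0, p_overhead_w=0.3, r_loss_ohm=0.05):
    """Estimated efficiency at a given output power.

    p_overhead_w: fixed losses (control circuitry, gate drive, core losses).
    r_loss_ohm:   lumped series resistance (switches, inductor, traces).
    """
    i_out = p_out_w / v_out
    p_loss = p_overhead_w + i_out**2 * r_loss_ohm
    return p_out_w / (p_out_w + p_loss)

for p in (0.5, 2, 10, 25):
    print(f"{p:5.1f} W out -> {switcher_efficiency(p):.1%}")
```

With these assumed values the efficiency is poor at light load (the overhead dominates), peaks in the middle of the range, and sags slightly at maximum load as the conduction loss grows with the square of the current.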

Transformers can be very low loss (especially if they are designed for low losses), but that is not guaranteed - in fact, relatively recent State of California legislation essentially outlaws the sale of many cheap transformer-based AC adapters because their losses are higher than those of switchmode types, especially at low loads. Some run warm to the touch even with no load. As engineers we have some control over this: by specifying better (and more expensive) laminations the transformer can have dramatically lower core losses, but that is unsuitable for a product that must sell for a very low price.

In your earlier question you were asking about linear supplies, which generally (but not always) have low efficiency. Because an ideal linear regulator passes the same current at its input and output, its efficiency is simply Vout/Vin. A perfect linear regulator supplying 1A at 1V from a 12V source therefore has an efficiency of 1W/12W = 8.3%, far lower than 80%. On the other hand, a perfect LDO (low-dropout linear regulator) supplying clean 5.0V from a noisy 5.6V source has an efficiency of 5.0V/5.6V ≈ 89%, possibly better than a switching supply.
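
A short sketch of that arithmetic, using the same two operating points quoted above (ideal regulator, quiescent current ignored):

```python
# Minimal sketch: ideal linear regulator efficiency. The input current equals the
# output current (quiescent current ignored), so efficiency reduces to Vout / Vin.

def linear_efficiency(v_in, v_out, i_out):
    p_out = v_out * i_out   # power delivered to the load
    p_in = v_in * i_out     # same current drawn from the source
    return p_out / p_in     # == v_out / v_in, independent of the current

print(f"{linear_efficiency(12.0, 1.0, 1.0):.1%}")   # ~8.3%  (1 V out from 12 V in)
print(f"{linear_efficiency(5.6, 5.0, 1.0):.1%}")    # ~89.3% (5.0 V out from 5.6 V in)
```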

Current in itself is not a problem - supply currents in modern PC motherboards are probably in the 100A range (the power of the CPU alone might be 120W+), and the on-board supplies themselves do not require heat sinks.
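
To see why high current is tolerable, note that the conduction loss is I^2R, so what matters is keeping the resistive path very small. A minimal sketch with hypothetical resistance values (not measured from any real board):

```python
# Minimal sketch (hypothetical resistances): conduction loss P = I^2 * R at 100 A.
# High current is manageable only because the resistive path is kept extremely small.

i = 100.0  # amps, roughly the CPU supply current mentioned above

for r_milliohm in (10, 1, 0.1):
    p_loss = i**2 * (r_milliohm / 1000.0)  # watts dissipated in the path
    print(f"R = {r_milliohm:4} mohm -> {p_loss:6.1f} W lost")
```

At 10 mΩ the path would burn 100 W, but at fractions of a milliohm the loss drops to a watt or so, which is why low-resistance switches and wide copper planes keep such supplies cool.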