Electronic – Transformer voltage rating vs open-circuit voltage: what’s the typical load assumed for the voltage rating

transformer

This appears to have been asked in some way before (at Transformer: loaded vs open-circuited vs short-circuited) but both the question and the answers there are rather vague to me, so I'll try to be more explicit.

As anyone who has toyed with a small transformer might have noticed, the output/secondary voltage in "open circuit" (actually seeing the megaohm-level resistance of the DMM/voltmeter as load) can be much higher than when the transformer is under significant load. For example, I get 8V on a "5V" transformer (it's built on an EI-30 core, so you get an idea of its approximate power rating; somewhere around 3VA). So, my question is: what's the load typically used to rate the "faceplate" output voltage of such transformers? (I assume this might be specified somewhere in IEC and/or US transformer standards; if you know more precisely please let us know.)

Best Answer

The nominal output voltage is specified at nominal input voltage and the full rated (resistive) load.

In other words, a 12V 300mA transformer should have 12V RMS output with nominal input and a 300mA resistive load. For loads less than the rated load, the voltage, of course, will be higher.
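To illustrate, the secondary can be roughly modeled as an ideal source behind an effective series resistance, so the output voltage rises linearly as the load current drops. This is a simplified linear sketch; the 14V open-circuit figure below is hypothetical, not from any datasheet.

```python
# Simplified linear model of a 12V 300mA transformer secondary:
# an ideal source (open-circuit voltage) behind an effective
# series resistance. Values are illustrative only.

V_RATED = 12.0    # rated RMS output voltage at full load (V)
I_RATED = 0.300   # rated load current (A)
V_OPEN = 14.0     # hypothetical open-circuit voltage (V)

# Series resistance implied by the drop from open circuit to full load
r_series = (V_OPEN - V_RATED) / I_RATED  # ohms

def output_voltage(i_load):
    """Approximate RMS output voltage at load current i_load (A)."""
    return V_OPEN - i_load * r_series

print(output_voltage(0.0))    # open circuit: 14.0 V
print(output_voltage(0.300))  # full rated load: 12.0 V
print(output_voltage(0.150))  # half load: 13.0 V
```

At half load the model predicts 13V, i.e. about 8% above the nameplate 12V, which matches the observation that lightly loaded small transformers read noticeably high.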

Edit:

The regulation of a transformer is typically defined as:

Regulation(%) = \$ \frac {V_{open} - V_{full.load}}{V_{full.load}} \times 100\% \$

Large, high-power transformers might have regulation of a few percent; cheap small transformers can be 5 to 10 times worse.
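Plugging in the numbers from the question (about 8V open circuit on a transformer rated 5V at full load) gives a sense of the scale:

```python
# Regulation computed from the question's measurement:
# ~8 V open circuit on a "5 V" (full-load rating) transformer.
v_open = 8.0        # measured open-circuit voltage (V)
v_full_load = 5.0   # rated full-load voltage (V)

regulation_pct = (v_open - v_full_load) / v_full_load * 100
print(regulation_pct)  # 60.0
```

A regulation of 60% is very poor by large-transformer standards, but not surprising for a tiny ~3VA unit, consistent with small transformers being many times worse than large ones.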