Electronics – Transformer overheating reasons

ac-dc flyback transformer

I am currently designing a flyback-based offline AC/DC converter similar to the following schematic (with a few changes, e.g. a synchronous rectifier at the output). Everything works nicely, but the custom-made transformer runs somewhat hot (~75°C) at full load (12V – 39W). Since this device will be used in an enclosure where convective cooling is rather limited and the external ambient temperature can reach up to 60°C, the goal is to keep the temperature rise as low as possible.

After double checking, the temperature rise is indeed coming from the transformer itself rather than the peripheral components.

I am aware that a transformer can overheat if it is over-driven / saturated; however, that does not seem to be the case here.

Transformer main parameters:

  • \$L_P=320\mu H\$
  • \$L_{P(leak)}\approx 6\mu H\$
  • \$N_{PS}=6.6\$
  • \$R_{P}=414m\Omega\$
  • \$R_{S}=13m\Omega\$
  • \$I_{OUT}=4A\$

System parameters:

  • Supply Voltage: \$90VAC-264VAC\$
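As a sanity check on the figures above, the DC copper loss can be estimated from the winding resistances and rough RMS currents. The sketch below assumes fully triangular (DCM) current pulses, an efficiency of 88%, and a duty cycle of 0.45 at low line; none of these values are given in the question, they are only plausible placeholders.

```python
import math

# Transformer and system figures from the question
P_OUT, V_OUT = 39.0, 12.0      # output power (W) and voltage (V)
R_P, R_S = 0.414, 0.013        # primary / secondary winding resistance, ohms
N_PS = 6.6                     # primary-to-secondary turns ratio

# Assumptions (NOT from the question): efficiency, duty at low line,
# triangular (DCM) current pulses, rectified bus with ~10% ripple.
ETA = 0.88
D = 0.45                                # assumed duty cycle at 90 VAC
V_IN = 90 * math.sqrt(2) * 0.9          # rough rectified bus, ~115 V

P_IN = P_OUT / ETA
I_PRI_AVG = P_IN / V_IN                 # average input current
I_PRI_PK = 2 * I_PRI_AVG / D            # peak of the triangular pulse
I_PRI_RMS = I_PRI_PK * math.sqrt(D / 3)

I_SEC_PK = I_PRI_PK * N_PS              # peak reflected to the secondary
I_SEC_RMS = I_SEC_PK * math.sqrt((1 - D) / 3)

P_CU = I_PRI_RMS**2 * R_P + I_SEC_RMS**2 * R_S
print(f"Primary RMS ~ {I_PRI_RMS:.2f} A, secondary RMS ~ {I_SEC_RMS:.2f} A")
print(f"Estimated DC copper loss ~ {P_CU:.2f} W")
```

Under these assumptions the DC copper loss comes out around half a watt, which by itself would not explain a 75°C transformer; AC resistance (skin/proximity effect) and core loss would have to account for the rest.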

The questions I have are the following:

  1. Would you say that the above temperature is really excessive for this operating condition (12V @ 39W)?
  2. Are there any other possible reasons why it would heat up considerably?
  3. Provided that the peripheral components (e.g. the RCD snubber) are kept cool, can they affect the transformer's thermal behavior at all?
  4. What mechanism could cause the losses in a transformer to increase at a higher supply voltage? I was expecting them to be larger at a lower supply voltage, due to the higher primary current.

Since the question revolves mostly around the reasons why a transformer in this topology would heat up, I have left the remaining circuit details out.

Circuit

EDIT#1

Below are two measurements at \$90VAC\$ and \$264VAC\$ respectively, showing both the \$V_{DS}\$ (orange) of the primary-side MOSFET and its current-sense voltage drop \$V_{CS}=I_{CS}\cdot 200m\Omega\$ (blue).

Due to technical limitations, the full switching-amplitude range could not be displayed on the oscilloscope for the higher supply voltage.

measurement_1 measurement_2

I appreciate any feedback.

Thanks.

Best Answer

What mechanism could cause the losses in a transformer to increase at a higher supply voltage? I was expecting them to be larger at a lower supply voltage, due to the higher primary current.

The main mechanism is transformer core saturation: as the supply voltage rises, the losses in the core can rapidly become the dominant contribution to the total power loss.

What you should do is prove that at higher AC supply voltages, but with a light load of say 100 mA, the temperature rise is nearly the same. That would tell you the problem is likely core loss, specifically hysteresis loss, and if it is too high then a bigger core would be needed.
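The steep voltage dependence of hysteresis loss follows from the Steinmetz equation, \$P_v = k \cdot f^{\alpha} \cdot \Delta B^{\beta}\$, where the flux swing \$\Delta B\$ scales with the volt-seconds applied to the primary. The sketch below uses illustrative ferrite coefficients and a hypothetical 65 kHz switching frequency; none of these values come from the actual core in the question.

```python
# Steinmetz scaling sketch: why core loss can dominate at high line.
# K, ALPHA, BETA are illustrative ferrite-class values, not data for
# the core in the question.
K, ALPHA, BETA = 0.0003, 1.6, 2.5
F_SW = 65e3                         # assumed switching frequency, Hz

def core_loss(delta_b):
    """Relative volumetric core loss for a peak-to-peak flux swing
    delta_b (tesla); absolute units depend on the material's K."""
    return K * F_SW**ALPHA * delta_b**BETA

# If the flux swing grows only 30% between low and high line,
# hysteresis loss grows by 1.3**2.5, i.e. nearly doubles.
low, high = core_loss(0.10), core_loss(0.13)
print(f"Loss ratio, high line vs. low line: {high / low:.2f}")
```

Because the exponent \$\beta\$ is typically 2.5–3 for power ferrites, even a modest increase in flux swing at high line produces a disproportionate jump in core loss, which is why the light-load test above discriminates so well between core loss and copper loss.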

I am aware that a transformer can overheat if it is over-driven / saturated; however, that does not seem to be the case here.

If it does not get excessively warm when the supply voltage is at the design maximum, then it isn't a core-loss problem but an \$I^2R\$ (copper-loss) problem. You do need to be absolutely sure that the MOSFET isn't producing the heat, though. Ditto the snubber circuit (not shown on the circuit in the question).