Three questions about the use of a transformer in a switching power supply

mains, power-supply, switch-mode-power-supply, switches, transformer

In this document (an introduction to switching power supplies), a boost-mode switching power supply is presented: it consists of a simple inductor, diode, capacitor, and load. Then (page 9) it is specified that this device can be used only for input voltages not greater than 42.5 V, because there is no physical isolation between the input voltage and the load.

1st question: why would greater voltages require isolation? That is: why are flyback converters used instead of buck-boost converters in such cases?

The solution is to use a transformer instead of the inductor. Suppose that the input voltage \$ V_{in} \$ is the rectified 220 V AC mains, i.e. a DC voltage of \$ \simeq 220 \sqrt{2} \approx 311 \ \mathrm{V} \$.

2nd question: if the input voltage is so high, why can the transformer in a switching power supply be smaller than the transformer of a traditional (non-switching) supply (which first steps down and then rectifies the 220 V AC signal)?

The output of the switching power supply depends on the duty cycle of the switch, according to equation 1 on page 6 of the aforementioned document. I know (that is: I have heard) that it would be difficult to obtain a precise duty cycle with a 311 V DC input (the result of rectifying the 220 V AC).

3rd question: why should it be difficult to obtain such a precise duty cycle? Is this an important reason to insert a transformer that drops the voltage before the LC filter?
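A rough way to see the third point is the sketch below, assuming the ideal buck-boost relation \$ V_{out} = V_{in} \frac{D}{1-D} \$ (the document's equation 1 may differ) and a hypothetical 200 kHz switching frequency: stepping 311 V directly down to a low output forces a tiny, hard-to-trim duty cycle.

```python
# Sketch (assumption): duty cycle needed to step 311 V down to 5 V
# with an ideal, non-isolated buck-boost converter.
# Assumes Vout = Vin * D / (1 - D); the document's equation 1 may differ.

V_in = 311.0   # V, rectified 220 V AC mains (220 * sqrt(2))
V_out = 5.0    # V, desired output (hypothetical)
f_sw = 200e3   # Hz, hypothetical switching frequency

D = V_out / (V_in + V_out)   # solve Vout = Vin * D / (1 - D) for D
t_on = D / f_sw              # switch on-time per cycle

print(f"duty cycle D = {D:.4f}")           # -> 0.0158
print(f"on-time = {t_on * 1e9:.0f} ns")    # -> 79 ns of a 5000 ns period
```

With a transformer of, say, 20:1 in a flyback, the effective input seen by the duty-cycle equation drops to about 15.5 V and D moves back toward 0.24, which is far easier to generate and regulate accurately.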

Best Answer

if the input voltage is so high, why can the transformer in a switching power supply be smaller than the transformer of a traditional (non-switching) supply (which first steps down and then rectifies the 220 V AC signal)?

Because the operating frequency is so much higher than 50 or 60 Hz. With a higher operating frequency, the inductance of the primary can be proportionately smaller, requiring fewer turns and a smaller core size, because core saturation is driven by ampere-turns. It's not unusual for a switcher to run at 200 kHz, i.e. 4000 times higher than 50 Hz.
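As a minimal numeric sketch of that proportionality (the 230 V and 73 mA figures are illustrative round numbers, anticipating the example further down): for a fixed primary voltage and allowed magnetizing current, \$ L = \frac{V}{2 \pi f I} \$ scales as \$ 1/f \$.

```python
import math

# Sketch with illustrative round numbers (not from the document):
# required primary inductance for a given magnetizing current,
# from I_mag = V / (2 * pi * f * L)  =>  L = V / (2 * pi * f * I_mag).

V = 230.0      # V, primary voltage
I_mag = 0.073  # A, allowed magnetizing current

for f in (50.0, 200e3):
    L = V / (2 * math.pi * f * I_mag)
    print(f"f = {f:>8.0f} Hz -> L = {L:.3g} H")

# f =       50 Hz -> L = 10 H
# f =   200000 Hz -> L = 0.00251 H   (about 2.5 mH, 4000x smaller)
```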

It's all about core saturation, even for a laminated mains transformer - the magnetizing current, i.e. the current drawn by the primary when the secondary is unloaded, is the current that saturates the core. The magnetizing inductance of the primary is proportional to the primary turns squared, so if you double the turns you quadruple the inductance and quarter the magnetizing current. So the current has gone down by 4 and the turns have doubled but, importantly, the ampere-turns have halved.
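A quick numeric check of that scaling, using illustrative values only:

```python
import math

# Illustrative check: magnetizing current and ampere-turns vs. primary
# turns N. L is proportional to N^2 (a constant per-turn A_L factor is
# assumed), and I_mag = V / (2 * pi * f * L).

V, f = 230.0, 50.0     # V, Hz (mains example)
A_L = 10.0 / 1000**2   # H/turn^2, chosen so N = 1000 gives L = 10 H

for N in (1000, 2000):
    L = A_L * N**2
    I = V / (2 * math.pi * f * L)
    print(f"N = {N}: L = {L:4.0f} H, I_mag = {I * 1e3:4.1f} mA, N*I = {N * I:4.1f} At")

# N = 1000: L =   10 H, I_mag = 73.2 mA, N*I = 73.2 At
# N = 2000: L =   40 H, I_mag = 18.3 mA, N*I = 36.6 At
```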

(Generality alert) This is why a 230 V mains transformer needs about 1000 turns on the primary - it needs to maintain a primary inductance of about 10 H. This then limits the magnetizing current to about 73 mA, and the ampere-turns will be about 73 At. For a given core size, the mean length of the magnetic path might be 300 mm, so the magnetic field strength H will be about 244 At/m. At this field strength, iron, ferrite and other such transformer core materials are seeing a peak flux density that is on the cusp of core saturation. A bigger core means less saturation. More turns means less saturation. Higher frequency means less saturation. Weigh this against the downsides: more turns means more cost and more copper losses; a bigger core means a bigger product and more cost; a higher frequency means a smaller core, fewer turns and lower cost, BUT the cost of the control electronics needed to run at this higher frequency has to be considered.
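Putting those round numbers together, the field strength is just the ampere-turns divided by the mean magnetic path length:

\$\$ H = \frac{N I}{\ell} \approx \frac{73.2 \ \mathrm{At}}{0.3 \ \mathrm{m}} \approx 244 \ \mathrm{At/m} \$\$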

why would greater voltages require isolation? That is: why are flyback converters used instead of buck-boost converters in such cases?

It's a legislative safety requirement, as mentioned on page 8 of the document.