How to determine the maximum possible current in a transformer

power-supply, transformer

I am curious how far I can go if I want to rework a small 12 V / 0.1 A DC power supply to deliver higher current without changing the transformer inside it.

I understand that I have to upgrade the electronic part (a different linear voltage regulator, or maybe get rid of it completely to get the maximum possible output). What I don't understand is what the limit of the transformer is. How can I find that limit? Should I expect that, as a soft source, it will simply show a significant voltage drop under heavier load and nothing bad will happen? Or will the transformer dangerously overheat under heavier load? Is it appropriate to add a thermal fuse (e.g. 115 °C) to protect it from that overheating? You see, I don't want to buy a new transformer, I just want to try to use the existing one (which has no datasheet).

This question is probably more theoretical than practical; I just want to make things clear for myself and don't actually need a new power supply. For example, if I tried to push the same transformer to deliver 0.2 A instead of 0.1 A (twice the current it was designed for), what result can I expect? Since it is wound with thin wire, I expect it will either give a lower voltage and nothing else will happen, or it will dangerously overheat.

I know that this question is similar to this one: Finding maximum current experimentally
But I think they aren't the same, just similar. My question is aimed specifically at transformers.

Best Answer

AFAIK transformers are designed to withstand roughly a 180% current overload for short periods. (They can be overloaded this way for something like 15 minutes every 24 hours or so.) But in the long run, loading it with 200% of rated current will most likely cause serious overheating and eventually damage. Maybe a good cooling system (in your case probably water or oil cooling, which can be way more expensive than a new transformer) could provide the necessary cooling to let the trafo carry that kind of current.
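
To see why the overload heats the windings so quickly: the copper (winding) loss grows with the square of the current. As a rough illustration (ignoring core losses, which stay roughly constant, and writing \$R_{winding}\$ for the winding resistance):

\$P_{Cu}=I^2 R_{winding}\$

so running at twice the rated current means roughly \$2^2=4\$ times the copper loss, i.e. about four times the heat the windings were designed to get rid of.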

Additional info: if your supply delivers 0.1 A DC as you say, the AC current (RMS value) drawn from the trafo is not the same as the DC current! In general, assuming the trafo's output current is not distorted (meaning it keeps its beautiful sine shape, which is not necessarily true), for a full-wave diode rectifier you have (from Mohan, Power Electronics):

\$I_{out(DC)}=\frac{2\sqrt{2}}{\pi}I_{AC(RMS)}\approx 0.9\,I_{AC(RMS)}\$

meaning the AC current will actually be about 0.22 A if you want 0.2 A DC. But since the trafo would be seriously overloaded, I'm sure the current would be strongly distorted due to saturation of the trafo's iron core. At the end of the day, this leads to extra heat generation and other odd behavior.
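
If it helps, here is a minimal Python sketch of that arithmetic (the helper name is mine, and it assumes the same idealisation as the formula above: an ideal full-wave rectifier fed with an undistorted sine current):

```python
from math import pi, sqrt

# Ideal full-wave rectifier, undistorted sine current:
#   I_dc = (2*sqrt(2)/pi) * I_ac_rms  ~=  0.9 * I_ac_rms
def ac_rms_for_dc(i_dc):
    return i_dc * pi / (2 * sqrt(2))

i_ac_rated  = ac_rms_for_dc(0.1)   # AC side of the original 0.1 A DC rating, ~0.11 A
i_ac_wanted = ac_rms_for_dc(0.2)   # AC side of the desired 0.2 A DC, ~0.22 A

overload   = i_ac_wanted / i_ac_rated   # 2x the rated current
extra_heat = overload ** 2              # copper losses scale with I^2 -> ~4x

print(f"rated AC: {i_ac_rated:.2f} A, needed AC: {i_ac_wanted:.2f} A")
print(f"overload: {overload:.1f}x, copper loss: ~{extra_heat:.0f}x nominal")
```

This only quantifies the idealised case; with a saturating core and distorted current the real overload (and heating) would be worse.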

All in all I don't really think it's a very good idea.

The above calculation applies only if you have a line-frequency transformer (50 or 60 Hz) followed by a rectifier. If the circuit contains a high-frequency transformer instead, the situation can be entirely different. If that is the case, please edit your question so we can adjust the answer.