Effect of decreasing transformer input voltage on output voltage

pic, power, transformer, voltage-regulator

I have designed a device controlled by a PIC microcontroller. The micro runs from +5 V. I use a transformer to step the 220 V mains down to 15 V, and a 7805 regulator then generates the +5 V used by the micro and the other components. Now I want to know what happens if the input voltage drops very low. For example, in a bad situation the 220 V mains might accidentally sag far below normal and then come back up.

If the input voltage drops, the transformer's output voltage drops with it, and under high load the +5 V rail will no longer be generated. This affects the micro, and the micro will turn off.

That is a very bad situation for my device. How can it be protected? Is using a transformer the right approach? Is there another solution for protecting against a drop in the input voltage?

Best Answer

The output voltage of the transformer will be proportional to the input voltage. If it is 15 VAC out at 220 VAC in, then it will be 13.6 VAC out at 200 VAC in, for example.

You seem to have a lot of headroom, so you should be able to tolerate significant line voltage sag. In fact, you seem to have so much that your supply is quite inefficient. At 15 V out, the peaks of the waves will be 21.2 V. Assuming a full wave diode bridge, you lose about 1.4 V, leaving 19.8 V at the peaks. This is the level the capacitor after the full wave bridge gets charged to twice per line cycle, which is every 10 ms for 50 Hz power.
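Here is a minimal sketch of that forward arithmetic in Python (the 15 V secondary, 220 V nominal line, and 1.4 V bridge drop are the assumed values from above, not measurements):

```python
import math

V_LINE_NOM = 220.0   # nominal mains voltage, VAC RMS (assumed)
V_SEC_NOM = 15.0     # transformer secondary at nominal mains, VAC RMS (assumed)
V_BRIDGE_DROP = 1.4  # two diode drops in a full wave bridge, V (assumed)

def cap_peak_voltage(v_line):
    """Peak voltage on the filter cap for a given mains RMS voltage."""
    v_sec = V_SEC_NOM * v_line / V_LINE_NOM  # secondary tracks the line proportionally
    return v_sec * math.sqrt(2) - V_BRIDGE_DROP

print(cap_peak_voltage(220.0))  # about 19.8 V
print(cap_peak_voltage(200.0))  # about 17.9 V
```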

The 7805 regulator needs about 7.5 V in to make a reliable 5 V out. There is a lot of room between the 7.5 V minimum the regulator needs and the 19.8 V peaks the cap gets charged to. The cap voltage will drop between the peaks depending on the cap size and the current. Usually things are sized for a few volts of drop. For example, let's say you have a 1 mF cap and the current draw is 100 mA. That would only sag 1 V over 10 ms, which brings the lowest voltage into the 7805 down to 18.8 V.
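That sag estimate is just ΔV = I·Δt/C. A quick check with the same assumed values:

```python
C = 1e-3          # filter capacitance, F (1 mF, assumed)
I_LOAD = 0.1      # load current, A (100 mA, assumed)
T_BETWEEN = 0.01  # time between charging peaks, s (10 ms for full wave at 50 Hz)

# Treat the load as a roughly constant current drain between peaks.
sag = I_LOAD * T_BETWEEN / C
print(sag)         # 1.0 V of ripple
print(19.8 - sag)  # about 18.8 V, the lowest voltage into the 7805
```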

We can work this backwards and see what the minimum necessary line voltage is in this case. You need 7.5 V into the regulator minimum. The sag between peaks is still the same because it is only a function of the current draw and the capacitance. That means the cap needs to be charged to 8.5 V at the peaks, which requires 9.9 V AC peaks before the full wave bridge, which is 7 V RMS. That is 47% of the output at 220 V, so your minimum input limit is 103 V.
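Working the same numbers backwards, as a sketch:

```python
import math

V_REG_MIN = 7.5      # minimum input for reliable 5 V from the 7805, V
RIPPLE_SAG = 1.0     # sag between peaks from the estimate above, V
V_BRIDGE_DROP = 1.4  # two diode drops in the bridge, V (assumed)

v_cap_peak_min = V_REG_MIN + RIPPLE_SAG          # 8.5 V needed at the peaks
v_sec_peak_min = v_cap_peak_min + V_BRIDGE_DROP  # 9.9 V AC peak before the bridge
v_sec_rms_min = v_sec_peak_min / math.sqrt(2)    # about 7.0 VAC RMS
v_line_min = 220.0 * v_sec_rms_min / 15.0        # about 103 VAC minimum line

print(v_line_min)
```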

Of course you should always leave some margin because stuff happens, but any vaguely reasonable "220 V power" line isn't going to sag as low as 103 V unless something is very wrong.

As for another solution, get a "universal input" power supply. These are designed to work from any of the worldwide house power standards. They are usually specified for something like 90-260 V AC, 50 or 60 Hz. They are switchers, so they are a lot more efficient than your linear regulator. This is really a better answer anyway than a power supply built around a big iron line transformer.