# Electronic – Do capacitors increase voltage

capacitor, power supply

I just hooked up a (220<–>12-0-12) transformer to a bridge rectifier and measured 13 volts DC at the rectifier output. But when I added a 1uF capacitor, the reading jumped to 20 volts, and a 0.1uF capacitor gave the same reading (20 volts). How is that even possible!?

Note: Nothing is connected in the circuit other than the transformer and the bridge rectifier in the first case; in the second case, only the capacitor was added, along with the voltmeter.

\$V_{rms}\$ vs. \$V_{peak}\$. For a sine wave, the peak voltage is \$\sqrt{2}\approx 1.414\$ times the RMS voltage.

If you put a capacitor across a rectified AC waveform, it will smooth out the supply. If there is no load, it will smooth it out to around the peak voltage of the supply.

A 220V AC supply is 220V RMS, which corresponds to about 311V peak. Stepped down to 12V AC, that corresponds to about 17V peak.
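The arithmetic is easy to check. A quick sketch (assuming sinusoidal mains, so \$V_{peak} = \sqrt{2}\,V_{rms}\$):

```python
import math

def peak_from_rms(v_rms):
    """Peak amplitude of a sine wave with the given RMS value."""
    return v_rms * math.sqrt(2)

print(round(peak_from_rms(220), 1))  # 311.1
print(round(peak_from_rms(12), 1))   # 17.0
```

(Real bridge rectifiers also drop some voltage across the diodes, which this back-of-the-envelope figure ignores.)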

As to how you got 20V: either your meter is dodgy (unlikely), the supply voltage is higher than you thought, or the transformer ratio is not what you think it is.

If you add a load, the voltage will drop, because the capacitor can't sustain a current at the peak voltage - that would require it to deliver more power to the load than the transformer is delivering. The average voltage available is closer to the RMS value.

The rectified supply varies between 0 and \$V_{peak}\$, with an average near \$V_{rms}\$. If you are drawing a current, the capacitor will smooth this out to around \$V_{rms}\$; but if you aren't drawing any current, it will keep getting "topped up" with charge until it reaches the peak voltage. As a demonstration of this, try putting, say, a 1k resistor at the output of your supply - you should see the voltage drop.
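To see the effect numerically, here is a rough simulation sketch (ideal rectifier, zero source impedance, 50 Hz mains and 17 V peak all assumed for illustration) of the capacitor voltage with and without that 1k load:

```python
import math

def rectifier_cap(C, R, Vp=17.0, f=50.0, cycles=10, dt=1e-6):
    """Full-wave rectified sine charging a capacitor C through an ideal
    rectifier, discharging into a load R (None = no load).
    Returns (min, max) capacitor voltage over the final cycle."""
    v = 0.0
    last_cycle = []
    n = int(cycles / f / dt)
    for i in range(n):
        t = i * dt
        vs = abs(Vp * math.sin(2 * math.pi * f * t))
        if vs > v:
            v = vs                   # rectifier conducts: cap tracks the source
        elif R is not None:
            v -= v / (R * C) * dt    # cap discharges into the load
        if t > (cycles - 1) / f:
            last_cycle.append(v)
    return min(last_cycle), max(last_cycle)

print(rectifier_cap(1e-6, None))   # no load: cap sits at roughly the 17 V peak
print(rectifier_cap(1e-6, 1000))   # 1k load: huge ripple, minimum collapses
```

With no load the capacitor just holds the peak; with a 1k load, a 1uF cap (time constant 1 ms against a 10 ms gap between peaks) discharges almost completely between half-cycles, so an averaging meter would read far below 17 V.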