Electronics – Transformers vs. voltage regulators

Tags: rectifier, transformer, voltage-regulator

I'll start by saying that I'm self-taught in electronics, and tend to ask a lot of questions because I don't know NOT to ask them. This is one of those questions.

Consider: I have 120VAC and need 12VDC.

On one hand, I'd get myself a wall-wart transformer that puts out 12VDC and be done with it.

But why a transformer? Why not use a bridge rectifier to get DC, smooth out any ripples with a cap, and throw in a voltage regulator…say a 7812 in this case?

Why is one method preferred over the other? I assume the rectifier/regulator solution would generate a TON of heat and perhaps need a prohibitively large heatsink. Is it engineer's choice? Or is there something in the physics that makes one solution more efficient or cheaper to manufacture than the other? Or are there some other crazy unintended safety considerations?

As someone who tinkers and builds stuff, what are some of the considerations in deciding to use a transformer instead of a rectifier/regulator in a project (apart, of course, from the "why re-invent the wheel when you can buy a wall wart at the corner store" argument)?

I'd love to understand this a little better.

Best Answer

But why a transformer? Why not use a bridge rectifier to get DC, smooth out any ripples with a cap, and throw in a voltage regulator...say a 7812 in this case?

Well, for one, 120V is far above the maximum input voltage specified in the 7812 datasheet (typically 35V absolute maximum).

However, let's say we find or build a linear voltage regulator similar to the 7812, but one that could handle such an input voltage. Why not that?

It's true of all linear voltage regulators that input current is equal to output current, neglecting some very small current for the operation of the regulator itself. This is because they work by effectively adjusting a resistance to maintain the desired output voltage.

Remember that a resistor with a current through it will also have a voltage across it according to Ohm's law: \$E = IR\$. So for whatever current is required by the load to have the designed output voltage, the voltage regulator effectively adjusts \$R\$ such that \$E\$ is the difference between the input and output voltages.

Thus, for a 120V input and a 12V output, the voltage across the regulator will be 108V. (Strictly, rectifying and smoothing 120VAC gives a DC level closer to 170V, which only makes matters worse, but let's stick with 120V for round numbers.)
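To make that concrete, here's a minimal sketch (plain Python, with made-up example load currents) of the effective series resistance such a hypothetical high-voltage "7812" would have to present:

```python
# Hypothetical high-voltage linear regulator dropping 120 V down to 12 V.
# It behaves like a self-adjusting series resistor: R = E / I (Ohm's law),
# where E is the voltage it must absorb and I is whatever the load draws.
v_in = 120.0   # volts (simplified DC input, see note above)
v_out = 12.0   # volts (regulated output)

for i_load in (0.010, 0.100, 1.0):  # amps -- example load currents
    v_drop = v_in - v_out
    r_effective = v_drop / i_load
    print(f"{i_load * 1000:6.0f} mA load: drops {v_drop:.0f} V, "
          f"acts like a {r_effective:,.0f} Ω series resistor")
```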

Remember also that electrical power is the product of voltage and current: \$P=IE\$. For the voltage regulator, \$E=108\text{V}\$ as above. \$I\$ will be determined by the load.

Let's say we have a pretty small load, and \$I=10\text{mA}\$. The electrical power in the voltage regulator is then \$P=10\text{mA} \cdot 108\text{V} = 1.08\text{W}\$. Not only is this voltage regulator already getting pretty hot, it's horribly inefficient. The power in the load is only \$10\text{mA} \cdot 12\text{V} = 0.12\text{W}\$:

$$ \frac{0.12\text{W}}{1.08\text{W} + 0.12\text{W}} = 10\%\ \text{efficient} $$

This inefficiency might be acceptable for very low power loads, where the heat is more manageable and the cost of the input energy is affordable. However, 10mA isn't even enough to light your typical indicator LED to full rated brightness, so for most things, dropping the full line voltage across a linear regulator just isn't feasible.
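As a rough sanity check, here's the same arithmetic swept over a few load currents (a sketch, still assuming the simplified 120V DC input): the efficiency stays pinned at 12/120 = 10% regardless of load, while the heat in the regulator grows linearly with the current.

```python
v_in, v_out = 120.0, 12.0  # volts (same simplified input as above)

for i_load in (0.010, 0.050, 0.500):  # amps -- example load currents
    p_load = i_load * v_out                 # power actually delivered to the load
    p_regulator = i_load * (v_in - v_out)   # power burned as heat in the regulator
    efficiency = p_load / (p_load + p_regulator)
    print(f"{i_load * 1000:5.0f} mA: load {p_load:.2f} W, "
          f"regulator heat {p_regulator:.2f} W, efficiency {efficiency:.0%}")
```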

The solution is to use a transformer, or a switching voltage regulator such as a buck converter. With these methods, it's possible to convert voltages with (given ideal components) 100% efficiency.
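For comparison, here's a hedged sketch of what an ideal buck converter would do with the same job (example figures, not a real design). Since input power ideally equals output power, the current drawn from the 120V source scales down by the voltage ratio instead of being burned off as heat; real converters often land in the 85–95% efficiency range.

```python
v_in, v_out, i_load = 120.0, 12.0, 1.0  # volts, volts, amps (example figures)

# Linear regulator: input current equals output current, so the entire
# voltage difference is dissipated as heat.
p_linear_heat = (v_in - v_out) * i_load        # 108 W of heat at 1 A

# Ideal buck converter: P_in == P_out, so the input current shrinks by the
# voltage ratio and (ideally) nothing is wasted as heat.
p_out = v_out * i_load                         # 12 W delivered to the load
i_in_buck = p_out / v_in                       # only 0.1 A drawn from the source

print(f"linear regulator heat: {p_linear_heat:.0f} W")
print(f"ideal buck input current: {i_in_buck:.2f} A (vs {i_load:.1f} A for linear)")
```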

Incidentally, the ease of doing this with AC and transformers is why Edison is a jerk and lost the War of Currents.