Electronics – Basic working principle of a transformer


In a transformer, power is conserved and \$P = VI\$. So, in a step-up transformer, if the secondary voltage is increased, the secondary current has to decrease. Now, my question is: if the secondary winding is connected to a constant load, the rule \$V = IR\$ implies that if the secondary voltage is increased, the secondary current has to increase. These two statements contradict each other.

Best Answer

To make it easier to understand, let's make a couple of simplifications:

  • we have an ideal transformer
  • that transformer has no limit on the power it can transfer

This ideal transformer transfers energy from the primary to the secondary with a fixed ratio of voltage in to voltage out. Strictly, it's the ratio of the windings, but let's just assume an easy 1:N ratio: for every 1 volt in you get N volts out.

You place your resistor across the output and it dissipates power. You know the secondary voltage, so Ohm's law gives you the secondary current, and together those give you the power dissipated. Because the primary voltage is \$N\$ times lower and the power transferred is exactly the same (remember, it's ideal), the primary current must be \$N\$ times higher than the secondary current.
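The calculation above can be sketched with some made-up numbers (the turns ratio, voltage, and resistance here are illustrative assumptions, not values from the question):

```python
# Ideal 1:N step-up transformer driving a resistive load.
# N, v_primary, and r_load are arbitrary example values.

N = 10              # turns ratio: 1 volt in -> N volts out
v_primary = 12.0    # volts on the primary
r_load = 120.0      # ohms across the secondary

v_secondary = N * v_primary              # 120 V out
i_secondary = v_secondary / r_load       # Ohm's law on the load: 1 A
p_secondary = v_secondary * i_secondary  # 120 W dissipated in the load

# Power is conserved, so the primary supplies the same power
# at 1/N the voltage -> N times the current.
i_primary = p_secondary / v_primary      # 10 A = N * i_secondary

print(i_primary, N * i_secondary)
```

This is where the apparent contradiction dissolves: for a fixed load, raising the secondary voltage does raise the secondary current (Ohm's law), and the primary simply supplies more power to match.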

If you change the load resistance the current(s) will change.

If you remove the load, the secondary current drops to zero. 0 × N = 0, and therefore no current flows in the primary.

The right way to look at it is that the primary "sees" the secondary's load scaled by the square of the turns ratio: for a 1:N step-up, a load \$R\$ on the secondary looks like \$R/N^2\$ at the primary. The primary current adjusts accordingly.
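A quick sketch showing that the reflected-load view gives the same primary current as working through the secondary (same illustrative values as before):

```python
# Reflected impedance for a 1:N step-up: the source sees R / N**2.
# N, v_primary, and r_load are arbitrary example values.

N = 10
v_primary = 12.0
r_load = 120.0

r_reflected = r_load / N**2          # 1.2 ohms seen by the source
i_primary = v_primary / r_reflected  # 10 A

# Same answer as computing the secondary current and scaling by N:
i_secondary = (N * v_primary) / r_load   # 1 A
assert abs(i_primary - N * i_secondary) < 1e-9
```

Either viewpoint works; the reflected-load one makes it obvious why changing the load resistance changes the primary current.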

Now for the real part: real transformers have limits on the amount of power they can transfer. This is the "watt rating", so when pulling 10 W through a 40 W transformer you can use those simplifying assumptions and treat it as ideal. As you get closer to the 40 W limit, the non-idealities come into play.