I'm trying to wrap my head around transformer operation and in the process regretting the times I snoozed in my Electromagnetics class as an EE student back when I was a lad 🙂
I'm looking for an intuitive understanding, but not just an analogy-based one. I'd like it to be grounded in the actual physics of what's happening. I've found several excellent sources on the web, but they all seem to skirt this question.
I've come across a few interesting hints and am now tantalizingly close, I think, but still yearning 🙂
Fact 1: Although the flux in a transformer's core varies sinusoidally, its "peak-to-peak" amplitude, so to speak, is essentially constant (for a given voltage applied to the primary), regardless of the load.
My intuitive hypothesis was that variation in the "strength" of the flux was what transferred the energy from the primary to the secondary, but this fact would seem to
contradict that theory. I had thought that the primary makes a bunch of flux based on the current flowing through it and the secondary sucks it up to make current of its own. No dice, it seems.
Then of course there's the fact that the formula for flux involves only
voltage, time (frequency), and turns 🙂
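To convince myself of this, here's a quick sanity check using the standard transformer EMF equation, E_rms = 4.44 · f · N · Φ_max (the numbers are made up: 230 V, 50 Hz, 500 primary turns). Note that load current never appears anywhere:

```python
# Sanity check of the transformer EMF equation: E_rms = 4.44 * f * N * phi_max.
# All numbers here are hypothetical -- the point is that the peak core flux
# depends only on voltage, frequency, and turns, not on the load.
f = 50.0        # line frequency, Hz
V_rms = 230.0   # RMS voltage applied to the primary, volts
N = 500         # primary turns (hypothetical)

phi_max = V_rms / (4.44 * f * N)   # peak core flux, webers
print(f"peak flux = {phi_max * 1000:.2f} mWb")  # same answer at any load
```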
Fact 2: The current in the primary is (approximately) 90 degrees out of phase with the voltage at no load, and approximately aligned in phase at full load.
This fact seems very promising and also curiously satisfying. It would imply
that the Volt-Amps (VA) of the primary is constant and only the power factor
changes as the current load on the secondary increases.
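Taking that constant-VA reading of Fact 2 at face value, here's a little sketch (again with made-up numbers) that holds V and I fixed and just sweeps the phase angle from 90° (no load) toward 0° (full load), watching the real power P = V · I · cos(φ) go from roughly zero to the full VA:

```python
import math

# Hypothetical primary voltage and current, held fixed so VA stays constant.
V_rms, I_rms = 230.0, 2.0

# Sweep the voltage-current phase angle from no-load (90 deg) to full-load (0 deg).
for phase_deg in (90, 60, 30, 0):
    pf = math.cos(math.radians(phase_deg))  # power factor
    P = V_rms * I_rms * pf                  # real power delivered, watts
    print(f"phase {phase_deg:2d} deg: VA = {V_rms * I_rms:.0f}, "
          f"PF = {pf:.2f}, P = {P:.0f} W")
```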
But I still don't get how the energy is actually being transferred. It seems
vaguely like the flux is just there as an energy conductor or something and
some other phenomenon is actually doing the energy transfer bit.
Can someone see what I'm missing and explain what's actually happening in the transformer?