I have just recently come to understand that for maximum power
transfer the amplifier and speaker impedances should match.
No, this is not the case with audio. An audio amp can have an output impedance substantially lower than 1 ohm, yet nobody (as far as I know) makes 1 ohm speakers. If the amp had an 8 ohm output impedance it could only deliver half the voltage to an 8 ohm speaker, and as much power would be wasted in its output impedance as reached the speaker.
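A quick sanity check of that arithmetic. The 20 V RMS and 0.1 ohm figures below are illustrative assumptions, not from any particular amplifier:

```python
def power_into_load(v_emf, r_source, r_load):
    """Power delivered to r_load by a source EMF v_emf (RMS)
    behind a series source resistance r_source."""
    v_load = v_emf * r_load / (r_source + r_load)
    return v_load ** 2 / r_load

V, RL = 20.0, 8.0
p_matched = power_into_load(V, 8.0, RL)  # "matched" 8 ohm source: 12.5 W
p_low_z = power_into_load(V, 0.1, RL)    # low-impedance audio amp: ~48.8 W
```

With the matched source, another 12.5 W is burned inside the amplifier; with the 0.1 ohm source almost the entire available power reaches the speaker, which is why audio amps aim for a low output impedance rather than a matched one.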
It's only in RF circuits that you need to be concerned about matching impedances, and even there it's mostly to stop reflections along PCB tracks and cables.
The rest of your question is based on a false premise about audio impedances, so it isn't worth attempting to answer. However, I will try to give some insight into the transformers used.
Like any power transformer with no secondary load, ideally you would want to apply a voltage to the primary and have zero current enter the transformer. That would be perfect: when you then connected a secondary load that consumes power, the power input to the primary would be identical to that consumed by the load. Reality isn't actually that far off.
The primary magnetization inductance is basically the impedance the primary presents when the secondary load is disconnected. It can't be infinite, and it can be relatively small, but not as small as a speaker impedance, because then a lot of the power amp's energy would be wasted driving a reactive current that serves no purpose.
If it were a 50 Hz power transformer connected to 230 V AC, a 10 henry magnetization inductance would take a "standby" current of about 73 mA. If such a transformer were designed for audio, and you weren't too bothered about frequencies below 100 Hz (deep bass), that same 10 henry inductance would take about 37 mA at 100 Hz. But the drive is possibly 20 V RMS rather than 230 V RMS, so a 100 mH magnetization inductance would do, and it has an impedance of about 63 ohms at 100 Hz. This, of course, only gets higher (better) as the audio frequency rises into the mids and treble.
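Those numbers all come from the same reactance formula, which is easy to verify:

```python
import math

def magnetizing_current(v_rms, freq_hz, inductance_h):
    """RMS current drawn by the magnetization inductance alone:
    I = V / (2 * pi * f * L)."""
    return v_rms / (2 * math.pi * freq_hz * inductance_h)

i_mains = magnetizing_current(230, 50, 10)   # ~73 mA at 50 Hz
i_100hz = magnetizing_current(230, 100, 10)  # ~37 mA at 100 Hz
z_100mh = 2 * math.pi * 100 * 0.1            # ~63 ohms: 100 mH at 100 Hz
```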
63 ohms is fine for an amplifier that can drive an 8 ohm speaker, so that hopefully takes care of that side of things. Next: some turns (windings) on the primary do not couple power to turns on the secondary, and these can be a right royal pain for audio transformers because they are in series with the power transfer; at high frequencies these "leakage" inductances somewhat attenuate the treble. The bottom line is that audio transformer designers try to ensure that approximately 99.5% of the magnetic flux in the primary is coupled to the secondary, so if the primary is nominally 100 mH open-circuit, less than 500 µH is useless to the transformer and a detriment to high audio frequencies.
Even so, 100 µH of leakage inductance presents a series blocking impedance of nearly 13 ohms at 20 kHz.
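To put a number on the treble loss, here is a rough sketch that treats the speaker as a purely resistive 8 ohms (real speaker impedances are complex and vary with frequency, so this is only indicative):

```python
import math

def leakage_reactance(leakage_h, freq_hz):
    """Series reactance of the leakage inductance: X = 2 * pi * f * L."""
    return 2 * math.pi * freq_hz * leakage_h

x_leak = leakage_reactance(100e-6, 20e3)  # ~12.6 ohms at 20 kHz

# Voltage divider formed by the leakage reactance and an 8 ohm load:
atten = 8 / math.hypot(8, x_leak)
atten_db = 20 * math.log10(atten)         # roughly -5.4 dB at 20 kHz
```

That is why keeping leakage inductance to a fraction of a percent of the magnetizing inductance matters so much for a flat top end.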
Bottom line is that audio transformers are really good at providing low loss power transfer across a wide range of frequencies. No impedance matching is necessary.
My question is that, is there any guidelines for choosing the value of
source resistance?
Given that you want to send X watts to the transducer and that your drive source has an amplitude of Y, the source resistance is chosen to be whatever it has to be both to prevent the transducer from being damaged (read its data sheet) and to protect the driving source from damage due to excessive current.
It's called "the design process".
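As a sketch of that process, assuming a purely resistive load and a sinusoidal RMS drive (both simplifications), you can solve P = V² · R_L / (R_s + R_L)² for the series resistance that caps the load power at a data-sheet limit:

```python
import math

def series_resistance(v_rms, p_target_w, r_load):
    """Series resistance that limits the power in r_load to p_target_w,
    from P = V^2 * RL / (Rs + RL)^2 solved for Rs."""
    return v_rms * math.sqrt(r_load / p_target_w) - r_load

# Hypothetical numbers: 20 V RMS drive, 8 ohm transducer rated for 10 W.
rs = series_resistance(20.0, 10.0, 8.0)  # ~9.9 ohms
```

A negative result would mean the drive voltage can't reach the target power at all, so no series resistance is needed for protection.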
Best Answer
It depends on how the amp is connected to the load...
If the cable is short enough relative to the wavelength not to behave as a transmission line, and its inductance is small enough, then you will get maximum power into your load by using a low source impedance.
However, if your cable is long enough to become a transmission line at your frequency, then you have to match impedances, or else power will be reflected at the load.
So, both answers are right, and the deciding factor is the cable length.
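A common rule of thumb (an approximation, not a hard boundary) is that a cable shorter than about a tenth of a wavelength can be treated as a lumped element. The velocity factor of 0.66 below is a typical figure for coax and is an assumption:

```python
def is_electrically_short(length_m, freq_hz, velocity_factor=0.66):
    """Rule-of-thumb check: shorter than lambda/10 means the cable
    can be treated as a lumped element rather than a transmission line."""
    c = 299_792_458.0  # speed of light in vacuum, m/s
    wavelength = velocity_factor * c / freq_hz
    return length_m < wavelength / 10

audio = is_electrically_short(10.0, 20_000)  # True: wavelength ~10 km
rf = is_electrically_short(10.0, 100e6)      # False at 100 MHz
```

At the top of the audio band even a 10 m speaker cable is thousands of times shorter than a wavelength, which is why impedance matching never enters into it for audio.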