In case it matters, let's discuss solid-state, linear, class AB amplifiers operating below 50 MHz and on the order of 100 W.
Here's my confusion: I'd think that, ideally, the output impedance of the amplifier would be 0Ω, or at least as low as possible. That would minimize loss in the amplifier itself, and the load impedance wouldn't matter much.
Yet, datasheets for these amplifiers almost always specify the output impedance as 50Ω. Sure, this means any reflections from the load will be absorbed in the source, but for HF and the kinds of things usually transmitted on HF (AM, SSB), reflections will not appreciably distort the signal.
I would think any kind of resistive 50Ω source would limit the amplifier efficiency to at most 50%, since at best (a matched load) as much power is dissipated in the source resistance as is delivered to the load. The 50Ω source impedance could instead be presented by a lossless reactive matching network, which would avoid that dissipation and improve efficiency, but would preclude usage on multiple frequencies.
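To make the 50% figure concrete, here's the arithmetic for a Thevenin source with a real (dissipative) 50Ω internal resistance driving a matched 50Ω load. The open-circuit voltage is an arbitrary illustrative value, not from any datasheet:

```python
# Efficiency of a Thevenin source with a dissipative internal resistance
# driving a matched load. Values are illustrative assumptions.

R_source = 50.0  # ohms, assumed dissipative internal resistance
R_load = 50.0    # ohms, matched load
V_open = 200.0   # volts peak, open-circuit Thevenin voltage (arbitrary)

# Peak current through the series circuit
I = V_open / (R_source + R_load)  # 2 A

# Average power over a sine cycle: P = I_peak^2 * R / 2
P_load = I**2 * R_load / 2      # power delivered to the load
P_source = I**2 * R_source / 2  # power lost in the source resistance

efficiency = P_load / (P_load + P_source)
print(f"load: {P_load:.0f} W, source: {P_source:.0f} W, "
      f"efficiency: {efficiency:.0%}")
# -> load: 100 W, source: 100 W, efficiency: 50%
```

With equal resistances the dissipation splits evenly, so efficiency can't exceed 50% no matter the drive level; making `R_source` smaller raises the efficiency toward 100%, which is what motivates the question.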
So what is the output impedance of a typical amplifier of this sort, and why?