Source/output resistance

RF, transmission line

I was wondering why many signal generators, etc., have a 50 Ω output resistance.

Before you answer here is what I know:

Many transmission lines are 50 Ω and are terminated with a 50 Ω load to get maximum power transferred to the load and to minimize reflections. So for the load it is obvious why 50 Ω is needed, but why for the source?

I would get more power transferred to the transmission line (or load) if the source impedance were, say, 10 Ω.
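To see this numerically, here is a quick sketch using a plain voltage-divider view (ignoring transmission-line effects); the 1 V source value is just an illustrative assumption:

```python
# Power delivered to a load behind a source resistance, using the simple
# series-circuit (voltage-divider) model. Transmission-line effects ignored.

def load_power(v_source, r_source, r_load):
    """Power dissipated in r_load for an ideal source v_source behind r_source."""
    i = v_source / (r_source + r_load)  # series current
    return i ** 2 * r_load              # P = I^2 * R_load

V = 1.0        # illustrative 1 V ideal source
R_LOAD = 50.0  # 50 ohm load

p_10 = load_power(V, 10.0, R_LOAD)  # low-impedance source
p_50 = load_power(V, 50.0, R_LOAD)  # "matched" 50 ohm source

print(f"10 ohm source: {p_10 * 1e3:.2f} mW")  # ~13.89 mW
print(f"50 ohm source: {p_50 * 1e3:.2f} mW")  # 5.00 mW
```

So a 10 Ω source delivers nearly three times the power into the same 50 Ω load, which is exactly why the question arises.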

Is it to cancel any reflections hitting the source from the load?

Or does it just mean that the source has been matched to a 50 Ω load?

EDIT:
After searching Google for hours, I found this link, which answers my question:
http://users.tpg.com.au/users/ldbutler/OutputLoadZ.htm

Conclusion: It usually means the source has been matched for a 50 Ω load. Sometimes, in precision signal generators (not RF power amplifiers), the source really does have a 50 Ω output impedance; this is to absorb re-reflections coming back from the load. But for RF power amplifiers and the like, the output has merely been matched for a 50 Ω load, since half the power would be lost inside the equipment if the output impedance were actually 50 Ω.

Best Answer

You are apparently talking about signal generators, not power generators. The signal usually can't be used right at the signal generator output, so it has to be transmitted somehow to where it is needed. Since the shape of the signal is important, the manufacturer of the signal generator has to assume you will want a carefully controlled transmission line for getting the signal from the generator to the load. As you say, lower impedance is better, so they assume you are using 50 Ω coax, which is the lowest-impedance commonly available high-frequency cable. Matching the signal generator output to the transmission line impedance allows for the least loss of signal integrity and absorbs reflections at the signal generator end due to any mismatches at the other end.
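The absorption of reflections mentioned above can be sketched with the standard reflection-coefficient formula Γ = (Z − Z0)/(Z + Z0); this is a generic illustration, not taken from the answer itself:

```python
# Reflection coefficient seen by a wave travelling back along a Z0 line and
# hitting a termination (here: the generator's output impedance).
# Gamma = 0 means the returning wave is fully absorbed; a nonzero Gamma
# means part of it re-reflects back toward the load.

def reflection_coefficient(z_termination, z0=50.0):
    return (z_termination - z0) / (z_termination + z0)

print(reflection_coefficient(50.0))  # 0.0    -> matched source absorbs the reflection
print(reflection_coefficient(10.0))  # -0.667 -> 2/3 of the wave re-reflects
```

This is why a 10 Ω source, despite delivering more power initially, lets reflections bounce back and forth and distort the signal, while a 50 Ω source terminates them on the first return trip.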