How to measure RF power with an oscilloscope

measurement · oscilloscope · power · RF

I'd like to measure the RF power coming from an RF amplifier using an oscilloscope.

The signals are sinusoids with frequencies of 50–120 MHz. The maximum power coming from the amplifier is 20 W, which is attenuated by a -3 dB RF attenuator. This power is delivered directly to the 50 Ω load, which is an ultrasonic transducer. The cables used are SMA coaxial cables.
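As a sanity check on the voltage levels involved, here is a minimal back-of-envelope sketch (assuming an ideal -3 dB attenuator and a perfectly matched, purely resistive 50 Ω load):

```python
import math

P_IN_W = 20.0    # maximum amplifier output power, W
ATTEN_DB = 3.0   # attenuator value (assumed ideal)
R_LOAD = 50.0    # load impedance, ohms (assumed purely resistive)

# Power reaching the load after the -3 dB attenuator
p_load = P_IN_W * 10 ** (-ATTEN_DB / 10)   # ~10 W

# For a matched resistive load, P = Vrms^2 / R
v_rms = math.sqrt(p_load * R_LOAD)         # ~22.4 V RMS
v_peak = v_rms * math.sqrt(2)              # ~31.6 V peak

print(f"Power at load:   {p_load:.1f} W")
print(f"Voltage at load: {v_rms:.1f} V RMS ({v_peak:.1f} V peak)")
```

So the tap point should see on the order of 22 V RMS.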

I'm planning to put a T-splitter between the load and the attenuator, and use it to send the signal to a high-frequency digital oscilloscope set to 1 MΩ input impedance.

However, the oscilloscope can only handle 9 V RMS at its inputs at such high frequencies. So I thought I could use a compensated 10:1 passive probe to attenuate the signal. But now I see that these probes aren't designed for frequencies around 100 MHz, and cutting into the coaxial cable arrangement to attach the probe may not be the best idea either. What is your opinion?

Alternatively, we've got some 50 Ω, -3 dB RF attenuators, so I could wire several in series and put them between the T-splitter and the scope set to 50 Ω. Is this going to work? Won't they present a different load to the amplifier? Do I need to set the scope to 50 Ω in this case, or can I leave it at 1 MΩ (so it can handle a greater voltage)? A rough estimate of how many pads this would take is sketched below.
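To get a feel for the number: each -3 dB section divides the voltage by √2. A minimal sketch (assuming ideal pads, the ~22.4 V RMS estimated above at the tap point, and ignoring for the moment the loading question raised above):

```python
import math

v_tap = math.sqrt(10.0 * 50.0)   # ~22.4 V RMS at the tap (from the earlier estimate)
V_MAX = 9.0                      # scope input limit, V RMS
PAD_DB = 3.0                     # one -3 dB attenuator

pads = 0
v = v_tap
while v > V_MAX:
    pads += 1
    v = v_tap * 10 ** (-pads * PAD_DB / 20)  # N pads give N*3 dB of attenuation

print(f"{pads} pads ({pads * PAD_DB:.0f} dB) -> {v:.1f} V RMS at the scope")
# 3 pads (9 dB) -> ~7.9 V RMS, just under the 9 V limit
```

That lands only just under the limit, so there isn't much margin.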

Best Answer

As your signals may go up to 120 MHz, it's best to leave the scope on 50 Ω input impedance.

Ideally you'll split the power with a directional coupler, a 6 dB power splitter, or a Wilkinson divider. To preserve accuracy, you want something that is matched at all ports.

Then stack your -3 dB attenuators in series until the signal is down to a power your scope input can handle; the sketch below works through the arithmetic.
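A rough sketch of that calculation (this assumes the ~22.4 V RMS main-line level from the question, an ideal 6 dB split to the monitoring port, ideal 3 dB pads, and the 9 V RMS input limit stated above; substitute your actual coupler's coupling factor):

```python
import math

v_main = math.sqrt(10.0 * 50.0)  # ~22.4 V RMS on the main line (20 W minus 3 dB, into 50 ohms)
SPLIT_DB = 6.0                   # loss to the monitoring port (assumed 6 dB splitter)
PAD_DB = 3.0                     # one attenuator
V_MAX = 9.0                      # scope input limit, V RMS

v = v_main * 10 ** (-SPLIT_DB / 20)   # level at the monitoring port
pads = 0
while v > V_MAX:
    pads += 1
    v *= 10 ** (-PAD_DB / 20)

p_scope = v ** 2 / 50.0
print(f"{pads} pad(s) after the splitter -> {v:.1f} V RMS ({p_scope:.2f} W) into the scope")

# To infer the RF power at the splitter's input from the scope reading,
# scale the measured voltage back up by everything between it and the scope:
total_db = SPLIT_DB + pads * PAD_DB
p_in = (v * 10 ** (total_db / 20)) ** 2 / 50.0   # lands back near the ~10 W estimate
print(f"Implied power at the splitter input: {p_in:.1f} W")
```

Keep in mind that a resistive 6 dB splitter also drops the through path to the load by 6 dB, whereas a directional coupler leaves the main line almost untouched; that difference matters when you convert the scope reading back into power delivered to the transducer. Also check the power rating of the scope's 50 Ω input and of each pad, since the first pad in the chain dissipates the most.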