100 MHz oscilloscope probe impedance

Tags: impedance, probe, RF

I find something weird going on with my 100 MHz oscilloscope probe.

I always thought the probe has an input resistance of 1 MΩ, but apparently at 13 MHz, when I put a 2.2 kΩ resistor in series with the probe, the reading is severely reduced. Does this mean that the probe impedance is actually somewhere around 50 Ω? That would seem impossible, because 50 Ω connected to the circuit would alter its operating point too much.

Best Answer

The capacitance of the probe, combined with your 2.2 kΩ resistor, forms a low-pass filter. Can you report the probe capacitance? It may be marked on the probe. If the capacitance is 20 pF, for example, the cutoff frequency would be around 3.6 MHz, so a 13 MHz signal will be attenuated quite a bit.
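Here is a minimal sketch of that calculation, assuming the 2.2 kΩ series resistor and a 20 pF probe/input capacitance (the 20 pF is only a placeholder; use the value marked on your probe):

    # Rough sketch: RC low-pass formed by the 2.2 kOhm series resistor and the
    # probe/input capacitance. The 20 pF value is only an assumed example;
    # substitute the capacitance marked on your probe.
    import math

    R = 2.2e3    # series resistor, ohms
    C = 20e-12   # assumed probe + scope input capacitance, farads
    f = 13e6     # signal frequency, Hz

    f_c = 1 / (2 * math.pi * R * C)            # -3 dB cutoff frequency
    gain = 1 / math.sqrt(1 + (f / f_c) ** 2)   # single-pole magnitude response

    print(f"cutoff frequency: {f_c / 1e6:.1f} MHz")                 # ~3.6 MHz
    print(f"gain at 13 MHz:   {gain:.2f} ({20 * math.log10(gain):.1f} dB)")

With these assumed values the 13 MHz signal comes out at roughly a quarter of its original amplitude (about -11 dB), which matches the severe reduction you observed.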

If you want to test the probe's input resistance, measure the amplitude of a DC source, then put a 1 MΩ resistor in series and measure again. If the probe resistance is 1 MΩ, the reading will drop to half with the 1 MΩ resistor in series.
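As a minimal sketch of that DC divider check (the 5 V source is just an assumed example value):

    # DC divider check of the probe's input resistance. With an external 1 MOhm
    # resistor equal to an assumed 1 MOhm input resistance, the reading should
    # drop to half. The 5 V source is just an example value.
    R_series = 1e6     # external series resistor, ohms
    R_probe = 1e6      # assumed probe/scope input resistance, ohms
    V_source = 5.0     # any convenient DC source voltage, volts

    V_reading = V_source * R_probe / (R_series + R_probe)
    print(f"expected reading: {V_reading:.2f} V (half of {V_source:.1f} V)")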

I noticed that your oscilloscope input is labeled as 13 pF. Brian Drummond added the following in the comment section, and I am including it in the answer:

That 13 pF may be the scope itself ... add 30 pF or so for the probe and its cable. So the probe loads (attenuates) the signal even more than that. Plus, if you're probing a tuned circuit, the effect will be even greater because you're detuning it... For 13 MHz you NEED a 10x probe. The 10:1 attenuator is right at the pin, and keeps the loading capacitance down to maybe 3-4 pF (see probe spec).
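To put rough numbers on that loading, here is a sketch comparing the capacitive reactance at 13 MHz of about 43 pF (13 pF scope plus ~30 pF probe and cable, per the comment above) against an assumed 3.5 pF for a 10x probe:

    # Rough comparison of the capacitive loading at 13 MHz: ~43 pF for a 1x
    # probe plus scope (13 pF + ~30 pF cable/probe, per the comment above)
    # versus an assumed ~3.5 pF for a 10x probe. Check your probe's datasheet.
    import math

    f = 13e6
    for label, C in [("1x probe + scope (43 pF)", 43e-12),
                     ("10x probe (3.5 pF)", 3.5e-12)]:
        Z = 1 / (2 * math.pi * f * C)   # magnitude of the capacitive reactance
        print(f"{label}: |Z| is about {Z:,.0f} ohms at 13 MHz")

With these assumed figures the 1x setup presents only a few hundred ohms of loading at 13 MHz, while the 10x probe stays in the kilohm range, which is why the 10x probe is the right choice here.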