How is a 50Ω “back-termination” accomplished on a scope probe line?

Tags: measurement, oscilloscope, probe, RF, transmission line

I've just been watching Jim Williams' very excellent video on Measuring Switching Regulator Noise. At 1:58 Jim mentions a "50Ω back termination" in the probe setup that looks something like this (but not optional, as it is in Linear app note AN-104, from which this diagram is taken):

[schematic: probe setup with a series 50Ω back termination, from Linear app note AN-104]

It is a series 50 ohms, not parallel as at the scope connection, and as I understand it, its role is to absorb any reflections that might travel back from the scope.

My question is: "How is a 50Ω back termination typically accomplished in a scope probe line?"

I kind of suppose there's a through-termination that looks roughly like the common parallel ones, but instead of 50Ω between the center conductor and ground, it has 50Ω in series between the center conductors on either side. But I'll be darned if I can find such a thing on search.

Am I not using the right search term ("50 ohm series OR back termination oscilloscope"), or does such a thing not exist as I imagine and folks just solder a 50Ω resistor at the end of a piece of coax or something? 🙂

Best Answer

When probing high-frequency signals, the standard way to allow an arbitrary length of cable between the device under test (DUT) and the scope is to set the scope to a 50\$\Omega\$ input impedance and use 50\$\Omega\$ cable.

In the ideal world, that would be good enough: because the cable is correctly terminated by the scope, no reflections occur at the scope, so no reflections make it back to the driven end of the cable. The input to the cable then presents a 50\$\Omega\$ load to the device being measured, and we can choose to drive that load however we like.

However, in the real world both scope and cable have tolerances, so there will be some reflection, and at very high frequencies it can be quite large. Making the drive to the cable approximately 50\$\Omega\$ absorbs whatever does come back, improving the frequency response dramatically.
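To make the reasoning above concrete, here is a minimal sketch (not from the answer, using standard transmission-line formulas and made-up example impedances) of the voltage reflection coefficient \$\Gamma = (Z_L - Z_0)/(Z_L + Z_0)\$ at each end of the cable. A slightly off-tolerance scope input produces a small reflection; whether that reflection bounces a second time depends on the source impedance, which is exactly what the back termination fixes:

```python
def reflection_coefficient(z_load, z0=50.0):
    """Voltage reflection coefficient at an impedance step on a z0-ohm line."""
    return (z_load - z0) / (z_load + z0)

# Hypothetical off-tolerance scope input: 52 ohm on a 50 ohm line.
gamma_scope = reflection_coefficient(52.0)

# Without back termination, a low-impedance source (say ~1 ohm, e.g. a
# power-supply output) re-reflects almost all of that energy:
gamma_source_unmatched = reflection_coefficient(1.0)

# With a series 50 ohm back termination, the second bounce is absorbed:
gamma_source_matched = reflection_coefficient(1.0 + 50.0)

print(gamma_scope, gamma_source_unmatched, gamma_source_matched)
```

The small residual at the matched source (1+50 = 51Ω, not exactly 50Ω) is why the answer says "approximately 50\$\Omega\$": the match only needs to be good enough that the double-reflected energy is negligible.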

The 'tidiest' way to make this happen is to arrange for your DUT to have a 50\$\Omega\$ output impedance, to a connector. If the source of signals is low impedance, like the output of a power supply for instance, then a series 50\$\Omega\$ resistor will do nicely. If it's not convenient to use a connectorized jig, then solder a 50\$\Omega\$ resistor in line at the end of the cable.

Knowing what I did about matching, I was then surprised on my first day in a microwave lab to be shown how they probed circuits. A 50\$\Omega\$ cable, with a 470\$\Omega\$ carbon resistor soldered to the end. This was the -20dB probe.

Remember I said the input to a cable properly terminated by the scope looks like 50\$\Omega\$. The 470\$\Omega\$ resistor in series with this gives roughly a 10:1, or -20dB, pot-down. It doesn't need to be matched at the sending end. The frequency response would be flatter if it were, but another 50\$\Omega\$ resistor at the probe end would complicate the probe (obviously the cable ground is grounded to the circuit at the 'same' point, so size matters!), and it would decrease signal or increase circuit loading for the same pickoff. For most measurements it was flat enough, and it was the right price!
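The "roughly 10:1, or -20dB" figure can be checked with a one-line divider calculation, a sketch assuming the ideal case where the terminated cable looks like a pure 50\$\Omega\$ resistance:

```python
import math

z0 = 50.0        # input impedance of the cable, properly terminated at the scope
r_series = 470.0 # carbon resistor soldered at the probe tip

# Voltage divider: the cable's 50 ohm against the 470 ohm series resistor.
ratio = z0 / (r_series + z0)          # 50/520, about 1:10.4
attenuation_db = 20 * math.log10(ratio)

# The circuit under test sees r_series + z0 = 520 ohm of loading.
print(ratio, attenuation_db)
```

This lands at about -20.3dB, close enough to -20dB for the lab use described, and it shows the loading trade-off: matching the probe end with another 50\$\Omega\$ would push the divider to 50/570 for the same 520\$\Omega\$-plus loading, which is the "decrease signal or increase circuit loading" penalty the answer mentions.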