Why are oscilloscope input impedances so low?

input-impedance, measurement, oscilloscope, probe

My question is two-fold:

Where does the input impedance come from?

I'm wondering where the input impedance of your average multimeter or oscilloscope comes from. Is it just the input impedance of the device's input stage (such as an amplifier or ADC input stage), or is it the impedance of an actual resistor? If it is an actual resistor, why is there a resistor at all? Why not just the input circuitry?

I measured the input impedance of my oscilloscope with a DMM. When the scope was turned off, the DMM measured about \$1.2\mathrm{M\Omega}\$. However, when the scope was turned on, the DMM measured pretty much exactly \$1\mathrm{M\Omega}\$ (I could even see the 1 V test signal applied by the DMM on the oscilloscope screen!). This suggests to me that there is active circuitry involved in the scope's input impedance. If this is true, how can the input impedance be so precisely controlled? Based on my understanding, the input impedance of active circuitry depends somewhat on the exact transistor characteristics.
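
(A quick calculation, just to sanity-check the precision question: if there is a physical \$1\mathrm{M\Omega}\$ resistor at the input and the active circuitry behind it presents, say, \$1\mathrm{G\Omega}\$ in parallel — a value I'm assuming purely for illustration — the combination is

\$\$ \frac{1\,\mathrm{M\Omega}\cdot 1\,\mathrm{G\Omega}}{1\,\mathrm{M\Omega}+1\,\mathrm{G\Omega}} \approx 0.999\,\mathrm{M\Omega}, \$\$

i.e. still essentially \$1\mathrm{M\Omega}\$, so a dominant precision resistor would also be consistent with what I measured.)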

Why can't the input impedance be much higher?

Why is the input impedance of an oscilloscope a standard \$1\mathrm{M\Omega}\$? Why can't it be higher than that? FET input stages can achieve input impedances on the order of teraohms! Why have such a low input impedance?

I suppose one benefit of a precise, standard \$1\mathrm{M\Omega}\$ input is that it allows 10X probes and the like, which only work because the scope's input impedance is precisely defined and not unreasonably large (unlike, say, the teraohm-level impedance of a bare FET input stage). However, even if the scope had a very high input impedance (e.g., teraohms), it seems to me that you could still have 10X probes by building the 10:1 voltage divider entirely inside the probe, with the scope measuring across a \$1\mathrm{M\Omega}\$ resistor in the probe; a teraohm-level scope input would barely load that divider.
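
To make that idea concrete (this is just my sketch, assuming the usual 9:1 resistor ratio): such a probe would contain its own \$9\mathrm{M\Omega}\$/\$1\mathrm{M\Omega}\$ divider, and a teraohm-level scope input in parallel with the bottom \$1\mathrm{M\Omega}\$ resistor would barely disturb the ratio:

\$\$ \frac{V_\text{scope}}{V_\text{in}} = \frac{1\,\mathrm{M\Omega} \parallel 1\,\mathrm{T\Omega}}{9\,\mathrm{M\Omega} + \left(1\,\mathrm{M\Omega} \parallel 1\,\mathrm{T\Omega}\right)} \approx \frac{1}{10}. \$\$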

Am I misunderstanding the input circuitry of a scope? Is it more complicated than I'm making it out to be? What are your thoughts on this?

The reason I thought of this is that I've recently been trying to measure the common-mode input impedance of an emitter-coupled differential pair, which is much larger than the scope input impedance, so it made me wonder why the input impedance can't be larger.
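
(For context, the textbook small-signal estimate I'm using for that common-mode input resistance, looking into one base of the pair with a tail resistance \$R_{EE}\$, is

\$\$ R_{in,cm} \approx r_\pi + 2(\beta + 1)R_{EE}, \$\$

which gets very large when the tail is a current source — hence my interest in something with a much higher input impedance than the scope.)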

Best Answer

I would say it's a combination of a few factors.

  1. The input stages of an oscilloscope are a difficult compromise. They need to have a wide range of gains/attenuations, they need to be tolerant of user errors, and they need to pass high bandwidths. Adding a requirement for a very high DC resistance would just further complicate matters. In particular, the attenuators needed to handle the higher end of the scope's input-level range would become much more complex and sensitive if they also had to present a very high DC resistance (see the compensation condition sketched just after this list).
  2. It's a de facto standard; changing to something else would lead to incompatibilities with existing probes and accessories.
  3. There wouldn't be much benefit anyway.
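
To illustrate point 1 (a sketch, not a full attenuator design): each section of the scope's input attenuator has to be frequency-compensated, i.e. the resistive divider \$R_1/R_2\$ needs parallel capacitors satisfying

\$\$ R_1 C_1 = R_2 C_2, \$\$

so that the division ratio stays flat with frequency. Scale the resistors up by a factor of a thousand and the required capacitances scale down by the same factor, to the point where stray capacitance and the fixed picofarad-level scope input capacitance dominate and the attenuator becomes very hard to trim.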

To further explain point 3: at moderate frequencies (from a few kilohertz upwards), the \$1\mathrm{M\Omega}\$ DC resistance of the scope input is not the dominant factor in the overall input impedance. The dominant factor is the capacitance, with the cable probably making the largest contribution.
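
As a rough illustration (the numbers are assumed, typical values): with on the order of \$100\,\mathrm{pF}\$ of cable-plus-input capacitance, the capacitive part of the input impedance at \$1\,\mathrm{MHz}\$ is already

\$\$ |Z_C| = \frac{1}{2\pi f C} \approx \frac{1}{2\pi \cdot 1\,\mathrm{MHz} \cdot 100\,\mathrm{pF}} \approx 1.6\,\mathrm{k\Omega}, \$\$

several hundred times lower than the \$1\mathrm{M\Omega}\$ DC resistance.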

(In fact, at UHF/microwave frequencies it's common to reduce the scope input impedance to \$50\,\Omega\$, so that the cable's inductance balances out its capacitance and the cable becomes a properly matched transmission line.)
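
A rough sketch of why that works (the per-metre values are assumed, typical for \$50\,\Omega\$ coax such as RG-58): a cable with distributed inductance \$L' \approx 250\,\mathrm{nH/m}\$ and capacitance \$C' \approx 100\,\mathrm{pF/m}\$ has a characteristic impedance

\$\$ Z_0 = \sqrt{\frac{L'}{C'}} \approx \sqrt{\frac{250\,\mathrm{nH/m}}{100\,\mathrm{pF/m}}} = 50\,\Omega, \$\$

so terminating the scope end in \$50\,\Omega\$ makes the cable look resistive rather than capacitive.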

What this means is that if a high input impedance is desirable, it's much better to deal with it at the point of probing than at the scope. The typical compromise between cost, flexibility, and input impedance for general use is a 10X passive probe.
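
For a feel of what the 10X probe buys you (assuming a typical \$\sim 10\,\mathrm{pF}\$ tip capacitance): at the tip it presents roughly \$10\,\mathrm{M\Omega}\$ in parallel with \$\sim 10\,\mathrm{pF}\$, so at \$1\,\mathrm{MHz}\$ the loading is

\$\$ |Z_C| = \frac{1}{2\pi \cdot 1\,\mathrm{MHz} \cdot 10\,\mathrm{pF}} \approx 16\,\mathrm{k\Omega}, \$\$

about ten times lighter than connecting the cable and scope input directly to the circuit.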

If you need a really high DC resistance, the solution is to add a FET-based amplifier in front of the scope, preferably as close to the point of measurement as possible.
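
A sketch of why a FET buffer helps (assuming a simple JFET source follower loaded by \$R_S\$): the gate draws only leakage current, so the DC input resistance can be in the gigaohm-to-teraohm range, while the voltage gain

\$\$ A_v \approx \frac{g_m R_S}{1 + g_m R_S} \$\$

stays close to unity, so its output can comfortably drive the cable and the scope's comparatively low input impedance.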