How does the input impedance of the instrument affect the measurement?

impedance

I have an instrument (a lock-in amplifier, a spectrum analyzer, etc.) with some input impedance, say 1 MOhm. I use it to measure a sample with a much higher resistance, say 100 MOhm.

What is the effect of having an instrument input impedance much smaller than the resistance of the sample?

Best Answer

Those high impedances suggest that you are interested in voltage rather than current. For a voltage measurement, model the source as a Thevenin equivalent: the instrument's input impedance loads down the source, so with 1 MOhm across a 100 MOhm source resistance you see only about 1% of the open-circuit voltage.
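As a minimal sketch of that voltage-divider loading, assuming the 1 MOhm and 100 MOhm values from the question:

```python
# Fraction of a Thevenin source voltage actually seen by the instrument.
R_source = 100e6   # ohms, sample / source resistance (from the question)
R_in = 1e6         # ohms, instrument input impedance (from the question)

divider = R_in / (R_source + R_in)
print(f"Measured fraction of open-circuit voltage: {divider:.4f} (~{divider*100:.1f}%)")
# -> about 0.0099, i.e. the instrument sees roughly 1% of the source voltage.
```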
For a current measurement, model the source as a Norton equivalent: its current divides between the 100 MOhm source resistance and the 1 MOhm instrument, so nearly all of it (about 99%) flows into the instrument.
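The corresponding current-divider sketch, with the same assumed values:

```python
# Current division between the 100 MOhm source resistance and the 1 MOhm
# instrument input for a Norton (current-source) model of the sample.
R_source = 100e6   # ohms, Norton source resistance
R_in = 1e6         # ohms, instrument input impedance

fraction_into_instrument = R_source / (R_source + R_in)
print(f"Fraction of Norton current into the instrument: {fraction_into_instrument:.4f}")
# -> about 0.99, i.e. nearly all of the source current flows into the instrument.
```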
There are other effects. If your 100 MOhm source is connected to the lock-in and/or spectrum analyzer with coax, you must consider cable capacitance when determining effective bandwidth and phase response. A high instrument input impedance turns the cable capacitance into a low-pass filter with a narrow pass band; a lower input impedance gives a smaller voltage signal but extends the pass band. In some older lock-ins, signal harmonics also contribute to the measured signal. Your phase and amplitude response may therefore change with cable length, as the sketch below illustrates.
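To put rough numbers on the bandwidth effect, here is a sketch assuming about 100 pF of cable capacitance (on the order of a metre of typical coax); the capacitance value and the "high-Z" comparison point are assumptions, not figures from the question:

```python
import math

# RC low-pass formed by the cable capacitance and the parallel combination of
# source resistance and instrument input impedance.
C_cable = 100e-12   # farads, assumed cable capacitance
R_source = 100e6    # ohms, source resistance from the question

def corner_frequency(r_in):
    """-3 dB frequency of the low-pass formed by (R_source || r_in) and C_cable."""
    r_eff = (R_source * r_in) / (R_source + r_in)
    return 1.0 / (2.0 * math.pi * r_eff * C_cable)

for r_in in (100e6, 1e6):   # assumed "high-Z" input vs. the 1 MOhm input
    print(f"R_in = {r_in/1e6:.0f} MOhm -> f_3dB ~ {corner_frequency(r_in):.0f} Hz")
# The 1 MOhm input gives a much wider pass band, at the cost of a smaller signal.
```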
Since these instruments are often used to measure signals where noise is significant, loading by the instrument may make the instrument's own noise floor dominate the noise from your source. Lock-in noise is usually spec'd as a density in volts per root-hertz, e.g. 6 nV/√Hz. You can reach very low noise levels by extending the time constant, and pay the penalty of a longer measurement time.
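A sketch of that trade-off, assuming a simple single-pole (6 dB/octave) output filter, for which the equivalent noise bandwidth is 1/(4·tau); the exact factor depends on the filter slope, so treat this purely as an illustration:

```python
import math

# Input-referred noise from a spec'd density and the equivalent noise
# bandwidth (ENBW) of the lock-in output filter.
noise_density = 6e-9   # V/sqrt(Hz), example figure quoted in the answer

for tau in (0.1, 1.0, 10.0):           # output time constant in seconds
    enbw = 1.0 / (4.0 * tau)           # Hz, single-pole filter assumption
    v_noise = noise_density * math.sqrt(enbw)
    print(f"tau = {tau:5.1f} s -> ENBW = {enbw:.3f} Hz, noise ~ {v_noise*1e9:.2f} nVrms")
# Longer time constants shrink the noise bandwidth, but each point takes longer to settle.
```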
Like lock-ins, spectrum analyzers can measure smaller signals by reducing the resolution bandwidth, but the sweep time must then be increased, again extending measurement time.
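A rough sketch of that sweep-time trade-off, using the common rule of thumb that minimum sweep time scales as span/RBW² for analog resolution filters; the proportionality constant and the example span are assumptions:

```python
# Swept-analyzer trade-off: narrower resolution bandwidth lowers the noise
# floor (~10 dB per decade of RBW) but lengthens the sweep roughly as 1/RBW^2.
k = 2.5        # assumed proportionality constant, order 2-3 for analog RBW filters
span = 10e3    # Hz, assumed example span

for rbw in (1000.0, 100.0, 10.0):      # resolution bandwidth in Hz
    sweep_time = k * span / rbw**2
    print(f"RBW = {rbw:6.1f} Hz -> minimum sweep time ~ {sweep_time:.2f} s")
# Narrowing the RBW by 10x lowers the noise floor by ~10 dB but makes the sweep ~100x longer.
```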