Voltage drop when measuring a high-pass filter output

bias, filter, operational-amplifier, oscilloscope

So I'm implementing an analog control scheme, and one important component of it is a first-order high-pass filter.

The source signal is the output of a shunt monitor (INA138, represented by the current source in the schematic below).

In order for the output of the filter to be in the range of 0–12 V, I've used a 5.6 V Zener as bias.
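(For context, since the R and C values aren't given here: the corner frequency of a first-order high-pass filter is set by its RC product,

$$f_c = \frac{1}{2\pi R C}$$

so the bias only shifts the DC level and doesn't affect the cutoff.)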

[schematic – created using CircuitLab]

As I was measuring with an oscilloscope, with no load on my shunt (so Vsource = 0 V), I noticed some very strange behaviour: while the Zener bias and the op-amp output were at 5.6 V RMS, the output of the filter was at 4.6 V!

How is that possible? Could it be that the probe is drawing current and causing the drop? That would imply the probe has a resistance of about 575 kΩ, but normally it should be in the megohm range. Am I missing something else?

Best Answer

If your oscilloscope was set to its 1 MΩ input (1:1 probe), that input impedance forms a 1 MΩ load against the 200 kΩ source resistance, i.e. a voltage divider.
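As a quick sanity check (taking that 200 kΩ as the source resistance), the divider predicts almost exactly what you measured:

$$V_{meas} = 5.6\,\text{V} \cdot \frac{1\,\text{M}\Omega}{1\,\text{M}\Omega + 200\,\text{k}\Omega} \approx 4.67\,\text{V}$$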

If you switch your scope probe to 1:10, its input impedance rises to 10 MΩ!
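With the same assumed 200 kΩ source, a 10 MΩ load barely disturbs the node:

$$V_{meas} = 5.6\,\text{V} \cdot \frac{10\,\text{M}\Omega}{10\,\text{M}\Omega + 200\,\text{k}\Omega} \approx 5.49\,\text{V}$$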

Also, the bandwidth of the probe will improve noticeably when using the 1:10 position.

Corollary: Always use the probe at 1:10, unless the signal is really too weak.
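A minimal sketch of the loading calculation, assuming the 200 kΩ source resistance from above and typical probe input impedances of 1 MΩ (1:1) and 10 MΩ (1:10); the exact values for your probe and schematic may differ:

```python
# Sketch: estimate how much a resistive scope load pulls down a DC level.
# Assumed values: 200 kOhm source resistance and 5.6 V bias (from the question);
# 1 MOhm / 10 MOhm are typical probe input impedances, not measured figures.

def loaded_voltage(v_source: float, r_source: float, r_probe: float) -> float:
    """Voltage the scope actually sees: a simple resistive divider."""
    return v_source * r_probe / (r_source + r_probe)

V_BIAS = 5.6        # volts, Zener bias
R_SOURCE = 200e3    # ohms, filter/source resistance

for label, r_probe in [("1:1  (1 MOhm)", 1e6), ("1:10 (10 MOhm)", 10e6)]:
    v = loaded_voltage(V_BIAS, R_SOURCE, r_probe)
    print(f"{label}: {v:.2f} V  (error {V_BIAS - v:.2f} V)")
```

Running it gives about 4.67 V for the 1:1 setting (matching the observed 4.6 V) and about 5.49 V for 1:10, which is why the 1:10 position is the safe default.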