Electronic – high output impedance sensor with voltage output


I have to design an input stage for signal conditioning of a voltage signal coming from a high-output-impedance sensor.
The sensor's output impedance is 100 kΩ, and the amplitude of the voltage signal is 0-5 mV. The frequency of the signal is between 50 and 150 kHz.
I'd like to minimize the number of op-amps used in the circuit.
In terms of "circuit blocks", what architecture do you suggest?

I was thinking of using a voltage buffer for the input stage, in order to have a low impedance at the buffer output, and then amplifying the signal with a non-inverting op-amp stage.

Is it a good idea?

If I had to use an op-amp with a small gain-bandwidth product, is there a way to externally compensate the op-amp's dominant pole and increase the GBW product?

Thanks for your help!

Best Answer

There is no reason to use an op-amp just to buffer the input - use a non-inverting configuration with gain, for example with a JFET-input op-amp.
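As a quick sanity check on the single-stage approach, the ideal non-inverting gain is 1 + Rf/Rg. The resistor values below are hypothetical, chosen only to illustrate amplifying the 5 mV full-scale signal to a convenient level:

```python
def noninverting_gain(rf_ohms, rg_ohms):
    """Ideal closed-loop gain of a non-inverting op-amp stage: 1 + Rf/Rg."""
    return 1.0 + rf_ohms / rg_ohms

# Hypothetical values: Rf = 99 kOhm, Rg = 1 kOhm gives a gain of 100,
# turning the 5 mV full-scale input into 0.5 V in a single stage.
print(noninverting_gain(99e3, 1e3))  # 100.0
```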

If you want to use low GBW op-amps, you can just use more stages with less gain each.
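To see why splitting the gain helps, note that for a single-pole (dominant-pole-compensated) op-amp the closed-loop bandwidth is roughly GBW divided by the closed-loop gain. The GBW and gain figures below are illustrative assumptions, not values from the question:

```python
def stage_bandwidth(gbw_hz, gain):
    """Approximate -3 dB closed-loop bandwidth of one op-amp stage,
    using the single-pole model: BW = GBW / gain."""
    return gbw_hz / gain

# Assume a total gain of 1000 is wanted, built from op-amps with a
# modest 10 MHz gain-bandwidth product.
gbw = 10e6

# One stage of gain 1000: only 10 kHz of bandwidth -- far below the
# 150 kHz upper edge of the signal band.
print(stage_bandwidth(gbw, 1000))  # 10000.0

# Three cascaded stages of gain 10 each (10*10*10 = 1000): each stage
# has 1 MHz of bandwidth, comfortably above 150 kHz.
print(stage_bandwidth(gbw, 10))  # 1000000.0
```

(The cascade's overall -3 dB point is somewhat below the single-stage figure because the stage roll-offs multiply, but with this much margin it stays well above the signal band.)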

The big problem may be that an input capacitance of only 10 pF has an impedance of about 100 kΩ at 150 kHz, assuming a sine wave (worse if there are harmonics) - comparable to the source impedance, so it will significantly attenuate the signal. There are techniques to get the effective capacitance down to well under a picofarad using discrete parts and bootstrapping.
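The capacitance figure above is easy to verify from |Z| = 1/(2πfC), a sketch of the arithmetic:

```python
import math

def cap_impedance(c_farads, f_hz):
    """Magnitude of a capacitor's impedance: |Z| = 1 / (2*pi*f*C)."""
    return 1.0 / (2.0 * math.pi * f_hz * c_farads)

# 10 pF at 150 kHz: about 106 kOhm, i.e. on the same order as the
# 100 kOhm sensor output impedance, so the signal divides roughly in half.
z = cap_impedance(10e-12, 150e3)
print(round(z))  # 106103
```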

Minimizing the amplifier cost may not be ideal, since cheap fast-ish op-amps tend to want dual supplies. Looking at the problem at a system level may yield a more optimal solution, but for a cheap option consider jellybean JFET-input amplifiers.