Electrical – How to connect a 0-10V analog input to MCU with high input impedance

adc · analog · operational-amplifier

I'm trying to connect a 0-10V analog input to my MCU, which runs on 3.3V. I used the circuit below, but when the input is floating I read about 0.3V on the analog input. I need to eliminate the effect of the bias current on the analog input.

(Schematic created using CircuitLab; image not shown here.)

Can anyone give me a hint on the best practice for connecting a 0-10V signal to an MCU?

Best Answer

The LM358 has an input impedance of roughly 10 MΩ and an output impedance of roughly 1 kΩ (at 1 V / 1 mA), rising to about 2 kΩ (at 100 mV / 50 µA), so its output does not reach the negative rail (0 V). That is due to saturation of its BJT output stage. It may still be close enough if your load draws less than 50 µA or has an impedance much greater than 2 kΩ.
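As a rough sanity check using the figures above (assumed ballpark values, not datasheet-exact), the residual output voltage near the bottom of the range is just the load current times the effective output impedance:

$$ V_{out,\min} \approx I_{load} \cdot Z_{out} \approx 50\,\mu\text{A} \times 2\,\text{k}\Omega = 100\,\text{mV} $$

So even a modest load current prevents the output from settling at a true 0 V.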

Consider matching the DC resistance seen at the V+ and V- inputs, with roughly 1 MΩ on each, so that the input bias currents produce equal voltage drops and the bias-current-induced offset (Vio) is nulled.
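To see why matching helps, assume an LM358-class input bias current of about 45 nA and an offset current of about 5 nA (typical datasheet figures; treat them as assumptions here). The bias-induced error is

$$ V_{err} = I_{b+} R_{+} - I_{b-} R_{-} $$

Unmatched, a single 1 MΩ source resistance gives roughly 45 nA × 1 MΩ = 45 mV of error. With matched R+ = R- = 1 MΩ, the common bias term cancels and only the offset current remains:

$$ V_{err} \approx I_{os} \cdot R \approx 5\,\text{nA} \times 1\,\text{M}\Omega = 5\,\text{mV} $$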

Consider the output current drawn through the voltage-divider resistors when Vout is below 1 V. Raising the divider resistances reduces Iout. The MCU's internal clamp diodes will then limit the input voltage without any need for a Zener, provided the clamp current is kept well below 1 mA.
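As an illustrative (not prescribed) choice, a 680 kΩ / 330 kΩ divider scales 10 V down to roughly the 3.3 V ADC range while drawing only microamps:

$$ V_{ADC} = 10\,\text{V} \times \frac{330\,\text{k}}{680\,\text{k} + 330\,\text{k}} \approx 3.27\,\text{V}, \qquad I_{div} \approx \frac{10\,\text{V}}{1.01\,\text{M}\Omega} \approx 9.9\,\mu\text{A} $$

Even if a fault drives the node above the supply and the MCU's clamp diode (assume about 0.6 V forward drop) conducts, the series resistance keeps the clamp current far below 1 mA:

$$ I_{clamp} \approx \frac{10\,\text{V} - 3.3\,\text{V} - 0.6\,\text{V}}{680\,\text{k}\Omega} \approx 9\,\mu\text{A} \ll 1\,\text{mA} $$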

This ought to reduce both the open-circuit offset (Vio) and the residual Vout.

A better solution might be to use an RRIO (rail-to-rail input/output) CMOS op-amp with 10 MΩ on each input.
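The reason 10 MΩ resistances are tolerable here: CMOS input bias currents are typically on the order of picoamps (an assumed ballpark, not any specific part's spec), so the bias-induced error becomes negligible:

$$ V_{err} \approx I_{b} \cdot R \approx 1\,\text{pA} \times 10\,\text{M}\Omega = 10\,\mu\text{V} $$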
