Electronics – ADC input impedance on MCUs

What is the input impedance of a typical MCU ADC? In this case I'm working with a PIC24FJ64GA004. I don't need high speed sampling – a maximum of 100 samples per second.

I wish to connect a resistive divider built from a 100k and a 10k resistor, so the ADC's input impedance should be higher than 1M or it will start to skew the readings.

Input Leakage Current

To determine the voltage offset that the pin causes across your resistors, use the leakage current from the datasheet. Microchip specifies an "Input Leakage Current" for each pin; the [datasheet that I have looked up][1] gives 1 µA. Through the 100k resistor that is up to 0.1 V (100 mV) of error, which is only double what Robert calculated and probably not a problem for your signal.

Now remember, if you are dividing a 30 V signal down to 30/11 ≈ 2.7 V full scale, the 100 mV error rides on top of that, causing up to roughly 4% error referred to your 30 V signal.

If you need a resolution of 1 V at the input, that becomes 1/11 ≈ 91 mV at the ADC pin after the divider, so the 100 mV worst-case offset is larger than your smallest measurable step.
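The worst-case numbers above can be checked with a few lines of arithmetic. This is a sketch using the same pessimistic model as the answer (the full 1 µA leakage flowing through the 100k top resistor):

```python
# Worst-case leakage error for a 100k/10k divider into the ADC pin.
# Pessimistic model: all leakage current flows through the 100k resistor.

R_TOP = 100e3      # divider top resistor, ohms
R_BOT = 10e3       # divider bottom resistor, ohms
I_LEAK = 1e-6      # worst-case input leakage from the datasheet, amps
V_IN_FS = 30.0     # full-scale input voltage

divider_ratio = R_BOT / (R_TOP + R_BOT)    # 1/11
v_adc_fs = V_IN_FS * divider_ratio         # ~2.73 V at the pin at full scale
v_offset = I_LEAK * R_TOP                  # 0.1 V worst-case offset
error_pct = 100 * v_offset / v_adc_fs      # error relative to full scale

v_step_1v = 1.0 * divider_ratio            # what 1 V of input looks like at the pin

print(f"ADC full scale: {v_adc_fs:.2f} V")
print(f"Leakage offset: {v_offset * 1000:.0f} mV ({error_pct:.1f}% of full scale)")
print(f"1 V input step at ADC: {v_step_1v * 1000:.0f} mV")  # ~91 mV, less than the 100 mV offset
```

Running it confirms that the worst-case offset exceeds the 91 mV that a 1 V input step produces at the pin.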

Input Capacitance

Robert is correct that there will be an input capacitance, but what it really determines is the acquisition time needed before the ADC can take its measurement. Combined with the source resistance you chose, it also forms a low-pass filter: if you wanted to measure higher-frequency signals, you would not be able to capture them.

Reducing the error

The easiest fixes are to reduce the resistance of your divider or to buffer the signal. With a buffer, the PIC's leakage current is replaced by your op-amp's input bias current, which can be made quite low.
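To see how much either fix buys you, here is the same pessimistic offset model applied to all three cases; the 50 pA op-amp bias current is an assumed ballpark for a CMOS-input op amp, not any specific part's spec:

```python
# Worst-case offset at the ADC pin for: the original divider, a 10x
# lower-resistance divider, and a buffered divider. The 50 pA op-amp
# bias current is an ASSUMED typical CMOS-input value, not a datasheet spec.

def offset_mv(i_amps, r_top_ohms):
    """Worst-case offset (mV) using the pessimistic V = I * R_top model."""
    return i_amps * r_top_ohms * 1000

print(offset_mv(1e-6, 100e3))    # original 100k divider, 1 uA leakage: 100 mV
print(offset_mv(1e-6, 10e3))     # 10k/1k divider, same leakage: 10 mV
print(offset_mv(50e-12, 100e3))  # buffered, 50 pA bias current: 0.005 mV
```

Note that the lower-resistance divider trades error for more current drawn from the 30 V source, while the buffer costs a part but removes the error almost entirely.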

Remember that 1 µA is a worst-case figure. Unless minor changes to the design would be expensive to make later, you could fab the design as-is and test how bad the error actually is for you.

Please let me know if there is anything I can do to make this easier to read.