Electronic – Is a buffer needed to measure the output of a power supply using an ADC

adc

I know from searching that it is good practice to buffer an ADC's input signal with, say, a unity-gain op-amp, but the discussion is always framed around the input being some sort of sensor with low current drive and high output impedance, where the buffer matches the sensor output to the load presented by the ADC.

Is a buffer still needed or recommended if the input to the ADC is a strong signal, such as the output of a system DC power supply run through a voltage divider to bring the voltage down into a range the ADC can handle?

For example, my product has a 24 VDC, 3 A switching supply that powers the entire system, including the microcontroller. I want to monitor the power supply voltage level by feeding it into the micro's ADC. I will use a voltage divider to bring the maximum voltage down to 3 V, but is it still a good idea to feed that divided voltage through an op-amp buffer?
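For reference, a quick sketch of the divider arithmetic for this scenario. The resistor values below are illustrative assumptions (standard E24 values), not a recommendation:

```python
# Sketch of the divider arithmetic; resistor values are illustrative
# assumptions (E24 values), not a recommendation.
V_IN_MAX = 24.0    # maximum supply voltage (V)
V_ADC_MAX = 3.0    # maximum allowed ADC input (V)

R_TOP = 75_000.0   # from the 24 V rail to the ADC node
R_BOT = 10_000.0   # from the ADC node to ground

ratio = R_BOT / (R_TOP + R_BOT)
v_out = V_IN_MAX * ratio
# Thevenin resistance seen by the ADC: the two resistors in parallel.
r_thevenin = R_TOP * R_BOT / (R_TOP + R_BOT)

print(f"ADC node voltage at {V_IN_MAX:.0f} V in: {v_out:.2f} V")     # ~2.82 V
print(f"Thevenin (source) resistance: {r_thevenin / 1e3:.1f} kOhm")  # ~8.8 kOhm
```

Note that even with these fairly modest resistor values the source impedance seen by the ADC is several kilohms, which is the crux of the buffering question.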

Thanks for any and all advice.


Thanks for everyone's feedback. The ADC will be part of either an STM32 (Cortex-M3) or NXP ColdFire microcontroller. Given the voltage divider's source impedance, I will simply play it safe and buffer it.

Best Answer

The buffer converts the high impedance of the source into a low output impedance, so that the input impedance of the ADC itself does not significantly affect the voltage being measured.

Since you are going to divide the voltage down to something the micro can handle, and since you will want large resistors in that divider to limit power loss, you end up with a high-impedance source. You therefore need to buffer the divided voltage before presenting it to the ADC.
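To put a rough number on why the source impedance matters: a SAR ADC samples by charging an internal hold capacitor through the source resistance, and the voltage must settle to within about half an LSB during the sampling window. The sketch below uses assumed, illustrative values for the internal switch resistance and hold capacitance (not figures from any particular datasheet):

```python
import math

# Hedged first-order model of SAR ADC acquisition: the hold capacitor
# charges through the divider's Thevenin resistance plus the ADC's
# internal switch resistance. All component values are assumptions.
R_SOURCE = 75e3 * 10e3 / (75e3 + 10e3)  # Thevenin R of a 75k/10k divider (~8.8 kOhm)
R_ADC = 1e3        # assumed internal sampling-switch resistance
C_SAMPLE = 8e-12   # assumed sample-and-hold capacitance
N_BITS = 12

# Settling to within 1/2 LSB of an n-bit converter takes ln(2^(n+1))
# RC time constants of an exponential charge.
tau_count = math.log(2 ** (N_BITS + 1))
t_settle = tau_count * (R_SOURCE + R_ADC) * C_SAMPLE

print(f"Required sampling time: {t_settle * 1e9:.0f} ns")  # ~708 ns
```

With a unity-gain buffer in front of the ADC, R_SOURCE drops from kilohms to the op-amp's closed-loop output impedance (typically ohms), so the required sampling time shrinks by roughly three orders of magnitude and the divider resistors can be made as large as leakage allows.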