Voltage reference sampled by a 12-bit ADC gives a value that is off by more points than other measurements

Tags: adc, measurement, reference-voltage

I am using the TI REF3020 voltage reference (2.048 V) together with a microcontroller to provide a reference for voltage measurements.

The microcontroller is powered from 5 V (measured 5.00 V), and the MCU's ADC reference is set to 5.00 V as well. Because I don't want to rely on the VDD rail being stable enough to serve as a reference, I wanted to add a precision voltage reference.

In my test setup, I can easily measure the VDD (ADC reference) voltage and calculate the voltage to ground at an ADC input pin. For testing purposes, I used an external power supply as the input to an ADC pin. After measuring the voltage at the pin with an oscilloscope and a DMM, the MCU, without any further calibration, reported the correct voltage down to the LSB of the 12-bit ADC. This experiment was carried out successfully with multiple voltages between 0 V and 5 V.

I then used the aforementioned REF3020 voltage reference in the same setup as an input to the ADC. Both the oscilloscope and the DMM confirmed precisely 2.048 V. Measured with the MCU's ADC, however, the value was off by 14 points.
Instead of a raw code of 1677, the MCU measured 1663:$$U=\frac{1663}{4095}\cdot 5\,\mathrm{V}\approx 2.0305\,\mathrm{V}$$

The value measured by the MCU neither oscillates nor drifts, either over time or over the (admittedly rather narrow) temperature window I tested. The averaged value stays constant at 1663; without averaging, individual readings range from about 1660 to 1666. The influence of temperature should be negligible anyway, because the other experiment was conducted at the same room temperature.

For this application I average 128 samples over a window of about 4 ms. I also tried lengthening the sample and conversion times, which did not help either.
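For illustration, the averaging I use looks roughly like the sketch below. The simulated sample source stands in for the real ADC driver call and cycles through codes in the 1660–1666 range I observed; everything here is my own naming, not code from the actual firmware.

```c
#include <stdint.h>

/* Simulated raw ADC readings, mimicking the observed spread of 1660-1666.
 * On real hardware this table and adc_read_sim() would be replaced by the
 * MCU's ADC read call. */
static const uint16_t sample_table[8] = {1663, 1660, 1666, 1662,
                                         1664, 1663, 1661, 1665};

static uint16_t adc_read_sim(void)
{
    static unsigned i = 0;
    return sample_table[i++ % 8];
}

/* Average 128 raw 12-bit samples; 128 * 4095 fits easily in 32 bits. */
uint16_t adc_read_averaged(void)
{
    uint32_t sum = 0;
    for (int i = 0; i < 128; i++)
        sum += adc_read_sim();
    return (uint16_t)(sum / 128);
}
```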

The voltage reference is even a high-current part: according to the datasheet it can safely supply up to 25 mA, which should be far more than a single ADC channel draws. There are no other loads connected to the voltage reference.

I could easily calibrate out the offset error, but doing so would essentially defeat the whole purpose of the voltage reference in this application.

What else could be the cause for this deviation that manifests only when measuring the voltage reference's output voltage?

Best Answer

Try providing a little more settling time (a few milliseconds) before taking each measurement. By this I mean the time between when the ADC channel is selected and when the conversion starts. I use the REF3020 myself, and it responds poorly to changes in load, even though it can supply high current into a fixed load. You won't see this with a DMM or even a scope, because it's hard to set the trigger finely enough to catch it.
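The suggested order of operations can be sketched as follows. The HAL-style functions (`select_channel`, `delay_ms`, `start_conversion`) are placeholders for whatever your MCU's SDK provides; here they just record the call order so the sequence can be verified:

```c
#include <string.h>

/* Records the order in which the placeholder HAL calls are made. */
static char call_log[64];

static void select_channel(int ch)  { (void)ch; strcat(call_log, "select;"); }
static void delay_ms(int ms)        { (void)ms; strcat(call_log, "delay;"); }
static void start_conversion(void)  { strcat(call_log, "convert;"); }

/* Select the channel, then wait a few milliseconds for the REF3020's
 * output to settle after the load step, and only then start converting. */
void measure_reference(void)
{
    select_channel(3);   /* channel number is illustrative */
    delay_ms(3);         /* settling time before the conversion */
    start_conversion();
}
```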