Trying to understand prescaling voltage for ADC with least error

accuracy, adc, voltage-measurement

I'm teaching myself EE while designing a 12V DC battery monitor. I'm using a pair of Hall-effect sensors to track charge & load current and looking for the best way to measure the voltage. [An ATmega32u4 will take successive reads with the ADC, compute a 1-second average, and pass that up over USB for logging and analysis.]
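
For context, here's roughly what my read/average/report loop looks like (the pin, sample count, and serial format are placeholders, not a fixed design):

    // Hypothetical averaging loop: pin and sample count are assumptions.
    const uint8_t VBAT_PIN = A0;     // assumed analog input
    const uint16_t SAMPLES = 250;    // assumed samples per 1-second window

    void setup() {
      Serial.begin(115200);          // the 32u4 presents a USB CDC serial port
    }

    void loop() {
      uint32_t sum = 0;
      for (uint16_t i = 0; i < SAMPLES; i++) {
        sum += analogRead(VBAT_PIN); // 10-bit read, 0..1023
        delay(4);                    // ~250 samples per second
      }
      Serial.println((float)sum / SAMPLES, 2);  // 1-second mean, in counts
    }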

The issue of ADC error really has me spooked; it's essential that I produce meaningful data. There are so many error sources, both in my circuit and in the ADC, that it feels like playing whack-a-mole in the dark. I'm reading a lot about calibrating the ADC's reference, but the first step is to pick a method for scaling the input.

My voltage window is 10-15V, specifically a range from 10.5V to 14.4V (a 3.9V span). I came up with two approaches that (hopefully) do what I want:

  1. A 10V Zener diode on the inverting input of a differential op-amp. Subtracting that 10V offset, my range becomes 0.5V to 4.4V (a 3.9V span) and uses nearly the whole range of my ADC.
  2. A 20k : 10k voltage divider. Scaled down 3x, my range becomes 3.5V to 4.8V (a 1.3V span); see the conversion sketch after this list.
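
To make the two scalings concrete, here is how a raw count would map back to battery volts under each approach (a sketch assuming a 5.0V reference and ideal components):

    const float VREF = 5.0;          // assumed ADC reference

    // Method 1: the zener subtracts a fixed 10 V before the ADC.
    float voltsMethod1(uint16_t count) {
      return 10.0 + count * VREF / 1024.0;
    }

    // Method 2: 20k:10k divider, Vadc = Vbat * 10k/(20k + 10k) = Vbat/3.
    float voltsMethod2(uint16_t count) {
      return 3.0 * count * VREF / 1024.0;
    }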

I wanted to prove to myself which method would give the best granularity. Since I can also obtain 12-bit resolution by adding four reads and shifting the result (see the sketch after this list), I compared four possibilities, assuming the ADC ref is 5.0V:

  • method 1, 0.5-4.4V @10-bit is 800 steps: 4.88mV/step [#3]
  • method 1, 0.5-4.4V @12-bit is 3195 steps: 1.22mV/step [#1]
  • method 2, 3.5-4.8V @10-bit is 266 steps: 14.65mV/step [#4]
  • method 2, 3.5-4.8V @12-bit is 1065 steps: 3.66mV/step [#2]
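
For reference, the four-read trick I mean would look something like this (a sketch; per Atmel's AVR121 app note, 4^n samples are needed for n extra effective bits, so four summed reads genuinely buy one new bit, and sixteen reads, summed then shifted right twice, would be needed for two):

    // Oversampled read: sum four 10-bit conversions into a 12-bit-wide
    // value (0..4092).  Effective resolution gain is about one bit per
    // AVR121; the extra width still helps when averaging further.
    uint16_t read12bit(uint8_t pin) {
      uint16_t sum = 0;
      for (uint8_t i = 0; i < 4; i++) {
        sum += analogRead(pin);
      }
      return sum;
    }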

My question is, do my mV/step figures tell the whole story, or is there some kind of downside to #2 that I'm not seeing?

Best Answer

The offset with the zener is of little use. The best zeners have a 1 % tolerance, which is 100 mV for a 10 V zener. On a 10-bit ADC this is a 20 count error; on a 12-bit ADC, an 82 count error. You could trim the error away if you can measure the voltage accurately enough, but there are other factors. The BZX84-A10 has an 8 mV/°C temperature coefficient, giving a 2 count error per °C of temperature change on the 10-bit ADC, and 7 counts on the 12-bit. It looks like it's better suited as a thermometer than as a voltmeter. With a 10 V zener you'll also need a higher voltage power supply.
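
Those count figures are just the error voltage divided by one ADC step; a quick check, assuming the question's 5.0 V reference:

    // Convert a voltage error to ADC counts at a given resolution.
    float errorCounts(float errVolts, uint16_t fullScale) {
      return errVolts * fullScale / 5.0;   // 5.0 V reference assumed
    }
    // errorCounts(0.100, 1024) = 20.5   (1 % of 10 V, 10-bit)
    // errorCounts(0.100, 4096) = 81.9   (12-bit)
    // errorCounts(0.008, 1024) = 1.6    (8 mV/°C tempco, 10-bit, per °C)
    // errorCounts(0.008, 4096) = 6.6    (12-bit, per °C)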

The resistor divider will do much better. Resistors also have a tolerance, but at 25 ¢ a 0.1 % resistor is still affordable. (Better than 0.1 % becomes expensive quickly: a 0.05 % part costs almost a dollar.) At 10-bit resolution that will give you a 1 count error, 4 counts at 12-bit. Temperature coefficient will be less of a problem if both resistors are from the same series and placed close to each other: since the divider is ratiometric, resistance changes will cancel each other out.
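
The ratiometric claim is easy to check numerically: push both resistors to opposite 0.1 % extremes and see how far the ratio moves (a sketch with nominal 20k:10k values, matching the divider in the question):

    // Divider ratio Vadc/Vbat = R2 / (R1 + R2); nominal 10k/30k = 1/3.
    float dividerRatio(float r1, float r2) {
      return r2 / (r1 + r2);
    }
    // dividerRatio(20020.0,  9990.0) = 0.33289  (-0.133 % from nominal)
    // dividerRatio(19980.0, 10010.0) = 0.33378  (+0.133 % from nominal)
    // At 14.4 V in, 0.133 % of the 4.8 V reading is about 6.4 mV:
    // roughly 1.3 counts at 10-bit and 5.2 at 12-bit, the same order
    // as the 1- and 4-count estimates above.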

The numbers indicate that resolution beyond 10 bits is of little use; component tolerances and variations will make the extra bits unreliable. A few extra bits may still help with noise immunity, though, by averaging a series of measurements, or by using a sigma-delta ADC, which averages the input signal anyway.

There's also something more philosophical: we always want better, but why on earth would you want to know a 12 V battery's voltage to a precision better than 10 mV? You'll have a hard time getting the required resolution, and you'll always be uncertain about that last digit.

The ADS1000 is a low-cost 12-bit ADC which will operate from a single 5 V supply.
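
As a rough idea of how simple that part is to talk to (a hedged sketch: the 0x48 address and the two-byte, sign-extended output framing are from memory, so verify them against the datasheet):

    #include <Wire.h>

    const uint8_t ADS1000_ADDR = 0x48;  // address depends on the part variant

    // The ADS1000 converts continuously; reading two bytes returns the
    // 12-bit two's-complement result, sign-extended to 16 bits.
    // Call Wire.begin() once in setup() before using this.
    int16_t readADS1000() {
      Wire.requestFrom(ADS1000_ADDR, (uint8_t)2);
      uint8_t msb = Wire.read();
      uint8_t lsb = Wire.read();
      return (int16_t)((msb << 8) | lsb); // scale per the datasheet
    }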