Electronic – Would reducing the reference voltage of an ADC have any effect on accuracy

Tags: accuracy, adc, resolution

Regarding the following information taken from a paper:

[Image: excerpt from the paper]

The paragraph describes what happens when the ADC's Vref is reduced.

This is the statement in question from the quote above:

Note that if you reduce the reference voltage to 0.8V, the LSB would
then represent 100mV, allowing you to measure a smaller range of
voltages (0 to 0.8V) with greater accuracy.
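The arithmetic behind the quoted statement can be checked with a one-line formula. This is a minimal sketch assuming a 3-bit converter (8 codes), which is consistent with the "8 readings" mentioned in the answer below; the helper name `lsb_volts` is illustrative, not from the paper.

```python
def lsb_volts(vref, bits):
    """Voltage represented by one LSB of an ideal ADC: Vref / 2**bits."""
    return vref / (2 ** bits)

# With a 3-bit ADC, dropping Vref from 5 V to 0.8 V shrinks the LSB
# from 625 mV to the 100 mV quoted in the paper.
print(lsb_volts(5.0, 3))  # 0.625
print(lsb_volts(0.8, 3))  # 0.1
```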

Isn't it wrong to say "with greater accuracy"? Shouldn't it be "with greater precision" instead?

(I'm asking because if I don't clarify this point, I'll misunderstand everything that follows.)

Best Answer

I think precision means more digits, e.g. 1.23 V vs 1.2300 V; the latter has more precision. However, that says nothing about the true value of the voltage: it is possible that my inaccurate meter reads 1.2300 V while the actual voltage is 1.220000 V.

More accuracy means that the value I get is closer to the real value. So my accurate meter would read 1.221 V while the actual value is 1.220000 V.

So 1.221 V has greater accuracy (but less precision)

while

1.2300 V has greater precision (but less accuracy).

In the ADC example the number of possible readings stays the same: 8. So precision remains unaffected whatever the reference voltage is. Accuracy does increase, though, as the reference voltage is decreased, because the LSB intervals become smaller as Vref decreases. That means the error between the actually measured voltage and the value the ADC outputs (the quantization error) becomes smaller.
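The shrinking quantization error can be demonstrated with a small model of an ideal truncating ADC. This is a sketch under the same 3-bit assumption as above; `adc_code` and `quant_error` are illustrative helpers, not from the original answer.

```python
def adc_code(vin, vref, bits=3):
    """Ideal truncating ADC: returns the largest code not above Vin."""
    code = int(vin / vref * (2 ** bits))
    return min(code, 2 ** bits - 1)  # clamp at full scale

def quant_error(vin, vref, bits=3):
    """Difference between Vin and the voltage its code represents."""
    lsb = vref / (2 ** bits)
    return vin - adc_code(vin, vref, bits) * lsb

# Pick an input inside both ranges (0 to 0.8 V).  With Vref = 5 V the
# error can be as large as one 625 mV LSB; with Vref = 0.8 V it is
# bounded by one 100 mV LSB.
vin = 0.33
print(quant_error(vin, 5.0))
print(quant_error(vin, 0.8))
```

The same input voltage is digitized far more faithfully with the smaller reference, which is exactly the "greater accuracy" the paper is claiming.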

Also: instead of "precision", engineers more often use "resolution".