How can an ADC measure below its resolution

Tags: adc, converter, resolution

An ADC's resolution is set by its full-scale input voltage divided by 2^b, where b is the number of bits. A 16-bit ADC with a 20 V full-scale input can therefore resolve 20 V / 65536 ≈ 0.3 mV (corresponding to the ADC code 0000000000000001). However, I know an ADC can be used to measure voltages down to its own quantisation noise, which is typically more like 5 µV.
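As a quick sanity check of the numbers above (the 20 V full scale and 16 bits are the figures from the question), the size of one LSB can be computed directly:

```python
# Ideal ADC step size (LSB): full-scale voltage divided by 2^b.
full_scale = 20.0      # volts, example from the question
bits = 16
lsb = full_scale / 2**bits
print(f"LSB = {lsb * 1e3:.3f} mV")  # about 0.305 mV
```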

I guess you could measure a very noisy signal whose mean value is lower than 0.3 mV, as long as you averaged many measurements. Is this true? Does the noise have to be large enough to tip the signal above 0.3 mV (in this case) so that the ADC registers a non-zero code? Does this relate to dithering?

Best Answer

A simple example is a voltage comparator that outputs 0 if its input is below 0.5 V and 1 if it is at or above 0.5 V. You can view it as a 1-bit ADC covering the range 0–1 V.

Now consider a perfect input signal of exactly 0.5 V. By our definition it gives 1 at the output. Next, add low-amplitude, zero-mean white noise to the signal. The output now "jumps" between 1 and 0 with 50% probability at each instant. If we sample this output over time and compute the average, we eventually get the value 0.5, which is beyond the resolution we had with a single sample.
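This can be simulated in a few lines. The sketch below (noise amplitude and sample count are arbitrary choices, and the noise is uniform rather than Gaussian for simplicity) models the 1-bit comparator and shows the average of many noisy samples landing near 0.5:

```python
import random

def comparator(v):
    """1-bit ADC: output 0 below 0.5 V, 1 at or above 0.5 V."""
    return 1 if v >= 0.5 else 0

random.seed(0)
signal = 0.5  # input sits exactly at the threshold

# Add zero-mean noise to each sample, then quantise with the comparator.
samples = [comparator(signal + random.uniform(-0.1, 0.1))
           for _ in range(100_000)]

# Averaging the 1-bit outputs recovers a value near 0.5,
# which a single 1-bit sample could never represent.
estimate = sum(samples) / len(samples)
print(round(estimate, 2))
```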

...And dithering is simply a method of deliberately introducing artificial noise when the naturally occurring noise is insufficient.
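The same idea applied to a multi-bit ADC: the sketch below (an idealised round-to-nearest quantiser with an assumed 0.3 mV LSB, matching the question's numbers) measures a 0.1 mV input, which is below one LSB. Without dither every sample quantises to zero, so averaging cannot help; with one LSB of zero-mean dither, the average of many samples converges toward the true sub-LSB value:

```python
import random

LSB = 0.3e-3  # assumed step size in volts, from the question's example

def adc(v):
    """Idealised quantiser: round the input to the nearest ADC level."""
    return round(v / LSB) * LSB

random.seed(1)
v_in = 0.1e-3  # 0.1 mV input, below one LSB
N = 100_000

# Without dither: the quantiser returns 0 for every sample.
no_dither = sum(adc(v_in) for _ in range(N)) / N

# With dither: add zero-mean noise one LSB wide before quantising.
with_dither = sum(adc(v_in + random.uniform(-LSB / 2, LSB / 2))
                  for _ in range(N)) / N

print(f"without dither: {no_dither * 1e3:.3f} mV")
print(f"with dither:    {with_dither * 1e3:.3f} mV")  # near 0.1 mV
```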