Electronic – Resolution gain in Delta-Sigma Converters


Suppose I have a two-bit quantizer with the following voltage levels: [0 0.2 0.4 0.6]; thus 0 V corresponds to the bits 00 and 0.6 V to the bits 11. Suppose I use this quantizer in a first-order delta-sigma loop with a simple integrator, y[n] = x[n] + y[n-1]. If I apply a DC input, say 0.25 V, the quantizer will hit the codes for 0.2 V and 0.4 V with a certain frequency, so after averaging this delta-sigma ADC achieves better resolution than a plain averaging ADC.
In a normal averaging ADC, the quantizer would only ever hit the 0.2 V code, and averaging those samples would have no effect on the resolution of the ADC.
Can we quantify how much resolution improvement the SD ADC provides in this case? I am a little new to ADCs, so I might have missed certain information, but all I really want to know is how much better the SD ADC performs than a normal averaging ADC, in terms of resolution, for a DC input signal.
EDIT: Can someone explain how we would calculate the resolution of the SD ADC in this case, in the time domain?
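For concreteness, the loop described above can be sketched in the time domain. This is a minimal simulation, assuming the standard first-order arrangement (the integrator accumulates the input minus the fed-back quantizer output) and a nearest-level quantizer; the variable names are illustrative, not from any particular reference:

```python
# First-order delta-sigma modulator with a 2-bit quantizer, DC input 0.25 V.
levels = [0.0, 0.2, 0.4, 0.6]  # the four quantizer output levels

def quantize(v):
    # Nearest-level (mid-tread style) 2-bit quantizer
    return min(levels, key=lambda lv: abs(lv - v))

x = 0.25       # DC input voltage
acc = 0.0      # integrator state
y_prev = 0.0   # fed-back quantizer output
outputs = []
for _ in range(10000):
    acc += x - y_prev        # "delta" (difference), then "sigma" (integrate)
    y_prev = quantize(acc)   # coarse 2-bit decision
    outputs.append(y_prev)

# Averaging (the crude digital filter) recovers the input to well below
# the 0.2 V step size: the feedback forces the mean of y toward x.
avg = sum(outputs) / len(outputs)
print(avg)  # close to 0.25, even though no single code equals 0.25
```

The feedback is what makes this work: any running excess or deficit in the output is stored in the integrator and corrected by later decisions, so the time average converges to the input. With a plain quantizer and no feedback, every sample would be 0.2 V and the average would stay 0.2 V.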

Best Answer

*(Figure from Analog Devices app note MT-022: quantization-noise spectra for (A) Nyquist-rate operation, (B) oversampling with digital filtering and decimation, and (C) delta-sigma noise shaping.)*

This picture, taken from the Analog Devices app note MT-022, shows why delta-sigma conversion is better than just averaging. The delta-sigma modulator shapes the quantization noise, pushing most of it out of the band of interest. With less in-band quantization noise, the digital filter and decimator that follow can resolve the signal more finely, as shown in part (C), whereas plain oversampling with filtering and decimation (B) merely spreads the same noise power over a wider bandwidth. So it all comes down to the noise shaping provided by the delta-sigma modulation.
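The resolution gain can be put in numbers using the standard in-band noise results (MT-022 quotes the first-order result as SNR = 6.02N + 1.76 − 5.17 + 30·log10(OSR) dB). A short sketch comparing the extra effective bits from plain oversampled averaging versus a first-order delta-sigma loop, as a function of oversampling ratio; the helper names here are mine, and the −5.17 dB term is the fixed penalty from the shaped noise power:

```python
import math

def extra_bits_plain(osr):
    # Plain oversampling + averaging: 3.01 dB (0.5 bit) per octave of OSR
    return 0.5 * math.log2(osr)

def extra_bits_first_order_ds(osr):
    # First-order delta-sigma: 9.03 dB (1.5 bits) per octave of OSR,
    # less a fixed ~5.17 dB (~0.86 bit) from the total shaped noise power
    return 1.5 * math.log2(osr) - 5.17 / 6.02

for osr in (4, 16, 64, 256):
    print(osr,
          round(extra_bits_plain(osr), 2),
          round(extra_bits_first_order_ds(osr), 2))
```

Note the caveat for the DC input in the question: these white-noise formulas assume the quantization error is decorrelated from the input, which a pure DC input violates (the error becomes a periodic idle pattern), so for DC they are a guideline rather than an exact answer; dithering or a busy input restores the assumption.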