Electronic – Is a Delta-Sigma ADC more accurate than a direct conversion ADC?

adc

I'm just trying to understand if my thinking is correct here. A "regular" ADC in my mind has a number of quantization levels. Say 16 bits gives me \$2^{16} = 65536\$ levels; if I divide my input voltage range by that, I get a step size in millivolts. So for a 0–1 V input range I get about 0.015 mV per step.

If my signal only reached 0.5 V, I'd be throwing away half of the levels I could be using and losing accuracy I could have had.
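A quick sanity check of the arithmetic above (using the 16-bit, 0–1 V numbers from the question):

```python
import math

# 16-bit quantizer over a 0-1 V input range.
bits = 16
full_scale = 1.0
levels = 2 ** bits          # 65536 quantization levels
lsb = full_scale / levels   # volts per step

print(f"LSB = {lsb * 1e6:.2f} uV")  # about 15.26 uV, i.e. ~0.015 mV

# A signal that only reaches 0.5 V exercises half the codes,
# which is equivalent to giving up one bit of resolution.
used_codes = 0.5 / lsb
print(f"codes used: {used_codes:.0f} (~{math.log2(used_codes):.0f} bits)")
```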

Does this same thinking hold true for a 16-bit Delta-Sigma? Can I still think of it in terms of quantization levels of 0.015 mV, even though the signal is being converted into a train of pulses and then decimated?

Would they both have the same level of accuracy? Would the Delta Sigma be more immune to noise and so have a higher accuracy?

I've never used one before so I was reading up on them today.

Best Answer

The number of bits is usually a good indicator of an ADC's performance. To quantify the true performance, measures like ENOB (effective number of bits) are better.

A delta-sigma ADC with an ENOB of 16 bits is as good as any other converter with that performance. However, delta-sigma converters consist of a noise shaper and a decimation filter. The resolution is determined by the order of the noise shaper and the decimation factor. For some converters it is possible to change the decimation factor and trade off speed for resolution.
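The noise-shaper-plus-decimation idea, and the speed/resolution trade-off, can be illustrated with a toy first-order modulator. This is a hedged sketch of the principle, not any particular converter's architecture:

```python
def sigma_delta_1st_order(samples):
    """Toy first-order delta-sigma modulator: input in [-1, 1],
    output a stream of +/-1 pulses whose average tracks the input."""
    integrator = 0.0
    pulses = []
    for x in samples:
        out = 1.0 if integrator >= 0.0 else -1.0
        pulses.append(out)
        integrator += x - out   # feedback: accumulate the quantization error
    return pulses

def decimate(pulses, factor):
    """Crude decimation filter: average blocks of `factor` pulses.
    A larger factor means fewer output samples but finer resolution."""
    return [sum(pulses[i:i + factor]) / factor
            for i in range(0, len(pulses) - factor + 1, factor)]

# A DC input of 0.3 becomes a pulse train whose pulse density encodes 0.3.
pulses = sigma_delta_1st_order([0.3] * 4096)
coarse = decimate(pulses, 16)    # fast output, coarse values
fine = decimate(pulses, 1024)    # slow output, much closer to 0.3
```

Because the integrator stays bounded, the block average converges on the input: with a decimation factor of 1024 the result lands within a fraction of a percent of 0.3, while a factor of 16 is much coarser. That is the speed-for-resolution trade the answer mentions.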

Oversampling and averaging can also be used with Nyquist-rate converters, but the improvement is usually smaller, since no noise shaping is done: plain averaging gains only about half a bit of resolution per doubling of the sample rate.
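A rough numerical illustration of oversampling and averaging on a Nyquist-rate converter. The parameters here are assumptions chosen for the demo: an 8-bit quantizer over 0–1 V with about 1 LSB of input noise acting as dither:

```python
import random

random.seed(42)
bits = 8
lsb = 1.0 / (1 << bits)      # 8-bit quantizer over a 0-1 V range
true_value = 0.50031         # arbitrary DC level sitting between two codes

def sample():
    """One noisy, quantized reading; the ~1 LSB RMS noise acts as dither."""
    noisy = true_value + random.gauss(0.0, lsb)
    return round(noisy / lsb) * lsb

single = sample()                                   # one raw reading
averaged = sum(sample() for _ in range(64)) / 64    # 64x oversampled

# Averaging 64 readings recovers sub-LSB detail (~0.5 bit per doubling,
# so roughly 3 extra bits here) - a modest gain compared to the noise
# shaping in a delta-sigma converter.
```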

A direct conversion (flash) ADC requires \$2^n - 1\$ comparators, so their resolution is usually lower than that of delta-sigma converters. A larger number of comparators also means each comparator must be more precise (since the steps are smaller), so high-resolution flash converters are much more costly.
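To see why flash resolution stays low in practice, it helps to count the comparators:

```python
# Comparator count for an n-bit flash (direct conversion) ADC: 2^n - 1.
for n in (4, 8, 10, 16):
    print(f"{n:2d}-bit flash ADC needs {2 ** n - 1:,} comparators")
```

An 8-bit flash needs 255 comparators; a 16-bit flash would need 65,535, each of which would have to resolve ~15 µV steps, which is why flash converters rarely go beyond about 8–10 bits.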