Electronic – What’s the highest precision achieved for an ADC

adc, analog

I was browsing Digikey the other day (don't you?) and I stumbled across some 32-bit ADCs; there were offerings from Linear, TI and Analog. One stood out, the AD7177 from Analog, which states in Table 7 on page 19 of the datasheet that at 5 samples per second it has a staggering 27.5 effective number of bits (and an RMS noise of 50 nanovolts). Its accuracy is of course significantly worse, but still.

This got me wondering, if a relatively cheap off-the-shelf ADC can hit an ENOB of 27.5 bits…

What's the highest ENOB ever achieved? Be it in some super integrated IC, some piece of stupidly expensive lab gear, a lock-in amplifier? Has anyone ever beaten 27.5 bits of precision?

[edit] A bit of clarification: I'm not looking to buy, build or otherwise acquire such a device; I'm just curious what the current state of the art is. Modern atomic clocks have hit 3×10⁻¹⁸ (3 quintillionths) uncertainty; where do modern scientific voltmeters sit on that scale?

Best Answer

Definition from Wiki: -

Effective number of bits (ENOB) is a measure of the dynamic range of an analog-to-digital converter (ADC) and its associated circuitry. The resolution of an ADC is specified by the number of bits used to represent the analog value, in principle giving 2^N signal levels for an N-bit signal
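ENOB is usually derived from a measured SINAD via the standard relation ENOB = (SINAD − 1.76 dB) / 6.02 dB. A minimal Python sketch (the 98 dB figure is purely an illustrative example, not a value from the question):

```python
def enob_from_sinad(sinad_db: float) -> float:
    """Standard relation: ENOB = (SINAD - 1.76 dB) / 6.02 dB."""
    return (sinad_db - 1.76) / 6.02

# A converter measuring 98 dB SINAD delivers about 16 effective bits.
print(enob_from_sinad(98.0))
```

The 1.76 dB and 6.02 dB constants come from the quantization-noise SNR of an ideal N-bit converter with a full-scale sine input, SNR = 6.02·N + 1.76 dB.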

Quote from Atmel: -

In most cases 10-bit resolution is sufficient, but in some cases higher accuracy is desired. Special signal processing techniques can be used to improve the resolution of the measurement. By using a method called 'Oversampling and Decimation' higher resolution might be achieved, without using an external ADC.


Oversampling - take 4 consecutive samples and combine (average) them to get one more bit of resolution; take a fairly standard 18-bit ADC and oversample by 256 to get a 22-bit ADC. Oversample by another factor of 256 to get a 26-bit ADC...

Do you see where this is going?

If noise is present and causes dithering of the signal, you can give any ADC one extra bit by averaging/decimating 4 samples. So average as many as you like to get higher resolution, but clearly the price to pay is proportionally lower bandwidth, and absolute accuracy doesn't improve.
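The dither-and-average trick is easy to simulate. A toy Python sketch (the 10-bit converter, input voltage and noise level are all made-up numbers): quantize a DC input with roughly one LSB of Gaussian dither, then average 256 conversions to recover about four extra bits.

```python
import random

def quantize(x, bits, vref=1.0):
    """Ideal ADC: map x in [0, vref) onto the nearest of 2**bits levels."""
    levels = 2 ** bits
    code = min(levels - 1, max(0, round(x / vref * levels)))
    return code * vref / levels

def oversampled_read(x, bits, n, noise_rms=1e-3):
    """Average n dithered conversions; each 4x of oversampling buys ~1 bit."""
    total = 0.0
    for _ in range(n):
        total += quantize(x + random.gauss(0.0, noise_rms), bits)
    return total / n

random.seed(0)
vin = 0.123456                         # hypothetical DC input, volts
single = quantize(vin, 10)             # one raw 10-bit conversion
avg = oversampled_read(vin, 10, 256)   # 256 samples -> roughly 4 extra bits
print(abs(single - vin), abs(avg - vin))
```

Without the dither noise the averaging does nothing: every sample lands on the same code, which is why the quote above insists on noise being present.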

What's the highest ENOB ever achieved?

What do you want it to be?


Footnote - a sigma-delta ADC does exactly what I've described above, except it manages out-of-band noise much better and therefore extracts more extra bits per batch of samples averaged (or decimated).

It typically uses only a 1-bit ADC (a comparator), so clearly this technique works, but it doesn't have to use a 1-bit ADC. It's all about noise filtering: -


The noise from a sigma-delta ADC rises progressively at higher frequencies because the integrator in the signal path shapes the quantization noise upward, forcing it to be low at low frequencies. After low-pass filtering and decimation, this yields a net gain in resolution compared with a conventional ADC that has simply been oversampled and decimated.
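The loop described above fits in a few lines. A toy first-order modulator in Python for a DC input (real converters add higher-order loops and proper sinc/CIC decimation filters; this just shows the integrator/comparator/feedback structure):

```python
def sigma_delta_dc(vin, n, vref=1.0):
    """Toy first-order sigma-delta: integrator + comparator + 1-bit feedback DAC.

    Measures a DC input in [0, vref] by averaging the output bitstream.
    """
    integ = 0.0   # the integrator that shapes quantization noise upward
    ones = 0
    for _ in range(n):
        bit = 1 if integ > 0 else 0   # the 1-bit ADC: just a comparator
        integ += vin - bit * vref     # integrate input minus the fed-back DAC level
        ones += bit
    return ones * vref / n            # crude decimation: average the bitstream

print(sigma_delta_dc(0.3, 10000))
```

Averaging the bitstream is the crudest possible decimation filter, yet even this lets a single comparator resolve a voltage far below its own 1-bit step, because the loop pushes the quantization error out of band.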