Unexpectedly poor DNL, INL and ENOB on an ADC

adc, microcontroller

I'm currently trying to characterize an MCU and its peripherals, in particular the on-board ADC. However, running the usual test procedures (i.e. a sine wave for ENOB and a sawtooth for DNL and INL), I find that the results are rather poor. Long story short, the ADC "is" 12 bits – meaning one would expect around 10 bits of resolution out of it. However, no matter what I try, ENOB seems to plateau around 8.5 bits. Similarly with the DNL and INL – the best results I've been getting are when the signal is oversampled 64 times. The datasheet on page 44 does say that one may expect DNL of around ±3 LSB – however, that is what I get only when I oversample; the raw signal peaks at about 7 LSB. There are a few things that I tried so far:

  • Putting a 10 nF capacitor across the ADC input and ground (which does help to some extent – without it, ENOB is 7.5 bits)
  • Oversampling 4x, 8x and a few others up to 256x. This – as one might expect – also improves results, although not as significantly. Larger oversampling produces better results whereas 4x is almost identical to no oversampling.
  • Slowing down the ADC module clock and increasing the sampling window. Neither was particularly effective: increasing the sampling window had a small positive impact on ENOB and DNL, while slowing the clock down didn't really have any effect.
  • Similarly, switching between external and internal references did not seem to have an effect, although I haven't done many experiments with an external reference – maybe that's the next one to consider. Edit, for the reference of whoever may read this one day: using an external reference in my case yielded worse results. In other words, the internal reference on my MCU seems to be more accurate – or the external reference isn't as good as I expected.
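For context, the ENOB figures above come from a three-parameter sine fit in the spirit of IEEE 1241. A minimal Python sketch – `samples`, `freq` and `fs` are placeholders for whatever your capture setup produces, and the fit assumes the test frequency is known exactly:

```python
import numpy as np

def enob_sine_fit(samples, freq, fs):
    """ENOB via a three-parameter sine fit (IEEE 1241 style).

    Fits A*sin + B*cos + C at the known test frequency `freq` (Hz),
    treats the residual as noise + distortion, and converts the
    resulting SINAD to ENOB = (SINAD - 1.76 dB) / 6.02.
    """
    samples = np.asarray(samples, dtype=float)
    t = np.arange(len(samples)) / fs
    basis = np.column_stack([np.sin(2 * np.pi * freq * t),
                             np.cos(2 * np.pi * freq * t),
                             np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(basis, samples, rcond=None)
    residual = samples - basis @ coef
    signal_rms = np.hypot(coef[0], coef[1]) / np.sqrt(2)
    sinad_db = 20 * np.log10(signal_rms / np.std(residual))
    return (sinad_db - 1.76) / 6.02
```

An ideal 12-bit quantizer fed a full-scale sine should come out at roughly 12.0 effective bits with this method; real captures land below that.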

Now, the question is: at which point does one accept that the ADC is just of poor quality, or somehow damaged? What other tests can I try to improve the ADC characteristics? Note that I generate the input signals with a function generator. At this point I am just stuck – i.e. I've run out of forum threads to read – and I would genuinely appreciate any advice.
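For completeness, the DNL/INL figures come from the standard histogram (code-density) test on the sawtooth capture. A minimal Python sketch, assuming `codes` holds the raw output codes from a slow, full-scale ramp:

```python
import numpy as np

def dnl_inl_from_ramp(codes, n_bits=12):
    """Histogram (code-density) test on a slow full-scale ramp capture.

    For a linear ramp every code bin should receive the same number of
    hits; the fractional deviation of each bin is the DNL in LSB, and
    the running sum of DNL is the INL.
    """
    codes = np.asarray(codes)
    hist = np.bincount(codes, minlength=2 ** n_bits).astype(float)
    interior = hist[1:-1]            # end bins collect clipped samples; drop them
    dnl = interior / interior.mean() - 1.0
    inl = np.cumsum(dnl)
    return dnl, inl
```

Note that this flat-histogram version is only valid for a ramp input; a sine-wave input would need the arcsine probability-density correction instead.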

Best Answer

Personally, I am not a very big fan of rules of thumb, but a few are so consistently accurate that even I make an exception for them.

One of those rules is as follows:

Microcontroller ADC peripherals are always terrible.

Always. Not the kind of terrible you just derate. The kind of terrible that leaves you crying in the fetal position in the shower.

I kid, but only slightly. To answer your question: yes, the ADC is actually just that bad, and yes, you should accept it. Note the big disclaimer in the datasheet stating "Operating Conditions Apply."

You have to remember that for certain parts (like MCUs), the datasheet is also marketing material, and MCUs generally compete on peripherals, power consumption, or both. So figures for things like ADCs will often be technically correct – as long as you use the ADC under the same ridiculous and impractical circumstances under which they took the measurement for the datasheet. A favorite I see often is measuring the ENOB with all the other peripherals, as well as the actual processor core, completely powered down in some deep sleep. Another is using significant software post-processing and techniques like oversampling to get the measurement. I don't think I've ever seen an MCU datasheet that didn't spec the ADC in terms of 'this is the best you will ever possibly achieve, and we may or may not even tell you how we managed it' rather than 'this is the spec before you try to clean up the signal in software'.
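To put a number on the oversampling trick: averaging N samples improves SNR by 10·log10(N) dB, i.e. about half an effective bit per doubling, or one extra bit per 4x – but only if the noise is white and uncorrelated, which CPU-generated noise usually is not, and that is exactly why heavy oversampling often disappoints. A minimal sketch of the decimation step (the function name is my own, not any vendor's API):

```python
import numpy as np

def oversample_decimate(samples, ratio):
    """Average non-overlapping blocks of `ratio` raw conversions.

    With white, uncorrelated noise each 4x increase in `ratio` is worth
    roughly one extra effective bit; correlated noise (e.g. synchronous
    spikes from the CPU core) averages out far more slowly, if at all.
    """
    samples = np.asarray(samples, dtype=float)
    usable = (len(samples) // ratio) * ratio  # drop the ragged tail
    return samples[:usable].reshape(-1, ratio).mean(axis=1)
```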

And, just to be clear, it sounds like this part has a pretty good ADC, at least compared to all the other terrible MCU ADCs. 8.5 bits? Such opulence! So bourgeois! Most of the time you'll get maybe 6 bits from a 10 bit ADC in an MCU. Remember, when they say 12-bit resolution, they mean there are 12 bits you can read from some register somewhere. There is no implication that they won't be 12 bits of useless noise. The only promise is that there will be 12 bits of something.

Now, at this point, you might be a little skeptical that these analog peripherals would invariably be this terrible, or at the very least, wonder why some chip company doesn't simply release an MCU with a half-way decent analog front end.

Well, they can't. It doesn't matter how good the reference is really, it's not a stability issue. It's noise. And physics.

It is simply physically impossible to create a high-performance (or really, even mediocre-performance) ADC on the same silicon die as an MCU. And it only takes one gotcha to screw up analog performance. In this case, there is not one gotcha, but several.

First, just one CMOS transistor switching will dump all kinds of harmonics and noise directly into, well, everything, and will couple into (you guessed it) everything, when it switches. We tend to think of CMOS as being low power (and it is), but it is worth keeping in mind that the power CMOS does use is effectively zero – except when a transistor is changing state. And they change state very fast, on the order of tens to hundreds of picoseconds. When you take a dozen milliamps being consumed by something that draws essentially no static power, and realize all of that current comes as intense ~100 ps spikes from literally millions of tiny little bastard switches, switching in aggregate... well, that should reframe things a bit. Those few milliamps are much more sinister than they seem, at least for analog stuff. Low power ≠ low noise. CMOS is low power because it only needs power to switch. But it switches harder than a dubstep bass drop.

Those spikes all rip through the substrate – the very substrate the ADC shares – and that substrate is resistive enough to exhibit localized ground bounce that is meaningless to digital circuits, but very troublesome indeed to any analog circuitry.

And there is really no way around that. That's just one problem. The other is that it is physically impossible to create a high-performance analog layout that has to coexist with the MCU: the same pins need to double as GPIO, and a host of other constraints fatally disrupt any chance of a good analog section layout.

Now, there are a few specialized MCUs with somewhat improved ADCs that achieve this by actually having two entirely separate silicon dies in one package, connected by bond wires, thus giving substrate isolation. You're going to pay for this feature though, and the results are still going to be poorer than a dedicated ADC due to sheer proximity.

Oh, and I haven't even touched on how this all assumes you have a flawless external layout and grounding and decoupling situation in all ways relating to your analog and digital section. That alone is nontrivial, just ask Henry Ott.

So, in conclusion, I'm afraid that the ADC on your chip is really just that terrible. Exactly like every other MCU ADC. Sorry. Either it's good enough – and for many, many applications (aided by some fairly clever software sourcery – pun!) it is. It's amazing what one really can get done even with the poor analog performance available, as long as you're clever. But clever can only carry you so far. If you want cold, hard effective bits, you really just have to bite the bullet and use a dedicated ADC along with careful PCB layout and decoupling, or look into a more specialized part (like the multi-die packages).