Electronic – ADC/DAC Calibration

adc

What are the methods for calibrating an ADC/DAC? Is ADC/DAC calibration done purely in software? May I know some hardware methods for ADC/DAC calibration?

Best Answer

There was a similar question some time ago about software calibration

To repeat in short:

  • Have a known voltage at the input of your circuit and write down the ADC reading.
  • It is a good idea to have the known voltage not directly at the ADC pin. Have it at the point you want to measure in the final application, e.g. directly at the sensor output.
  • A known voltage can be applied by an external source, or it can already be present (e.g. your sensor's output), and is then measured precisely. Precisely means: more precise than the ADC resolution, or at least as precise as your application requires.
  • Note: I use voltage here. It can also be a current, or whatever quantity your sensor measures.
  • Find a function to translate ADC reading to voltage.
    • The simplest case is a proportional function V(ADC) = m * ADC, which only accounts for an inaccurate gain factor due to part spread; for example, using 1% resistors for a voltage divider may result in a 2% deviation. For this calibration, you only need one measurement near the ADC's maximum.
    • One further step is to consider an offset, i.e. 0V is not 0x00. Make an additional measurement at the lower end of the input range and find two parameters to fit the data with V(ADC) = m * ADC + b .
    • The highest precision can be achieved by measuring many voltages; in the best case, one for each possible ADC reading. Inspect the data and decide which function to use. In my case in the question linked above, I still used a linear function (with offset) fitted to the entire range, simply because it is fast to calculate. One can add a non-linear term to remove as much of the nonlinearity as possible, but this usually costs more computing power, which may conflict with fast data processing.
    • As @alex.forencich mentioned, the supreme approach would be a lookup table with one entry per ADC value, which can eliminate every distortion, including steps in the response function of the ADC. But this consumes a lot of memory and presumes that you can, and need to, measure with 1-LSB accuracy.
  • All points apply to a DAC, too, but in the opposite direction.
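The two-point (gain plus offset) calibration described above can be sketched as follows. The ADC codes and voltages are invented example numbers, not values from the answer:

```python
# Two-point linear calibration: V(ADC) = m * ADC + b.

def two_point_cal(adc_lo, v_lo, adc_hi, v_hi):
    """Derive gain m and offset b from two known-voltage measurements:
    one near the bottom of the input range, one near the top."""
    m = (v_hi - v_lo) / (adc_hi - adc_lo)
    b = v_lo - m * adc_lo
    return m, b

def adc_to_volts(code, m, b):
    """Convert a raw ADC reading to a calibrated voltage."""
    return m * code + b

# Example: a hypothetical 12-bit ADC reads 620 at a known 0.500 V
# and 3735 at a known 3.000 V.
m, b = two_point_cal(620, 0.500, 3735, 3.000)
```

Dropping the offset term (b = 0) gives the purely proportional case, which then needs only the single measurement near full scale.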
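A full per-code lookup table is often overkill; a common compromise is a sparse table of calibration points with linear interpolation between them, which still captures the ADC's nonlinearity at the measured points. A minimal sketch, with invented calibration data:

```python
import bisect

def lut_volts(code, codes, volts):
    """Sparse lookup-table calibration with linear interpolation.
    codes: sorted ADC readings taken at known voltages.
    volts: the corresponding known voltages."""
    if code <= codes[0]:
        return volts[0]
    if code >= codes[-1]:
        return volts[-1]
    i = bisect.bisect_right(codes, code)
    c0, c1 = codes[i - 1], codes[i]
    v0, v1 = volts[i - 1], volts[i]
    # interpolate between the two nearest calibration points
    return v0 + (v1 - v0) * (code - c0) / (c1 - c0)

# Example calibration data for a hypothetical 12-bit ADC whose
# mid-scale reading is slightly off the ideal straight line:
codes = [0, 2048, 4095]
volts = [0.0, 1.66, 3.31]
```

More calibration points cost more memory but track the nonlinearity more closely; the one-entry-per-code table is simply the limiting case.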

About hardware calibration: The possibilities are endless.

For example, you can easily calibrate a gain deviation by this:

(schematic: an adjustable voltage divider with a trimmer potentiometer at the ADC input)

However, this reduces the input impedance, in this case to about 10 kΩ. If you choose higher resistor values, they may interact with the input impedance of the ADC itself and cause a non-linearity. For calibration, apply the known voltage again and read out the ADC. Turn the potentiometer until the ADC value is what you desire. If there is more electronics between the measurement point and the ADC, you may add adjustable components there, too. But this depends heavily on your circuit.
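When trimming the potentiometer, it helps to know which ADC code to aim for. A minimal sketch, assuming an ideal N-bit ADC whose full-scale input equals the reference voltage (the default values are illustrative, not from the answer):

```python
def target_code(v_in, v_ref=3.3, bits=12):
    """Ideal ADC code for v_in, assuming an N-bit ADC whose
    full-scale reading corresponds to v_ref."""
    full_scale = (1 << bits) - 1
    return round(v_in / v_ref * full_scale)
```

Apply the known voltage, compute the target code, and adjust the trimmer until the ADC reads that value.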

(Hmmm, got not-so-short...)