Calibrating the output voltage

Tags: calibration, comparator, dac

I have a device with a 16-bit D/A converter that, through a linear analog circuit, generates an output voltage between -1 V and 1 V. The device has 32 such channels.

(Schematic created using CircuitLab; not reproduced here.)

I need to design a circuit to be used for factory calibration. I need to find the gain of the analog part of the circuit so that I know which digital value val to write to the D/A converter to achieve a given DC output voltage of, say, Vx = 800 mV.
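
For illustration, once the gain (and any offset) of a channel is known, mapping a target voltage to a code is just inverting a straight-line model. The sketch below assumes Vout = gain * code + offset over the full 16-bit code range; the function name and types are purely illustrative, not part of any existing codebase.

    #include <stdint.h>
    #include <math.h>

    /* Maps a desired output voltage (volts) to a 16-bit DAC code, assuming
     * the calibrated straight-line model  Vout = gain * code + offset.    */
    uint16_t dac_code_for_voltage(double vx, double gain, double offset)
    {
        double code = (vx - offset) / gain;      /* invert the linear model */
        if (code < 0.0)     code = 0.0;          /* clamp to the code range */
        if (code > 65535.0) code = 65535.0;
        return (uint16_t)lround(code);
    }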

I need to calibrate it well enough that I can achieve an output voltage accuracy better than 0.5 mV, preferably 0.25 mV. The calibration does not have to be very fast; anything up to two minutes will do.

I have been thinking about just using an A/D converter to measure the output. Another way could be to use a comparator and a precision voltage reference and determine the gain by bisection, but it seems hard to find a comparator with a low enough input offset voltage.
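
For what it is worth, the comparator/bisection approach could look roughly like the sketch below: binary-search the DAC code until the channel output crosses the precision reference. dac_write() and comparator_output_high() are hypothetical helpers, and the code assumes the output rises monotonically with the DAC code.

    #include <stdbool.h>
    #include <stdint.h>

    extern void dac_write(int channel, uint16_t code);       /* hypothetical */
    extern bool comparator_output_high(int channel);         /* Vout > Vref? */

    /* Binary-search the first DAC code whose output exceeds the reference. */
    uint16_t find_code_at_vref(int channel)
    {
        uint16_t lo = 0, hi = 65535;
        while (hi - lo > 1) {
            uint16_t mid = lo + (uint16_t)((hi - lo) / 2);
            dac_write(channel, mid);
            /* allow the output and the comparator to settle here if needed */
            if (comparator_output_high(channel))
                hi = mid;     /* output already above Vref */
            else
                lo = mid;     /* output still below Vref   */
        }
        return hi;
    }

Sixteen iterations resolve the full 16-bit code range per reference point.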

How would you go about doing this?

Conclusion:

I am going to use two 16-channel ADCs and an LTC6655BHMS8-1.25 voltage reference, which has ±0.025% initial accuracy and a 2 ppm/°C temperature coefficient. The ADCs will be LTC2439-1IGN parts, which are slow but accurate. Using a small microcontroller, I will control the ADCs and use averaging to get rid of noise.
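
The averaging step on the microcontroller could be as simple as the sketch below; adc_read_microvolts() stands in for whatever the real LTC2439-1 driver call ends up being and is purely illustrative.

    #include <stdint.h>

    extern int32_t adc_read_microvolts(int channel);          /* hypothetical */

    /* Average n_samples conversions from one calibration ADC channel to
     * suppress noise; the LTC2439-1 is slow, so keep n_samples modest.    */
    double averaged_reading_microvolts(int channel, int n_samples)
    {
        int64_t sum = 0;
        for (int i = 0; i < n_samples; i++)
            sum += adc_read_microvolts(channel);
        return (double)sum / (double)n_samples;
    }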

I think this should meet my requirements without breaking my budget.

Best Answer

I think you will find chopper-stabilised amplifiers with low enough offsets.

And if you can guarantee linearity in the analog stage better than 0.025% (1 part in 4000), you can calibrate at 0, +1 V and -1 V, and the rest will follow.
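
As a rough sketch of that procedure: the +1 V and -1 V points give the gain and offset of a straight-line fit, and the 0 V point can then serve as a crude linearity check (the difference between the measured 0 V output and the fit's prediction). The struct and function below are illustrative only.

    #include <stdint.h>

    typedef struct {
        double gain;    /* volts per DAC code                  */
        double offset;  /* fitted output voltage at DAC code 0 */
    } channel_cal_t;

    /* Straight-line fit through the measured -1 V and +1 V calibration points. */
    channel_cal_t fit_two_point(uint16_t code_neg, double v_neg,
                                uint16_t code_pos, double v_pos)
    {
        channel_cal_t cal;
        cal.gain   = (v_pos - v_neg) / (double)(code_pos - code_neg);
        cal.offset = v_pos - cal.gain * (double)code_pos;
        return cal;
    }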

I think the tricky bits will be:

  • sourcing exactly 1 V within 0.25 mV and keeping it there across temperature.
  • verifying that all sources of error combined do not exceed your error budget (see the sketch below).
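
One way to approach that second point is to list the independent error contributions and combine them by root-sum-square, then compare the result against the 0.25 mV target. The terms and numbers in the sketch below are placeholders, not measured figures for this design.

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        /* Hypothetical, independent error contributions in millivolts.     */
        double terms_mv[] = { 0.10,    /* reference error after calibration */
                              0.05,    /* calibration ADC INL               */
                              0.05,    /* residual noise after averaging    */
                              0.05 };  /* analog-stage nonlinearity         */
        double sum_sq = 0.0;
        for (unsigned i = 0; i < sizeof terms_mv / sizeof terms_mv[0]; i++)
            sum_sq += terms_mv[i] * terms_mv[i];

        printf("combined error: %.3f mV (target: 0.25 mV)\n", sqrt(sum_sq));
        return 0;
    }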