Measuring temperature with ±0.01 °C accuracy

arduino, low-power, precision, temperature, voltage-regulator

What is the most accurate way to measure temperature to ±0.01 °C? I have looked into using a Wheatstone bridge (with a mini pot for minor calibrations) and an RTD for its precision and range. I need a range of −85 °C to 55 °C, ideally with low-voltage operation (6 VDC). The output needs to be a digital signal and will currently be sent to an Arduino; in the future I would like to add a datalogging system alongside this device before connecting to the Arduino. The power source is also the Arduino, so supply stability currently depends on the Arduino's hardware; however, the unit will be plugged into a 115 V outlet, so a ground reference can be used.

The ultimate goal is to have multiple temperature units like this logging data and sending it to a µC that can graph the data. I've found various platinum RTDs that are precise enough, but I want to know how to lay out the circuit, how to convert the analog signal to digital accurately, and which voltage stabilizers will be necessary for the power supply.

One of the RTDs I've been looking at.

Best Answer

Realistically, it's very difficult to hit that level of accuracy at the system level. The particular sensor you show is DIN Class A tolerance, meaning the maximum error of the sensor alone is 150 mK + 2 mK·|T| (with T in °C). So at 100 °C, the maximum sensor error alone (not counting self-heating) is 350 mK, 35 times what you say you want. This type of relatively low-cost sensor is also prone to hysteresis errors due to its thin-film construction. That comes into play if there are wide temperature variations, but even with excursions only to 200 °C you can see many tens of mK of error (not shown on your datasheet).
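To put numbers on that tolerance band, here is a minimal sketch. The formula is the Class A limit quoted above; the sample temperatures are just illustrative points covering your stated range:

```python
# DIN EN 60751 Class A tolerance for a platinum RTD:
# tol(T) = 0.15 degC + 0.002 * |T|   (T in degC), per the limit quoted above.
def class_a_tolerance(t_c):
    return 0.15 + 0.002 * abs(t_c)

# Sample points: the ends of the questioner's range, the reference
# temperature, and 100 degC from the example above.
for t in (-85, 0, 55, 100):
    print(f"{t:6.1f} degC -> +/-{class_a_tolerance(t) * 1000:.0f} mK")
```

Even at the best point (0 °C) the band is ±150 mK, which is where the "15x" figure below comes from.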

Even at the 0 °C reference temperature, the sensor alone contributes 15x the error you say you want. Self-heating will add more, depending on the excitation current you pick, and even the best-designed measurement circuitry will contribute some error. Calibration can reduce some of these errors, but it is expensive and difficult, and you need instrumentation capable of mK-level accuracy and stability. A single-point calibration at the triple point of water is easier, but still not easy.

0.01 °C stability over a relatively narrow range is not terribly difficult, but it requires good design techniques. If you use 200 uA energization, you need stability much better than 40 uV at the input. Your reference must also be stable to within 20-30 ppm over the whole operating temperature range (which will need to be defined). If you use a precision metal-foil reference resistor and a ratiometric measurement, voltage reference errors can be minimized.
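The ratiometric idea can be sketched as follows. This is illustrative only: the Pt100 value, the 1 kΩ reference resistor, and the ADC codes are assumptions, while the Callendar-Van Dusen coefficients are the standard IEC 60751 values. Because the RTD and the reference resistor carry the same excitation current and are read against the same ADC reference, both the current and the reference voltage cancel in the ratio:

```python
# Illustrative ratiometric RTD measurement (assumed component values).
R0 = 100.0      # Pt100 nominal resistance at 0 degC
R_REF = 1000.0  # precision metal-foil reference resistor, ohms

# IEC 60751 Callendar-Van Dusen coefficients
A = 3.9083e-3
B = -5.775e-7
C = -4.183e-12  # only applies below 0 degC

def rtd_resistance(code_rtd, code_ref):
    """Excitation current and ADC reference both cancel in the ratio."""
    return R_REF * code_rtd / code_ref

def cvd_resistance(t_c):
    """Callendar-Van Dusen: resistance as a function of temperature."""
    r = R0 * (1 + A * t_c + B * t_c**2)
    if t_c < 0:
        r += R0 * C * (t_c - 100) * t_c**3
    return r

def temperature(r):
    """Invert Callendar-Van Dusen by Newton iteration (covers T < 0 degC)."""
    t = (r / R0 - 1) / A  # linear first guess
    for _ in range(10):
        f = cvd_resistance(t) - r
        df = (cvd_resistance(t + 1e-3) - cvd_resistance(t - 1e-3)) / 2e-3
        t -= f / df
    return t
```

The quadratic formula inverts the curve directly above 0 °C, but a numeric inversion like the one above is needed anyway to reach −85 °C, where the cubic C term is active.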

0.01 °C resolution is pretty easy: just hang a 24-bit ADC on the sensor signal conditioning. But it may not mean much (beyond showing short-term trends in a benign instrumentation environment) unless everything else is done right.
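As a sanity check on resolution, here is a rough back-of-the-envelope sketch. All numbers are illustrative assumptions, not a design: a Pt100 at roughly 0.385 Ω/°C, 200 uA excitation, and a 24-bit ADC spanning a hypothetical ±2.5 V input:

```python
# Assumed Pt100 sensitivity and excitation (illustrative values)
SENSITIVITY_OHM_PER_C = 0.385   # ohms per degC near 0 degC
I_EXC = 200e-6                  # 200 uA excitation current
V_PER_C = SENSITIVITY_OHM_PER_C * I_EXC  # signal slope at the ADC input

# One code of a 24-bit ADC over a hypothetical +/-2.5 V (5 V) span
LSB = 5.0 / 2**24

print(f"signal slope: {V_PER_C * 1e6:.1f} uV/degC")
print(f"one LSB:      {LSB * 1e9:.0f} nV")
print(f"ideal resolution: {LSB / V_PER_C * 1000:.2f} mK/LSB")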