Accurate yet relatively simple way to use RTD probes


I would like to build a reference thermometer for checking and maybe calibrating other sensors, and I was planning to use a 1/10 DIN RTD probe to do that. You can get such probes for ~$100, and the error is ≤ ±0.07 °C over the −50 °C to 50 °C range, which sounds great. I'm not sure, however, whether I can use one effectively without deep electronics knowledge or without spending a few hundred dollars on a reference thermometer like the Fluke 1523.

The easiest (and pretty cheap) way to use RTDs is the MAX31865 (an IC that automatically converts RTD resistance to temperature readings), but the only note about accuracy I can find is this line from the datasheet: "Total Accuracy Over All Operating Conditions: 0.5°C (0.05% of Full Scale) max". 0.5 °C over the full scale doesn't sound bad, but I'm not sure what that means for narrower temperature ranges: for example, how much error the MAX31865 adds on top of the RTD's own error over 0–100 °C. Without that knowledge I also don't know how much I gain by using a 1/10 DIN probe instead of a Class B one.

So my general question is: what's the easiest way to accurately read precision RTD probes on a limited budget (say $50–$100, not including the probe)?

Best Answer

The MAX31865 seems like a very good choice. The datasheet does give more detailed accuracy specifications; the first page of a datasheet is generally just marketing material.

On page 3, we have some specifications of the ADC: full-scale error typically ±1 LSB, integral nonlinearity typically ±1 LSB, and offset error at most ±3 LSB. Therefore, the output of the ADC will typically be within 4 least-significant bits of the correct value. Since it's a 15-bit ADC, that's an error of \$\frac{4}{2^{15}}\$ or about 0.013%. Since the resistance of an RTD is roughly proportional to absolute temperature (about 2.73 K/Ω for a PT100), an error of 0.013% at 273 K corresponds to a temperature error of 0.013% × 273 K ≈ 0.036 K. At a higher temperature, the absolute error would be proportionately larger.
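The arithmetic above can be checked with a quick sketch (the 4 LSB figure is the combined ADC error estimate from the datasheet numbers quoted):

```python
# Back-of-the-envelope check of the ADC error estimate above.
ADC_BITS = 15
ERROR_LSB = 4                      # combined full-scale + INL + offset error, in LSBs

fractional_error = ERROR_LSB / 2**ADC_BITS
print(f"fractional error: {fractional_error:.4%}")

# A PT100 is ~100 ohm at 0 degC, and its resistance is roughly proportional
# to absolute temperature, so a fractional resistance error maps to roughly
# the same fractional error in absolute temperature.
temp_error_K = fractional_error * 273
print(f"temperature error at 0 degC: ~{temp_error_K:.3f} K")
```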

We also have some graphs of accuracy in the datasheet, at the bottom of page 6. These give a more detailed picture, and we can see that our previous estimate of a typical error of 0.036 K is not too far off, though the absolute error does not seem to grow linearly with resistance.

Overall, the 0.5 K worst-case error from page 1 of the datasheet seems very conservative. The actual error is likely to be no more than the error inherent to the RTD, if you do indeed go with a 1/10 DIN RTD, which has an error of at most ±0.07 K from -60 to 50 degrees Celsius.
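To see what the 1/10 DIN grade buys you over Class B, the usual IEC 60751-style tolerance formulas can be compared directly (t in °C; 1/10 DIN is one tenth of the Class B band):

```python
# Probe tolerance bands per the common IEC 60751-style formulas.
def class_b(t_c):
    """Class B tolerance in K: +/-(0.30 + 0.005|t|)."""
    return 0.30 + 0.005 * abs(t_c)

def tenth_din(t_c):
    """1/10 DIN tolerance in K: one tenth of Class B."""
    return class_b(t_c) / 10

for t in (-50, 0, 50, 100):
    print(f"{t:>5} degC:  Class B +/-{class_b(t):.3f} K   1/10 DIN +/-{tenth_din(t):.4f} K")
```

At 0 °C that is ±0.30 K versus ±0.03 K, an order of magnitude tighter.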

If you follow the datasheet recommendations, though, you'll introduce another source of error: self-heating. The datasheet suggests a 400 Ω reference resistor for a PT100, which results in 4 mA through the RTD (due to the 2 V bias generated). This is about an order of magnitude above the recommended current for a PT100, so depending on what RTD probe you pick, you may want to use a reference resistor of about 5 kΩ instead.
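A rough self-heating estimate for the two reference-resistor choices looks like this; the 50 mW/K dissipation constant is an assumed, probe-dependent figure (check your probe's datasheet):

```python
# Self-heating estimate: the 2 V bias drives the series combination of the
# reference resistor and the RTD.
V_BIAS = 2.0        # volts, per the MAX31865 datasheet
R_RTD = 100.0       # ohms, PT100 at 0 degC
DISSIPATION = 0.050 # W/K, ASSUMED self-heating coefficient; probe-dependent

for r_ref in (400.0, 5000.0):
    i = V_BIAS / (r_ref + R_RTD)   # excitation current through the RTD
    p = i**2 * R_RTD               # power dissipated in the RTD element
    rise = p / DISSIPATION         # resulting temperature rise, in K
    print(f"Rref={r_ref:>6.0f} ohm: I={i*1000:.2f} mA, "
          f"P={p*1000:.3f} mW, rise ~{rise*1000:.1f} mK")
```

With the datasheet's 400 Ω the rise is tens of millikelvin, comparable to the 1/10 DIN tolerance itself; with ~5 kΩ it drops below a millikelvin.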

The advantage of a smaller reference resistor (and therefore a higher excitation current) is better noise immunity, so there is a tradeoff. For use as a reference thermometer, you can afford to average the noise out over a very long time, so a large resistor makes sense. In an industrial application, you might have a lot of noise, but also a very large measurement sample that can absorb the generated heat effectively, so there the smaller resistor would make sense.
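Why long averaging works is worth spelling out: independent noise on repeated readings averages down roughly as \$1/\sqrt{N}\$. A small simulation with an assumed per-sample noise level illustrates this:

```python
# Averaging independent noisy samples shrinks the spread of the result
# roughly as 1/sqrt(N).
import random
import statistics

random.seed(0)
TRUE_TEMP = 25.0    # degC, arbitrary
NOISE_SD = 0.05     # K, ASSUMED per-sample noise level

def averaged_reading(n):
    """Mean of n noisy samples of the true temperature."""
    return statistics.fmean(random.gauss(TRUE_TEMP, NOISE_SD) for _ in range(n))

for n in (1, 100, 10000):
    trials = [averaged_reading(n) for _ in range(200)]
    print(f"N={n:>6}: stdev of averaged reading ~{statistics.stdev(trials):.4f} K")
```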

Interestingly, regardless of the choice of reference resistor, its tolerance is very important, as it contributes directly to the final error. Using a 0.1% resistor means that you can never do better than 0.1% error (0.273 K at 0 degrees Celsius), so you may want to splurge on a 0.01% resistor.
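The reason the reference resistor dominates is that the measurement is ratiometric: the ADC reports the ratio of the RTD resistance to the reference, so any fractional error in \$R_{ref}\$ shows up directly as the same fractional error in the measured resistance, and hence (by the proportionality argument above) in absolute temperature:

```python
# Ratiometric measurement: R_rtd = (code / 2**15) * R_ref, so a fractional
# error in R_ref becomes the same fractional error in R_rtd, and roughly
# the same fractional error in absolute temperature.
for tol in (0.001, 0.0001):          # 0.1% and 0.01% reference resistors
    temp_error_K = tol * 273         # at 0 degC (273 K)
    print(f"{tol:.2%} resistor -> ~{temp_error_K:.4f} K floor at 0 degC")
```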
