Electronic – RTD sensors and the mystery of voltage vs. current sources

Tags: precision, rtd, temperature

Alright folks, I know there are a decent number of threads about RTD sensors (resistance thermometers, most often the platinum type). Let me first assure you, and hopefully show you, that I've gone through those threads. I've read enough now to know there are multiple ways to do things, but I am struggling to figure out the why.

I'm still stuck on this idea that the classical way to measure an RTD sensor is to use a ratiometric measurement of an RTD resistance against a reference resistor, with the excitation current provided by a current source.

This is great and all, but I can't really see why it helps us versus a low-voltage voltage source. The ratiometric measurement in the classically described setup is used because it allows any drift in the current source to be tracked by the reference resistor and eliminated as error.

By that same measure, the fact that a voltage source would drift in voltage, and that its excitation current would change with the changing resistances, should not matter either, since you're still getting the beauty of a ratiometric measurement.
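To make my confusion concrete, here is a minimal sketch (Python; the 400 ohm reference value and the "drifted" currents are made up for illustration) of why I think the excitation cancels in the ratio either way:

```python
# Ratiometric idea: the ADC reads the RTD voltage against the voltage
# across a series reference resistor, so the excitation cancels.
R_REF = 400.0  # ohms, stable reference resistor (made-up value)

def rtd_from_ratio(v_rtd, v_ref):
    """Recover the RTD resistance from the two measured voltages.

    The same excitation current I flows through the series pair, so
    v_rtd = I * R_rtd and v_ref = I * R_REF, and the ratio
    v_rtd / v_ref = R_rtd / R_REF no matter what I is, or how a
    voltage source's current drifts as the resistances change.
    """
    return R_REF * (v_rtd / v_ref)

# Same 138.5 ohm RTD (Pt100 at 100 degC), two different excitations:
for i_exc in (0.5e-3, 0.4e-3):           # current "drifted" by 20%
    v_rtd, v_ref = i_exc * 138.5, i_exc * R_REF
    print(rtd_from_ratio(v_rtd, v_ref))  # prints 138.5 both times
```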

Other ideas:
(1) Perhaps people like that a current source gives a cleaner, more linear self-heating error.
(2) People like the complications of current sources even though the easiest way to make one starts with a voltage reference?
(3) I'm stumped.
(4) Plenty of people say "current source is the classical way!" but then solid chips like the MAX31865 turn around and still use voltage references, not current sources.

Stackexchange questions that were about RTD sensors but only vaguely relevant:
Improving RTD Sensor filtering and readout
–> Decent classic circuit for RTD sensors, but not much conclusive here.
Single current source with parallel loads for many RTD sensors
–> More RTD discussion. Works as another instance of some people saying use a voltage source and others saying use a current source, but with no clear explanation of why.

Stackexchange questions that were about RTD sensors but not very relevant:
Reading RTD Temperature Sensor
RTD's resistance to temperature

Stackexchange about current source vs voltage source but nothing related to measurement:
voltage source vs. current source

Any clarity would be appreciated. Thanks.

Updates:
Reading more, I've seen general claims such as: "Current sources often are the preferred type of excitation source as they provide better noise immunity."

But it sounds like the good folks here are assuring me [my personal interpretation] that the reason I'm unclear on why current source vs. voltage source would matter in a ratiometric measurement is that, in a ratiometric measurement, it does not really matter.

Best Answer

Actually, the chip you mention has something close to a current source: because they're measuring the voltage across the reference resistor, they don't care if the current changes a bit. Since the reference resistor (connected to the 2V bias) is 4x the 0°C value of the RTD, and the RTD only changes by 30-40% over a +/-100°C range, the current is constant within about 10%, at a (very high) 4mA for a Pt100 and 0.4mA for a Pt1000. That is a much higher current than typically used in precision applications, so self-heating is a definite source of error.
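As a quick numeric check of that claim, here's a sketch in Python; the 2V bias and 400 ohm reference come from the paragraph above, and the linear \$\alpha\$ curve is only an approximation of the DIN curve:

```python
# Check the "constant within about 10%" claim for the MAX31865-style
# circuit: 2 V bias into a 400 ohm reference resistor in series with
# a Pt100 (linear alpha = 0.00385 approximation to the DIN curve).
V_BIAS, R_REF, R0, ALPHA = 2.0, 400.0, 100.0, 0.00385

def pt100(t_c):
    """Approximate Pt100 resistance at t_c degC."""
    return R0 * (1 + ALPHA * t_c)

i0 = V_BIAS / (R_REF + pt100(0.0))       # 4.00 mA at 0 degC
for t in (-100.0, 0.0, 100.0):
    i = V_BIAS / (R_REF + pt100(t))
    print(f"{t:+6.0f} degC: {i*1e3:.2f} mA ({(i/i0 - 1)*100:+.1f}%)")
# -> roughly +8% at -100 degC and -7% at +100 degC: constant within
#    about 10%, but 4 mA is high enough for self-heating to matter.
```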

Let's take an example- you want to measure temperature from 0 to 100°C and you have a 0.8mA current source. Let's assume it's a Pt100 DIN curve (\$\alpha = 0.00385\$).

The voltage across the RTD will be 80mV at 0°C and 110.8mV at 100°C, for a span of 30.8mV. So a 0.1°C error (say that's our allowable error due to the electronics) would represent 30.8uV, which is 0.027% of full scale.
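The same arithmetic in a few lines (Python, reusing the linear \$\alpha\$ approximation):

```python
# Error-budget arithmetic: 0.8 mA source driving a Pt100 over
# 0..100 degC, linear alpha = 0.00385 approximation.
I_EXC, R0, ALPHA = 0.8e-3, 100.0, 0.00385

v0   = I_EXC * R0                         # 80.0 mV at 0 degC
v100 = I_EXC * R0 * (1 + ALPHA * 100.0)   # 110.8 mV at 100 degC
span = v100 - v0                          # 30.8 mV
err  = span * (0.1 / 100.0)               # 0.1 degC -> 30.8 uV
print(f"span = {span*1e3:.1f} mV, 0.1 degC = {err*1e6:.1f} uV")
print(f"= {err/v100*100:.4f}% of the {v100*1e3:.1f} mV full scale")
```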

If you offset the 100 ohm base resistance of the sensor with a stable resistor (it's easy to get a resistor that is much more stable than a voltage or current source), and if we assume that resistor's error is relatively negligible, then we still have the 30.8uV error budget, but now we only need an accuracy of 0.1% in our measurement (an almost 4 times looser requirement). A good ADC can be comparable to a precision resistor divider in ratiometric measurements, and that's what the MAX chip is depending on. Also, they're not shooting for the best possible accuracy, just something viable.
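Continuing the same numbers for the offset case (Python):

```python
# With a stable 100 ohm offset resistor removing the 80 mV baseline,
# the same 30.8 uV budget is now measured against a 30.8 mV span
# instead of the 110.8 mV full scale.
err, span, full_scale = 30.8e-6, 30.8e-3, 110.8e-3

without_offset = err / full_scale * 100   # ~0.028% accuracy needed
with_offset    = err / span * 100         # 0.1% accuracy needed
print(f"required accuracy: {with_offset:.2f}% vs {without_offset:.4f}%")
print(f"requirement relaxed by ~{with_offset/without_offset:.1f}x")  # ~3.6x
```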

If you were thinking about using a circuit with, say, a 160mV voltage source and a series 100 ohm resistor to measure the current, you'd have a substantially changing current through the sensor (so for a given resolution or noise floor you'd get fewer degrees of resolution at high temperatures), and the self-heating would increase at low temperatures relative to a true current source, rather than appearing as a (relatively) fixed offset. A high voltage V with a large series resistance R behaves the same as an imperfect current source of I = V/R with an output impedance equal to that R (Thevenin).
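A last sketch of that trade-off (Python, same linear Pt100 approximation as above), comparing the 160mV + 100 ohm source against an ideal fixed 0.8mA source:

```python
# Compare a 160 mV source + 100 ohm series resistor (equivalent to an
# imperfect current source of I = V/R = 1.6 mA with a 100 ohm output
# impedance) against an ideal fixed 0.8 mA source.
V_TH, R_SER, I_FIX = 0.160, 100.0, 0.8e-3

def pt100(t_c):
    """Approximate Pt100 resistance at t_c degC (linear DIN curve)."""
    return 100.0 * (1 + 0.00385 * t_c)

for t in (-100.0, 0.0, 100.0):
    r = pt100(t)
    i = V_TH / (R_SER + r)               # current changes with the RTD
    print(f"{t:+6.0f} degC: {i*1e3:.2f} mA, "
          f"self-heating {i*i*r*1e6:.0f} uW "
          f"(fixed 0.8 mA would dissipate {I_FIX**2*r*1e6:.0f} uW)")
# -> the current (and hence mV/degC sensitivity) falls as the RTD
#    warms, and at low temperatures the dissipation stays near 60 uW
#    where a true current source would drop to ~39 uW, i.e. relatively
#    more self-heating exactly where the RTD resistance is low.
```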