Are 4-wire resistance temperature detectors sensitive to ambient temperature?


We're using a TI LMP90080 as an ADC to read a 4-wire resistance temperature detector (RTD) in the following circuit:
[schematic: 4-wire RTD measurement circuit]

The problem we are having is that when the ambient temperature changes, especially near R88 (a 25 ppm/°C, 0.10%, 1 k resistor), the reading is negatively affected: when the resistor is heated, the temperature reading drops by about 20 °C, and it rises correspondingly when cooled. This seems to make sense, but I'm wondering whether there is a way to fix this without adding temperature compensation via an ambient temperature sensor IC?

The RTD is connected via PCB traces and some interconnect wires between boards, terminated as shown:
[photo: RTD termination]

The RTD is well away from the heating or cooling used for testing and looks like this:
[photo: RTD and wiring]

Best Answer

How the heck do you figure this "makes sense"? 25 ppm/°C over a 20°C change is about 500 ppm, or about 0.05% (and that's the maximum change; typical is usually better).

A standard Pt RTD changes about +0.385%/K so you should not get much more than about 0.1°C change in your reading.
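A quick back-of-the-envelope check of those two numbers (a sketch only; the 20°C ambient shift and the standard Pt sensitivity are the assumed values discussed above):

```python
# Rough error budget: how much a 20 C drift of the 25 ppm/C reference
# resistor should move a ratiometric Pt RTD reading (assumed values).
ref_tc_ppm_per_C = 25        # reference resistor tempco, from the question
ambient_shift_C = 20         # assumed ambient change during the test
rtd_sensitivity = 0.00385    # standard Pt RTD: ~0.385 %/K of R0

ref_drift_fraction = ref_tc_ppm_per_C * 1e-6 * ambient_shift_C   # ~5e-4, i.e. 0.05 %
reading_error_K = ref_drift_fraction / rtd_sensitivity           # ~0.13 K

print(f"Reference drift: {ref_drift_fraction * 100:.3f} %")
print(f"Expected reading error: ~{reading_error_K:.2f} K, nowhere near 20 K")
```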

If you're heating the board with some crude means like a hot air gun, you could be seeing thermocouple voltages, but even that seems unlikely unless you're doing something particularly silly.

Since it appears that a hot air gun was involved, let's explore that angle. Say the gun creates a temperature gradient of 60°C (a cheap one will produce air at hundreds of degrees C). If a junction between two dissimilar metals is present and produces 20 or 30 µV/K, you could get around 1.5 mV of thermocouple voltage. A DIN-standard Pt100 RTD at 1 mA has an output of about 0.385 mV/K, so that could cause roughly 4°C of error. The reported 20°C is possible, but they'd have to be total blacksmiths (no offense intended to actual skilled blacksmiths).

You can't really get realistic results with that sort of test; a proper thermal chamber with controlled slew rates should be used (and, as I said, you should see only very small shifts). Presumably something is wrong that is causing the client to do these things. If I were a betting person, I'd guess bad wiring, electrical noise (your circuit is not industrially hardened, but it could be okay in a lab), or possibly a mismatched sensor.
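To put rough numbers on that thermocouple scenario (again only a sketch; the 60°C gradient, 25 µV/K junction, and 1 mA excitation are assumptions, not measurements):

```python
# Estimate of the error a parasitic thermocouple junction could introduce.
# Illustrative values; the gradient and Seebeck coefficient are assumptions.
gradient_K = 60               # assumed temperature gradient from the hot air gun
seebeck_uV_per_K = 25         # typical dissimilar-metal junction, 20-30 uV/K
excitation_mA = 1.0           # assumed RTD excitation current
pt100_sens_ohm_per_K = 0.385  # DIN Pt100 sensitivity near 0 C

thermo_emf_mV = gradient_K * seebeck_uV_per_K / 1000          # ~1.5 mV
rtd_signal_mV_per_K = pt100_sens_ohm_per_K * excitation_mA    # ~0.385 mV/K
error_K = thermo_emf_mV / rtd_signal_mV_per_K                 # ~3.9 K

print(f"Parasitic EMF: {thermo_emf_mV:.2f} mV -> ~{error_K:.1f} K of error")
```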

Use a voltmeter to measure the input voltages with respect to ground and make sure they are reasonable, then do the calculation manually to figure out what is going on. Make sure the current through the RTD is high enough that thermocouple voltages won't be a significant effect (don't set the current to 100 µA with a Pt100 RTD). It's a trade-off between self-heating and having enough signal to work with.
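If it helps, here's a minimal sketch of that manual calculation, assuming a ratiometric 4-wire measurement with R88 (1 k) as the reference resistor and a linearized Pt100; the function name, example voltages, and sensor constants are placeholders to swap for your actual circuit values:

```python
# Sketch of the manual sanity check: back out the RTD resistance and
# temperature from the voltages measured with the voltmeter.
# Assumes a ratiometric 4-wire setup (same excitation current through the
# RTD and the reference resistor) and a linearized Pt100; adjust r0/alpha
# for your actual sensor.
def rtd_temperature(v_rtd, v_ref, r_ref=1000.0, r0=100.0, alpha=0.00385):
    """v_rtd: voltage across the RTD sense leads (V)
    v_ref: voltage across the reference resistor (V)."""
    r_rtd = r_ref * v_rtd / v_ref       # same current flows through both
    return (r_rtd / r0 - 1.0) / alpha   # linear approximation, fine near 0-100 C

# Example: 0.110 V across the RTD and 1.000 V across the reference
# -> 110 ohm -> roughly 26 C
print(f"{rtd_temperature(0.110, 1.000):.1f} C")
```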

Something is seriously wrong for you to be seeing changes like that with such high-end parts; it needs to be fixed, not compensated for.