How to calibrate a multimeter temperature probe

Tags: calibration, multimeter, temperature

I have a multimeter with the following temperature accuracy for its Type K probe:

±3% of reading ±5 °C

The display is in steps of 1 °C.

I attempted some calibration:

  • In ice water, the reading is 0 °C
  • In boiling water, the reading is 94 °C (at 1012.9 mbar, so the boiling point is essentially 100 °C)
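
(Taking the quoted spec at face value, the allowed error at a true 100 °C is ±(0.03 × 100 °C + 5 °C) = ±8 °C, so the 94 °C reading in boiling water is just within tolerance.)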

Can I interpolate that? Would the reading at 20 °C be 0.2 × 94 °C = 18.8 °C, i.e. "19" or maybe even "18"? Or are there other factors at play?
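
If it really is just a gain error between the two fixed points, the correction I have in mind would be a simple two-point linear map, something like this (a sketch only, assuming the 0 °C and 94 °C readings are repeatable):

```python
# Two-point linear correction implied by the question, assuming the meter's
# error between ice water and boiling water is a pure gain error.

LOW_ACTUAL, LOW_READING = 0.0, 0.0        # ice water
HIGH_ACTUAL, HIGH_READING = 100.0, 94.0   # boiling water (~1013 mbar)

def expected_reading(actual_c: float) -> float:
    """Interpolate what the meter should display at a given true temperature."""
    frac = (actual_c - LOW_ACTUAL) / (HIGH_ACTUAL - LOW_ACTUAL)
    return LOW_READING + frac * (HIGH_READING - LOW_READING)

def corrected_temperature(reading_c: float) -> float:
    """Invert the map: convert a displayed reading back to a true temperature."""
    frac = (reading_c - LOW_READING) / (HIGH_READING - LOW_READING)
    return LOW_ACTUAL + frac * (HIGH_ACTUAL - LOW_ACTUAL)

print(expected_reading(20.0))        # 18.8 -> displayed as "19" (or "18")
print(corrected_temperature(19.0))   # ~20.2
```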

I would expect the readings to be monotonic against the actual values, but perhaps the ADC's step size and non-linearity, in combination with the display's granularity, make the readings not strictly monotonic, leaving scattered "plateaus" of readings.
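
As a toy illustration (the 0.94 gain is only inferred from the boiling-water reading; the real meter's transfer curve is unknown to me), a compressed scale combined with the 1 °C display step already produces such plateaus:

```python
# Toy model: a slightly compressed scale plus rounding to whole degrees
# makes consecutive true temperatures share the same displayed value.

def displayed_reading(actual_c: float) -> int:
    raw = 0.94 * actual_c   # gain error inferred from 94 displayed at 100 degC
    return round(raw)       # 1 degC display granularity

for t in range(5, 12):
    print(f"actual {t:2d} degC -> displayed {displayed_reading(t)} degC")
# Note: actual 8 degC and 9 degC both display as 8 degC (a plateau).
```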

Of course, one could mix the 0 °C water and 100 °C water at various ratios to create an actual-vs.-reading chart, but not only is this tedious, I am sure it is also fraught with systematic and random errors. So I'd rather go with the wisdom of the group here.

The model is "DMR-6500", but perhaps the validity of your reasoning will be independent of the model/brand.

Best Answer

There's a decent chance your multimeter does not bother to linearize the thermocouple curve. The µV/°C is lower between 0 °C and 100 °C (40.96 µV/K) than between 500 °C and 600 °C (42.61 µV/K), so you'd expect some sag at the 100 °C point if they just average out the errors over the 760 °C range.
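
As a rough illustration of the size of that effect, suppose the meter converted the thermocouple voltage with a single fixed sensitivity instead of linearizing the curve. The 4.096 mV figure below follows from the 40.96 µV/K average quoted above; the 41.5 µV/K single-slope factor is purely an assumed value for the sake of the example:

```python
# Sag from a hypothetical single-slope (non-linearized) conversion.

E_AT_100C_MV = 4.096            # Type K EMF at 100 degC, cold junction at 0 degC
ASSUMED_METER_UV_PER_K = 41.5   # made-up single conversion slope for illustration

reading_c = E_AT_100C_MV * 1000.0 / ASSUMED_METER_UV_PER_K
print(f"Indicated temperature at a true 100 degC: {reading_c:.1f} degC")
# ~98.7 degC -- a couple of degrees of sag, before ADC and CJC errors are added
```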

Most older-design multimeters are very linear (can't speak for yours), but some newer ones use different ADC methods and are not impressive. So if yours is very linear, 'calibrating' it may work, but only if the ambient temperature (not the temperature being measured) is very stable and close to the calibration conditions. The cold-junction compensation of the multimeter is probably not too impressive either; the internal temperature sensor's error contributes approximately 1:1 to the reading.
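
A minimal sketch of why that is, assuming a roughly constant Type K sensitivity of about 41 µV/K (an approximation, not the meter's actual algorithm):

```python
# Why a cold-junction sensor error passes roughly 1:1 into the reading.

SEEBECK_UV_PER_K = 41.0  # assumed average Type K sensitivity near 0..100 degC

def meter_reading(hot_junction_c: float, true_cold_junction_c: float,
                  cold_junction_error_c: float) -> float:
    # The thermocouple voltage depends only on the hot/cold junction difference.
    v_uv = SEEBECK_UV_PER_K * (hot_junction_c - true_cold_junction_c)
    # The meter adds back its *estimate* of the cold-junction temperature.
    estimated_cj = true_cold_junction_c + cold_junction_error_c
    return v_uv / SEEBECK_UV_PER_K + estimated_cj

print(meter_reading(100.0, 23.0, 0.0))  # 100.0
print(meter_reading(100.0, 23.0, 2.0))  # 102.0 -> a 2 degC CJ error shows up 1:1
```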

Frankly, in the 0..100 °C range you'd probably be better off buying a precision NTC thermistor or a 1 kΩ RTD and measuring the resistance on the DMM.
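
For example, with an NTC you could read the resistance on the DMM and convert it with the simple beta-parameter model; the R25 = 10 kΩ and β = 3950 K values below are assumed part parameters, and a real part would come with its own curve (or Steinhart-Hart coefficients):

```python
# Convert a measured NTC resistance to temperature (beta-parameter model).

import math

R25 = 10_000.0   # assumed nominal resistance at 25 degC, in ohms
BETA = 3950.0    # assumed beta constant, in kelvin
T25_K = 298.15   # 25 degC in kelvin

def ntc_temperature_c(resistance_ohm: float) -> float:
    inv_t = 1.0 / T25_K + math.log(resistance_ohm / R25) / BETA
    return 1.0 / inv_t - 273.15

print(ntc_temperature_c(10_000.0))  # 25.0
print(ntc_temperature_c(33_600.0))  # ~0 degC for these assumed part values
```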