Electronics – Trouble understanding why a multimeter gives different resistance readings

multimeter, ohmmeter, resistance

I want to preface this by saying that I am a hobby electronics guy with some experience from reading The Art of Electronics. I know how to use a multimeter, but I am very confused by what I am seeing.

I am using a marginally more expensive Circuit Specialists multimeter and testing the resistance across a set of fuel injectors I am working on. My test procedure is as follows:

  1. Set the multimeter to the 200 ohm range
  2. Measure the resistance across the male side plugs

I was looking for a reference range of 10-14 ohms, and I got 15.3 ohms. Concerning. So, out of curiosity, I measured again:

  1. Set the multimeter to the 2 kilohm (2k) range
  2. Measure the resistance across the male side plugs.

However, this time I got 0.012! That's 12 ohms, right in the middle of the range.

It may be as simple as my multimeter not being very high quality, but I was hoping there was a better explanation. Having one measurement outside spec and one inside spec really bothers me.

EDIT:

My multimeter is model #MT-5211:

https://www.circuitspecialists.com/3-1-2-digital-lcr-multimeter.html

Best Answer

Since no one from the comments posted an answer, I will post it here.

First:

My multimeter is not the best. According to the manual, on the 200 ohm setting I need to short the probes together and record the result (the lead/zero offset). After taking a measurement, I subtract that offset from the displayed value to get the actual resistance.
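
For illustration, here is that correction as a small Python sketch; the 3.3 ohm lead offset below is just an assumed example value, not what my probes actually read:

```python
# Zero-offset correction on the 200 ohm range:
# the meter reads the injector coil PLUS the test leads, so the shorted-probe
# reading has to be subtracted from every measurement taken on that range.

lead_offset = 3.3   # ohms, shorted-probe reading (assumed example value)
displayed = 15.3    # ohms, what the meter showed across the injector

actual = displayed - lead_offset
print(f"Corrected resistance: {actual:.1f} ohms")  # 12.0 ohms

spec_low, spec_high = 10.0, 14.0
print("Within spec" if spec_low <= actual <= spec_high else "Out of spec")
```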

Second:

The injector specifications had a temperature listed in a little box next to the resistance tolerances, which I had overlooked. Since resistance depends on temperature, this was important.

By doing the shorted-probe correction and then letting the fuel injectors cool down to ~70 degrees, I was able to get a reading that was within specification.
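
To see why temperature matters, here is a rough sketch using the temperature coefficient of copper (about 0.39 %/°C) for the injector winding; the temperatures and the 12 ohm nominal value are assumed example numbers, not my actual measurements:

```python
# Rough estimate of how a copper winding's resistance shifts with temperature:
# R(T) ~= R_ref * (1 + alpha * (T - T_ref)), with alpha ~= 0.00393 per degree C
# for copper. All values below are assumed examples.

alpha = 0.00393   # 1/°C, temperature coefficient of copper
r_ref = 12.0      # ohms at the reference temperature (assumed)
t_ref = 20.0      # °C reference temperature (assumed)

for t in (20.0, 50.0, 80.0):
    r = r_ref * (1 + alpha * (t - t_ref))
    print(f"{t:5.1f} °C -> {r:5.2f} ohms")
# A winding that is ~12 ohms at room temperature can read above 14 ohms
# while the injector is still hot from the engine.
```

That, combined with the lead-resistance offset described above, is plausibly enough to explain a 15.3 ohm reading on a coil that is nominally 12 ohms.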

I appreciate everyone's help!