Electronic – Why do measurements displayed on a multimeter inconsistently vary by factors of 1000 depending on the range

measurement · multimeter · resistance

For my science fair project, I am measuring the resistance of a 100-foot piece of 32 gauge Nichrome wire. Using an Innova 3300 digital multimeter, I get the following results at each resistance range setting:

  • 200 ohms: 1 (at the far left, indicating over-range)
  • 2000 ohms: 1025-1030
  • 20k ohms: 1.02-1.03
  • 200k ohms: 01.0
  • 20M ohms: 0.00

I understand that precision varies with the range setting, so the right-most digits make sense to me. However, I don't understand the scale of the left-most digits. Why aren't they all around 1000 if the actual resistance doesn't change?

I have read the manual several times but it just says:

  1. Plug the RED test lead into the "Ω" jack of the multimeter; plug the BLACK test lead into the "COM" jack.
  2. Set the meter’s Function/Range Selector Switch to the OHM "Ω" range function.
  3. Place the RED test lead onto one side of the item being tested and the BLACK test lead onto the other side of the item. (Polarity does not matter when checking resistance.)
  4. Read the results on the display

There is no mention of what the results mean. After thinking about this with my dad and searching the internet, I think I have an answer: the displayed value is expressed in the units of the selected range. For example, the 200 and 2000 ranges read in ohms, the 20k and 200k ranges read in kilo-ohms, and the 20M range reads in mega-ohms.

Is my hypothesis correct? And what about the leading zero in the 200k range?

Best Answer

The relevant part of the manual is its table of display resolutions for each range:

200 Ohm range can show 000.0 - 199.9 Ohm
2000 Ohm range can show 0000 - 1999 Ohm
20K Ohm range can show 00.00 - 19.99 KOhm
200K Ohm range can show 000.0 - 199.9 KOhm
20M Ohm range can show 00.00 - 19.99 MOhm
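That table can be turned into a small sketch. This is not the meter's actual firmware, just an illustration of how a 3½-digit display maps a resistance to digits on each range; the full-scale values come from the table above, and the format strings are an assumption about the display layout.

```python
# Sketch: how a 3 1/2-digit meter shows a resistance on each range.
RANGES = {
    # label: (full scale in ohms, divisor to display unit, digit format)
    "200":  (200,        1,         "{:05.1f}"),  # 000.0 - 199.9 Ohm
    "2000": (2000,       1,         "{:04.0f}"),  # 0000  - 1999  Ohm
    "20k":  (20_000,     1_000,     "{:05.2f}"),  # 00.00 - 19.99 kOhm
    "200k": (200_000,    1_000,     "{:05.1f}"),  # 000.0 - 199.9 kOhm
    "20M":  (20_000_000, 1_000_000, "{:05.2f}"),  # 00.00 - 19.99 MOhm
}

def display(ohms, range_label):
    """Return the digits the display would show for `ohms` on a range."""
    full_scale, divisor, fmt = RANGES[range_label]
    if ohms >= full_scale:
        return "1"                     # over-range: lone "1" at the far left
    return fmt.format(ohms / divisor)  # shift into the range's display unit

# A roughly 1027 Ohm wire reproduces the readings in the question:
for rng in RANGES:
    print(rng, "->", display(1027, rng))
```

For 1027 Ω this prints 1, 1027, 01.03, 001.0 and 00.00 for the five ranges, the same pattern as the measurements in the question.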

Regarding your results, they should be interpreted as:

2000 ohms range: 1025-1030 Ohm
20k ohms range: 01.02K - 01.03K Ohm (i.e. 1020-1030 Ohm)
200k ohms range: 001.0K Ohm (i.e. about 1000 Ohm)
20M ohms range: 00.00M Ohm (the last digit is worth 10 KOhm, so ~1 KOhm reads as zero)

So your hypothesis is correct: every range agrees on roughly 1.03 KOhm, and the coarser ranges simply lack the resolution to show more digits. The leading zero on the 200k range is just an unused display digit; the display always shows the same number of digit positions, and a value this small doesn't need the left-most ones.
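To see that the ranges agree, you can undo the display scaling by multiplying each reading by the unit its range implies (Ohm, KOhm or MOhm). A minimal sketch using the readings from the question:

```python
# Convert each displayed reading back to plain ohms using the unit
# implied by its range; the readings are taken from the question.
unit  = {"2000": 1, "20k": 1_000, "200k": 1_000, "20M": 1_000_000}
shown = {"2000": 1027.0, "20k": 1.03, "200k": 1.0, "20M": 0.0}

ohms = {rng: shown[rng] * unit[rng] for rng in unit}
print(ohms)  # {'2000': 1027.0, '20k': 1030.0, '200k': 1000.0, '20M': 0.0}
```

Each coarser range is just rounding the same ~1.03 KOhm value to its own resolution, which is why the 20M range, whose last digit is worth 10 KOhm, shows 0.00.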