How to test the multimeter's battery without a low battery indicator


A few months ago I needed a multimeter, and I didn't have one. Not wanting to wait for one to arrive, and only needing it once, I went out to the hardware store and bought the cheapest digital multimeter I could find, which I guess is why you can't find it online anywhere. :P I had no problem accepting low accuracy and build quality (if this one breaks, I'll get a newer, better one).

I ended up using it more than I imagined, and it has enough accuracy for my needs (everything I measure is within ±1.2%). The only drawback is that it doesn't have a low battery indicator. Had I known I'd be using it this much, I would have bought a better one, but this one is sufficient for testing basic voltages, currents, and resistances, and for easily finding a resistor's value without reading the color codes. Is there a way I can check the battery without opening the meter up and using a battery tester? I don't want to mess around with it too much, and I have no real need to upgrade beyond this; the money could be better spent on other parts that I want.

Best Answer

It's easy to test the accuracy of your multimeter with a simple resistor.

You don't need anything besides your multimeter. This method doesn't actually test the battery; it tests the accuracy of the multimeter, which degrades as the battery drains.

Let's do some math:

Before we can start, you'll need to pick out a resistor to use. For the best possible accuracy check of your meter, you'll want as small a tolerance as you can afford (I'm using a 5% one just because the accuracy doesn't matter that much to me). If you can afford/have time to get/want a smaller tolerance, and you can justify the extra cost, go ahead; it will work perfectly fine, in fact a little better. However, since you/I have a cheap multimeter to begin with and I don't care much about accuracy, I wouldn't bother with the cost. If you're testing the meter for some reason and comparing its accuracy to others, you may want a better resistor for that purpose, but that's off topic. As for the resistance, remember: the lower the resistance, the better the accuracy check. I'll explain why a little later. That being said, if you have a 5 milliohm resistor (if there is such a thing): a) if you have enough money to keep that kind of resistor in stock, or you're doing electronics detailed enough to need that little resistance, WHY are you not upgrading (or at least shipping one to me =P), and b) will a multimeter that cheap really measure such a small voltage?


If you have two resistors of different resistances and tolerances, and don't know which to pick:

I have a 33 Ω, 5% tolerance RadioShack resistor and a cheap 150 Ω, 2% tolerance resistor. Which do I choose? There's a simple equation to tell: $$Accuracy = Resistance*Tolerance$$ The resistor with the smallest accuracy number (weird... I know, right?) is the best choice. With the RadioShack at 165 and the other one at 300, I chose the RadioShack.
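
If you'd rather let code do the picking, here's a minimal Python sketch of that rule. The two resistors are the ones from my example; swap in your own values.

```python
# Pick the resistor with the smallest "accuracy number":
# accuracy = resistance (ohms) * tolerance (percent).
# A smaller product means a tighter absolute tolerance band in ohms.
resistors = [
    ("RadioShack", 33.0, 5.0),   # 33 ohm, 5% tolerance
    ("cheap one", 150.0, 2.0),   # 150 ohm, 2% tolerance
]

for name, ohms, tol in resistors:
    print(f"{name}: {ohms} ohm * {tol}% = {ohms * tol}")

best = min(resistors, key=lambda r: r[1] * r[2])
print(f"Best choice: the {best[0]} ({best[1]} ohm)")
```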


Back to the math:

You'll need to know the accuracy percentage of your multimeter for the resistance range you'll use. I will be using the 200 Ω range with my 33 Ω resistor (yes, my multimeter is manual ranging with no battery indicator and cheap construction: don't judge). Mine is ±1% ±2D. Time to plug everything in (if your multimeter's spec doesn't include a digits accuracy ("D"), ignore the variable "A").

$$TotalToleranceFactor=(ResistorTolerancePercent/100+1)*(MultimeterTolerancePercent/100+1)$$


Also you need to calculate:

$$A=MultimeterDigitsAccuracy*LowestValue$$

(The "Lowest_Value" is what the lowest number your display can display besides 0 on your rance. EX: 00.1)


The final equations, into which you plug TotalToleranceFactor and A:

$$UpperRange=TotalToleranceFactor*(ResistorResistance+A)$$ and... $$LowerRange=(ResistorResistance-UpperRange)+ResistorResistance$$
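
Since the window is symmetric about the resistor's nominal value, that LowerRange equation is just a rearranged form of:

$$LowerRange = 2*ResistorResistance - UpperRange$$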


Example:

$$TotalToleranceFactor=(5/100+1)*(1/100+1) = 1.0605$$ $$A=2*0.1 = 0.2$$ $$UpperRange=1.0605*(33+0.2) = 35.2086$$ $$LowerRange=(33-35.2086)+33 = 30.7914$$
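
If you'd like to sanity-check the arithmetic, or reuse it with your own resistor and meter specs, here's a small Python sketch of the whole calculation. The in-range check at the end is my addition, using the reading from the photo further down.

```python
# Worked example from above: 33 ohm 5% resistor, meter spec of
# +/-1% +/-2 digits on the 200 ohm range (0.1 ohm resolution).
resistor_ohms = 33.0
resistor_tolerance_pct = 5.0
meter_tolerance_pct = 1.0
meter_digits = 2          # the "2D" part of the meter spec
lowest_value = 0.1        # smallest nonzero value the display shows on this range

factor = (resistor_tolerance_pct / 100 + 1) * (meter_tolerance_pct / 100 + 1)
a = meter_digits * lowest_value
upper = factor * (resistor_ohms + a)
lower = 2 * resistor_ohms - upper   # window is symmetric about the nominal value

print(f"TotalToleranceFactor = {factor:.4f}")  # 1.0605
print(f"A = {a:.1f}")                          # 0.2
print(f"UpperRange = {upper:.4f}")             # 35.2086
print(f"LowerRange = {lower:.4f}")             # 30.7914

reading = 33.0  # what my meter actually showed
if lower <= reading <= upper:
    print("Reading is in range: the battery looks fine.")
else:
    print("Reading is out of range: try a fresh battery first.")
```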


You're done with math. YAY!!! What we just calculated is the window your multimeter's reading should fall within, found by combining the accuracy of the multimeter with the tolerance of the resistor in what looks like a very complex way (but isn't; I recited it off the top of my head to make sure I hadn't forgotten it before I wrote this). Pull out your multimeter and measure the resistor. I held the probes on until the measurement stabilized and stopped changing. If the reading is within the range, your battery is fine and the meter is achieving its theoretical accuracy; if it's not, don't worry. You can try replacing the battery, but if the meter is old, it can drift out of calibration. Since it's a cheap multimeter, you can't recalibrate it, so upgrading might be a better choice if a new battery doesn't fix it and you want more accuracy. I would have replaced my battery if it read 10 Ω, but 30 is probably close enough for me. Enjoy!!!

Since this took a long time to write, I may have messed something up in the equations. Feel free to edit it for clarity or to comment. One thing to remember: calculate the part inside the parentheses before the rest of the equation.


Edit: I have some pictures:

*Multimeter showing 33.0 Ω.* That is my reading on a 33 Ω resistor. I don't think my multimeter's battery is going bad any time soon. I'm not going to replace it. :P