How does a multimeter measure voltage?


I am using a Fluke 88 multimeter. I understand how it measures current (amperes): the meter is connected in series, so the current flows through the multimeter and it measures the resulting voltage drop. That part is straightforward.

However, I have no idea how the multimeter works when measuring voltage in a parallel connection.

In brief, I would like to know how a multimeter measures voltage, current, and resistance (ohms).

Thanks.

Best Answer

Most DMMs use a dual-slope integrating analog-to-digital converter, such as the classic Intersil ICL7106. There are many more modern variations on this chip, but since this was one of the first, it has extensive application notes available. (I don't know which chip is used in the Fluke 88; it's certainly something newer than the ICL7106, but I'll refer to the ICL7106 since it's pretty much the archetype for all dual-slope DMMs.) Here's a brief overview of its principle of operation:

During the integrate phase, the input voltage is integrated onto a capacitor for a fixed number of clock counts; then, during the deintegrate phase, that capacitor is discharged at a controlled rate set by the voltage reference, while a timer counts how long it takes the integrator output to return to zero.

[Image: ICL7106 integrator waveform, from the Intersil/Renesas ICL7106 datasheet]

That "deintegrate phase" timer count determines the actual number that is displayed on the DMM's readout. A "3 1/2 digit" DMM counts up to 1999, and a "4 1/2 digit" DMM counts up to 19999. Your Fluke 88 is a 6000-count meter in its normal 4-readings-per-second mode, and a 19999-count meter in high-resolution mode.
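The counting scheme can be sketched in a few lines of Python. This is a simplified model of the dual-slope principle, not the actual ICL7106 internals; the 1000-count integrate phase and 1.000 V reference are illustrative values.

```python
# Simplified model of a dual-slope conversion: integrate the input for a
# fixed number of clock counts, then count how long the reference takes to
# bring the integrator back to zero. R and C cancel out of the result.

def dual_slope_counts(v_in, v_ref, fixed_counts=1000):
    """Return the deintegrate-phase count for input voltage v_in.

    The charge accumulated during the integrate phase is proportional to
    v_in * fixed_counts; it is removed at a rate proportional to v_ref,
    so the deintegrate count is fixed_counts * v_in / v_ref.
    """
    return round(fixed_counts * v_in / v_ref)

# With a 1.000 V reference, a 1.234 V input gives 1234 counts,
# which the display logic would show as "1.234".
print(dual_slope_counts(1.234, 1.000))  # 1234
```

Note that the component values of the integrator (R and C) drop out of the math; only the ratio of input to reference matters, which is what makes the dual-slope topology forgiving of component tolerances.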

The integrating capacitor needs low dielectric absorption, so it is typically a film type (mylar, polypropylene) or mica rather than ceramic or electrolytic. It's important that all of the charge stored in the capacitor during the sampling interval gets counted during the deintegrate phase; otherwise the DMM will not be accurate.

The display counts are driven directly by the ICL7106, which contains a driver specifically for a triplex liquid-crystal display (LCD). There are other variants of this chip for driving LED displays, or for reporting the count directly to a microcontroller.

How does it know how much a Volt is?

The DMM contains a voltage reference. This may be as simple as a Zener diode (in a cheap handheld 3 1/2 digit meter) or as complex as an oven-controlled voltage reference IC. This is one of the performance limitations of the instrument, so be sure to look in the user's manual: there will be a table that describes how accurate the meter actually is. For example, +/- (0.1% + 1) means a gain error of up to 0.1% of the actual value, plus the reading may be off by one count, in either direction. So if the meter is properly calibrated and it reports 12.34V, then the real input voltage is somewhere between 12.31766V and 12.36234V.
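That error window can be reproduced directly. This sketch just restates the +/-(0.1% + 1 count) arithmetic from the example above; the 0.01 V count value matches the 12.34 V reading, and none of this is a Fluke 88 specification.

```python
# Worked example of a +/-(gain% + N counts) accuracy spec.

def accuracy_window(reading, gain_pct, counts, count_value):
    """Return (low, high) bounds for a displayed reading.

    gain_pct   -- gain error as a percentage of the reading
    counts     -- how many counts the last digit may be off by
    count_value-- the value of one count at this range (here 0.01 V)
    """
    err = reading * gain_pct / 100 + counts * count_value
    return reading - err, reading + err

low, high = accuracy_window(12.34, 0.1, 1, 0.01)
print(f"{low:.5f} V .. {high:.5f} V")  # 12.31766 V .. 12.36234 V
```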

[Image: Fluke 88 specifications table; see page 68 of the user's manual]

The ICL7106 has an auto-zero phase that tries to eliminate offset errors, but its gain error depends on the performance of the voltage reference. This is one of the performance tests that would be checked by a professional calibration shop.

How does it measure a voltage other than the 2V range?

The ICL7106 itself expects to measure a signal in the range between 0.001V and 2V, so to measure higher or lower voltages, some additional circuitry is used to scale the external input signal. This range selection circuit could be a mechanical rotary switch or a bank of relays. The range selection also adjusts where the decimal point is shown, and may also determine what units are displayed.

Measuring a 20V signal can be done using a 10:1 resistive divider, 12.34V would be divided down to 1.234V which would display as 1234 counts. The decimal point would be placed in the hundreds place "12.34V".

Measuring 200V can be done using a 100:1 divider, 123.4V would be divided down to 1.234V which would display as 1234 counts. The decimal point would be placed in the tens place "123.4V".

Measuring a 0.2V signal requires using an amplifier. 123.4mV would be amplified by 10V/V up to 1.234V which would display as 1234 counts. The decimal point would be placed in the ten-thousands place ".1234V" or in the tens place as "123.4mV", depending on what units are used.
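The three range examples above can be collected into one small sketch. The range table and scale factors here are illustrative, chosen to match the examples in the text; they are not the Fluke 88's actual ranges or front-end design.

```python
# Sketch of range scaling: pick a divider (or amplifier) so the signal
# lands inside the ADC's roughly 0..2 V window, then place the decimal
# point on the display to match the range.

RANGES = {           # full scale : (scale factor into the ADC, display format)
    0.2:   (10.0, "{:.4f} V"),   # x10 amplifier
    2.0:   (1.0,  "{:.3f} V"),   # straight through
    20.0:  (0.1,  "{:.2f} V"),   # 10:1 resistive divider
    200.0: (0.01, "{:.1f} V"),   # 100:1 resistive divider
}

def display(v_in, full_scale):
    scale, fmt = RANGES[full_scale]
    v_adc = v_in * scale           # what the ADC actually sees
    counts = round(v_adc * 1000)   # 2 V full scale -> 2000 counts
    return counts, fmt.format(v_in)

print(display(12.34, 20.0))   # (1234, '12.34 V')
print(display(123.4, 200.0))  # (1234, '123.4 V')
```

Note that both examples produce the same 1234 counts at the ADC; only the decimal-point placement (and possibly the units annunciator) changes between ranges.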

How does it measure current?

If measuring unknown current, an internal shunt resistor of known value is used, then the voltage developed across the shunt is what is displayed by the DMM. So if the internal shunt resistor is 10 ohm, and the unknown current is 12.34mA, then the voltage is 123.4mV or 0.1234V. This would appear on the display as 1234 counts, so the decimal point would be placed in the hundreds place "12.34".

This is complicated by the burden voltage, the actual voltage developed across the current-sensing shunt resistor. Too much burden voltage can affect the circuit being measured, but too small a burden voltage makes the signal too small to measure accurately.

The shunt resistance also changes with temperature and with self-heating. Power dissipation is also a factor; that's why there is often a separate 10A range with its own shunt resistor. These are always fuse protected (or else).
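The shunt arithmetic from the example above can be written out as a quick check, using the 10 ohm shunt and 12.34 mA figures from the text:

```python
# Current measurement via a shunt: the meter really digitizes the burden
# voltage across a known resistor, then Ohm's law recovers the current.

def current_from_shunt(v_shunt, r_shunt):
    """Ohm's law: I = V / R."""
    return v_shunt / r_shunt

r_shunt = 10.0          # ohms, the internal shunt from the example
i_unknown = 12.34e-3    # amperes, the current being measured
v_burden = i_unknown * r_shunt   # the voltage the ADC sees (burden voltage)

print(f"{v_burden * 1000:.1f} mV")  # 123.4 mV
print(f"{current_from_shunt(v_burden, r_shunt) * 1000:.2f} mA")  # 12.34 mA
```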

How does it measure resistance?

If measuring unknown resistance, an internal current source of known current is driven through the unknown resistance. The voltage developed across the unknown resistance is what is displayed by the DMM. So if the internal current source is 10mA and the unknown resistor is 74.5 ohms, then the voltage is 745mV or 0.7450V. This would appear on the display as 745 counts, so the decimal point would be placed in the tens place "74.5".
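The resistance measurement works out the same way; this sketch uses the 10 mA source and 74.5 ohm figures from the example above:

```python
# Resistance measurement: force a known current through the unknown
# resistor, measure the resulting voltage, and apply Ohm's law.

def resistance_from_voltage(v_measured, i_source):
    """Ohm's law rearranged: R = V / I."""
    return v_measured / i_source

i_source = 10e-3          # amperes, the known internal current source
v = 74.5 * i_source       # 0.745 V developed across the unknown resistor

print(f"{resistance_from_voltage(v, i_source):.1f} ohm")  # 74.5 ohm
```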

Continuity and diode ranges are also based on the same principle.