Electrical – Multimeter diode mode: do different multimeters show approximately the same results?

measurement, multimeter

I think that in many cases the voltage a multimeter shows in diode mode depends on the test current it uses for that mode, so the values measured by different multimeters in diode mode will often differ. Am I right? If so, why isn't there a universally accepted standard, so that all multimeters use the same test current in diode mode and the measured values can be compared (which could be helpful in diagnostics)?

Best Answer

Typical multimeter diode-test currents are around 1 mA, with a fair amount of variation, and they are not well standardized. But this isn't a massive issue, given the exponential I-V characteristic of a properly functioning diode under forward bias.

Furthermore, if I'm using a multimeter for a diode test, I'm not looking for a precise value anyway. Instead, I'm looking to see whether I am hitting a diode drop at all. In practice, if I really cared about my diode's saturation current, I'd grab something more precise than a multimeter.

The current in a forward-biased diode is given by:

$$I_D = I_S\cdot e^{\frac{V_F}{nV_T}}$$

where \$V_T\$ is the thermal voltage (about 26 mV at room temperature), \$n\$ is the ideality factor (typically between 1 and 2, depending on the semiconductor and construction), and \$I_S\$ is the saturation current, a constant for a given diode and set of environmental conditions (e.g. temperature).
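
For a concrete feel, here is a small Python sketch that inverts the equation above to compute \$V_F\$ at a few test currents. The values of \$I_S\$ and \$n\$ are assumptions (roughly what a small silicon signal diode might have), not measurements of any particular part:

```python
import math

# Assumed, illustrative parameters -- roughly what a small silicon signal
# diode might have at room temperature; real parts and temperatures vary.
I_S = 4e-9    # saturation current, amperes (assumption)
n = 2.0       # ideality factor (assumption; the worst case discussed below)
V_T = 0.026   # thermal voltage at room temperature, ~26 mV

def forward_voltage(i_test):
    """Invert I_D = I_S * exp(V_F / (n * V_T)) to get V_F at a given test current."""
    return n * V_T * math.log(i_test / I_S)

# Forward voltage that different meters would report on the same diode,
# depending on the test current they happen to use.
for i_test_ma in (0.1, 0.5, 1.0, 2.0):
    v_f = forward_voltage(i_test_ma * 1e-3)
    print(f"{i_test_ma:4.1f} mA -> {v_f:.3f} V")
```

With these assumed parameters, the reported drop moves by only about 0.15 V across a 20x spread of test currents.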

It follows from this math that the forward voltage of the diode will vary very little even for moderate differences in current; in the worst case (ideality factor of 2 corresponding to indirect bandgap semiconductors), a 10x difference in current between two meters will only yield a ~0.1 V discrepancy.
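
Working that worst case out explicitly: taking the ratio of the diode equation at two test currents \$I_1\$ and \$I_2\$ and solving for the voltage difference gives

$$\Delta V_F = nV_T\ln\!\left(\frac{I_2}{I_1}\right) = 2 \times 26\,\mathrm{mV} \times \ln(10) \approx 0.12\ \mathrm{V},$$

which is where the ~0.1 V figure comes from.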