Electronics – Accuracy vs Precision


Does it ever make sense to have more precision than accuracy? For example, if you had a variable resistor that you could adjust with a resolution of 0.01 ohms but your multimeter only had 1% accuracy, that extra resolution would be pretty useless, wouldn't it?

Best Answer

Absolutely. There are many applications in which you're much more interested in the relative changes in a signal than in its absolute magnitude.

Take digital audio, for example. It is the precision of the converters that gives you the high signal-to-noise ratio you're interested in. The absolute accuracy (i.e., whether full-scale is 1.0 or 1.1 V) is not generally all that important — it just means the signal overall sounds a little softer or louder, which you can compensate for by adjusting the volume control.
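To make that concrete, here's a minimal NumPy sketch of the idea (the bit depth, reference voltages, and helper names are illustrative assumptions, not any particular converter): a 16-bit conversion with a 10% gain error on the full-scale reference still delivers a high SNR, while the accuracy error shows up only as a uniform scale factor on the output, i.e. a level offset.

```python
# Minimal sketch: high precision (16 bits) combined with imperfect absolute
# accuracy (a 10% error in the full-scale reference). All names and values
# here are illustrative only.
import numpy as np

BITS = 16
FS_NOMINAL = 1.0   # full-scale voltage we *assume* when interpreting codes
FS_ACTUAL = 1.1    # full-scale voltage the converter *really* has

def adc(voltage, full_scale, bits=BITS):
    """Convert a voltage to an integer code, referenced to `full_scale`."""
    step = 2 * full_scale / 2**bits
    return np.round(voltage / step).astype(int)

def dac(code, full_scale, bits=BITS):
    """Convert an integer code back to a voltage, referenced to `full_scale`."""
    step = 2 * full_scale / 2**bits
    return code * step

# 1 kHz test tone, 0.5 V amplitude, sampled at 48 kHz for one second.
fs = 48_000
t = np.arange(fs) / fs
tone = 0.5 * np.sin(2 * np.pi * 1_000 * t)

codes = adc(tone, FS_ACTUAL)     # conversion uses the real (slightly off) reference
out = dac(codes, FS_NOMINAL)     # playback assumes the nominal reference

# Accuracy error: everything is scaled by the same factor (~0.91x), which is
# just an overall level change you could undo with the volume control.
gain = np.max(np.abs(out)) / np.max(np.abs(tone))
print(f"overall gain error: {gain:.3f}x")

# Precision: after removing that uniform gain, the residual is just the
# quantization noise, so the signal-to-noise ratio is still around 90 dB.
residual = out - gain * tone
snr_db = 10 * np.log10(np.mean((gain * tone)**2) / np.mean(residual**2))
print(f"SNR: {snr_db:.1f} dB")
```

Trimming the reference so that FS_ACTUAL matched FS_NOMINAL would fix the accuracy, but it wouldn't change the SNR at all; that's exactly the distinction between accuracy and precision being drawn here.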