Does it ever make sense to have more precision than accuracy? E.g. if you had a variable resistor which you could control with a resolution of 0.01 ohms but your multimeter only had 1% accuracy, then that extra resolution would be pretty useless wouldn't it?
Electronic – Accuracy vs Precision
Tags: accuracy, precision
Related Topic
- Electronic – How to measure bipolar analog signal accurately (to 1mV) on raspberry pi
- Electrical – precision current sink for battery discharge
- Electronic – DC motor RPM accuracy/precision
- Electronic – *Very* high precision, *very* stable current source
- Electronic – Precision Adjustable Voltage Reference
- Electronic – Accuracy of manufactured photocells (LDR) with respect to light intensity
Best Answer
Absolutely. There are many applications in which you're much more interested in the relative changes in a signal than in its absolute magnitude.
Take digital audio, for example. It is the precision of the converters that gives you the high signal-to-noise ratio you're interested in. The absolute accuracy (i.e., whether full-scale is 1.0 or 1.1 V) is not generally all that important — it just means the signal overall sounds a little softer or louder, which you can compensate for by adjusting the volume control.
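The asker's resistor scenario can be sketched numerically. Below is a toy model (the 1% gain error and 0.01 Ω resolution come from the question; the function names and exact figures are illustrative assumptions): a meter that reads 1% high still resolves a 0.05 Ω *change* between two settings, even though both absolute readings are off by about an ohm.

```python
# Toy model of precision vs. accuracy: a meter with a 1% systematic
# gain error (limited accuracy) but 0.01-ohm display resolution
# (high precision). Illustrative sketch, not a real instrument model.

GAIN_ERROR = 1.01   # meter reads 1% high; systematic, unknown to the user
RESOLUTION = 0.01   # smallest displayable step, in ohms

def measure(true_ohms):
    """Return what the meter displays for a given true resistance."""
    raw = true_ohms * GAIN_ERROR                   # accuracy limit: gain error
    return round(raw / RESOLUTION) * RESOLUTION    # precision limit: quantize

r1 = measure(100.00)   # displayed ~101.00 ohms: 1 ohm absolute error
r2 = measure(100.05)   # displayed ~101.05 ohms: same 1% error

print(r1, r2)
print(round(r2 - r1, 2))   # the 0.05-ohm *change* is still resolved
```

The point mirrors the answer: the systematic gain error shifts every reading by the same factor, so differences between readings (the "relative changes") survive almost untouched, just as a constant volume offset does in the audio example.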