Considering analog meters as displays, which would be more versatile – voltmeters or ammeters?

Tags: analog, meter, microcontroller

Note: I'm here to see if there's an objective reason to choose one type over the other.

I've got some projects in mind that would use a microcontroller to drive one or more analog meters as displays, either with PWM or with a direct D/A output of up to 3.3 V.
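To be concrete, here's roughly what I have in mind for the PWM case (Arduino-style; the pin, movement rating, and series resistor are placeholders, not settled choices):

    // Rough sketch of the PWM drive idea. A series resistor scales the
    // 3.3 V PWM swing to the movement's full-scale current; the needle's
    // own inertia (plus an optional RC filter) averages the PWM.
    const int METER_PIN = 9;  // assumed PWM-capable pin

    void setup() {
      pinMode(METER_PIN, OUTPUT);
    }

    // Deflect the needle to a given percentage of full scale.
    void showPercent(int percent) {
      analogWrite(METER_PIN, map(percent, 0, 100, 0, 255));
    }

    void loop() {
      showPercent(50);  // park the needle at half scale
    }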

3.3 V voltmeters are relatively hard to find (though one could scale to 1 V), but a bigger consideration is responsiveness (or the lack thereof, given damping).

Is there an objective reason one would choose an ammeter versus a voltmeter when driving it from the output of a microcontroller?

Edit: I'm reminded that both types of meter are fundamentally current-sensitive mechanisms. That said, one type may be easier than the other to convert to its complementary use (without modifying it internally), so answers along those lines are consistent with this question.

I've found a variety of projects out there using analog meters – some using voltmeters and some using ammeters – but I've yet to see a rationale given for choosing one over the other. Hence my question here.

Best Answer

Meter movements with less sensitivity tend to be easier to work with, up to a point anyway. Something in the 1 mA full-scale region holds up better than a 25 µA movement under vibration and so on.

Note that the base movement responds to current; a voltmeter is simply such a movement with a resistor in series.

Electromechanical damping is related to the impedance the meter movement "sees" driving it: a shorted movement is damped more heavily than one driven from a current source. A meter sold as a voltmeter has a relatively high-value resistor internally in series with the coil, so its damping will be limited. A voltmeter designed to load the source lightly (high ohms-per-volt, in multimeter terms) will have a relatively sensitive movement, and will therefore be less robust than one that requires more current (lighter spring, more susceptible to jewel friction or taut-band hysteresis, etc.).
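To make the scaling concrete, here is the series-resistor arithmetic in code form (the 1 mA, 100 Ω movement is an assumed example, not a recommendation):

    // Turning a bare movement into a 3.3 V full-scale voltmeter.
    // Assumed example: 1 mA full-scale movement with a 100 ohm coil.
    const float V_FULLSCALE = 3.3;    // volts at full deflection
    const float I_FULLSCALE = 1e-3;   // amps (1 mA movement)
    const float R_COIL      = 100.0;  // ohms, coil resistance

    // R_series = V/I - R_coil = 3300 - 100 = 3200 ohms
    const float R_SERIES = V_FULLSCALE / I_FULLSCALE - R_COIL;

That works out to 1000 Ω/V (3300 Ω total across 3.3 V), which is the ohms-per-volt figure mentioned above; a more sensitive 25 µA movement would give 40 kΩ/V, with the robustness penalty already noted.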

Unfortunately, driving the movement itself from a low-impedance constant-voltage source makes it relatively sensitive to temperature: copper has a tempco of about +3950 ppm/K. If you are using a microcontroller, you could sense the temperature and adjust the drive to roughly compensate (to the extent that your sensor tracks the coil temperature). Where such compensation was required in the old days, we would sometimes put an NTC thermistor in series with the coil, for example in a millivoltmeter without an amplifier.
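A minimal sketch of that software compensation (the temperature-read helper is a hypothetical placeholder, and the math assumes a low-impedance, constant-voltage drive):

    // Hypothetical placeholder: estimate of coil temperature in Celsius.
    float readCoilTempC() {
      return 25.0;  // replace with an actual sensor reading
    }

    const float TEMPCO  = 3950e-6;  // copper: about +3950 ppm/K
    const float T_CAL_C = 25.0;     // temperature where drive was calibrated

    // Scale an 8-bit PWM duty so coil current stays roughly constant:
    // coil resistance rises with temperature, so with a constant-voltage
    // drive the current falls unless the drive voltage rises to match.
    int compensatedDuty(int duty) {
      float scale = 1.0 + TEMPCO * (readCoilTempC() - T_CAL_C);
      int out = (int)(duty * scale + 0.5f);
      return out > 255 ? 255 : out;
    }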

I suppose they could have wound the coil with Manganin or Constantan resistance wire rather than copper, but I have never seen that.


In the above, I'm referring to moving coil meters, as opposed to moving iron types.