I would suggest doing this in a small FPGA. What you want is clockwork, and that is more suited to an FPGA than a CPU. The VHDL for it is trivial.
You can get cheap FPGA dev boards, e.g. from Actel (IGLOO nano), that have all the hardware on board that you need. You just need to connect a level translator for the MIDI and a dot-matrix driver, e.g. those given in other answers. The dev board has pins to do that. Instead of a PCB you can even use stripboard. I've connected an LCD matrix display in this way.
Igloo Dev Board Manual
Modern digital oscilloscopes are sophisticated analog beasts!
Most modern high-speed digital and analog equipment, such as computer interfaces (USB, SATA, Gigabit Ethernet), is tested, designed and refined using digital oscilloscopes. Even many SoCs containing complex analog and digital peripherals are validated with them. For example, USB 3.0 can run at up to 5 Gbit/s; the interfaces are literally probed by digital oscilloscope inputs, with careful test setups built around them.
Even high-speed analog blocks such as ADCs, amplifiers, filters and oscillators are tested using DSOs.
However, from a purchase point of view, these are very expensive oscilloscopes. For the highest analog bandwidth available, the boxes from companies such as LeCroy (part of Teledyne now), Keysight (formerly Agilent's T&M division), Rohde & Schwarz and Tektronix may cost as much as a Ferrari!
But for most hobby use, student laboratories, or even decent embedded testing, there are value-for-money oscilloscopes from the above companies and many others around the world. There are also PC-based USB oscilloscope products (BitScope, PicoScope or USBee).
Digital oscilloscopes exist because they work! And engineers use them! I use them!
Most of the time, we expect more from a box and potentially feed it an unsuitable signal for analysis. A high-speed square pulse stream on a lower-bandwidth oscilloscope will look smoothed out, or even like a sine wave, because all the higher-frequency content of the signal is filtered out in the channel.
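A quick sketch of why this happens: a square wave is its fundamental plus odd harmonics, so a scope whose bandwidth only passes the fundamental shows you a sine. This is a minimal illustration (the function name and the 10 MHz / 20 MHz figures are just assumed for the example, not from any particular scope):

```python
import math

def bandlimited_square(t, f0, bw):
    """Sum the odd harmonics of an ideal square wave up to bandwidth bw.
    Fourier series: (4/pi) * sum over odd k of sin(2*pi*k*f0*t) / k."""
    total = 0.0
    k = 1
    while k * f0 <= bw:
        total += math.sin(2 * math.pi * k * f0 * t) / k
        k += 2
    return (4 / math.pi) * total

# A 10 MHz square wave viewed on a 20 MHz channel: only the fundamental
# survives, so the trace is just (4/pi)*sin(2*pi*f0*t) -- a sine wave.
f0 = 10e6
samples = [bandlimited_square(n / (100 * f0), f0, 20e6) for n in range(100)]
```

With the bandwidth raised to, say, 100 MHz, the k = 3, 5, 7, 9 harmonics come back and the edges of the square wave reappear.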
Here are a few questions you may want to ask yourself before choosing equipment:
- Ideally every signal has infinite bandwidth; it is just that the higher harmonics are very feeble. So choose the analog bandwidth of the scope based on your signal.
- Try to use the full dynamic range of the scope (full bit resolution vs. full scale). If your interest is in superimposed parts of a signal, like that sharp glitch on the sine-wave output of a switched power supply, go for a higher ADC resolution scope.
- If the signal is small, the scope will amplify it; if the signal is large, the scope will attenuate it to suit the full swing of the internal ADC. Sometimes you may want to use the auto-scale feature of the scope. If the signal amplitude is too small, amplifying it will also amplify some noise; if the signal has a large glitch, attenuating it will reduce its details.
We should also look into the merits of digital vs. analog scopes:
- Most DSOs have a sophisticated analog front end (AFE), which is software controlled and offers extra leverage depending on the signal. Signal conditioning, amplification and even isolation are handled in the digitally controlled AFE.
- Next to the AFE is the heart of a digital scope: a high-speed ADC. This technology has improved by leaps and bounds in the last decade.
- ADC samples are buffered in ping-pong or daisy-chained RAM before being pushed to a dedicated computer. If you know DSP, you will know the 'value' of digital samples!
- The raster/rendering of digital signals on a decent UI gives you multiple cursors (both horizontal and vertical), easy scale adjustments, visualization, attached measurements and multiple channels in one go!
- I think multiple channels, channel math and logic, and advanced triggering capabilities are the most useful features of a DSO.
However, if you admire pure analog signals directly imposing themselves on a phosphor screen, there is nothing wrong with that either!
Best Answer
Meter movements with less sensitivity tend to be easier to work with, to a point anyway. Something in the 1 mA full-scale region holds up better than a 25 µA movement under vibration, etc.
Note that the base movement responds to current; voltage meters have a resistor in series.
Electromechanical damping is related to the impedance the meter movement "sees" driving it. A shorted movement will be damped more than a movement driven with a current source. A meter sold as a voltmeter will have a relatively high value resistor internally in series with the coil so the damping will be limited. A voltage meter that is designed to lightly load the source (high ohms-per-volt in multimeter terms) will have a relatively sensitive meter movement and thus will be less robust than one that requires more current (lighter spring, more sensitive to jewel friction or band hysteresis etc).
Unfortunately, driving the movement itself with a low-impedance constant-voltage source makes it relatively sensitive to temperature: copper has a tempco of about +3950 ppm/K. If you are using a microcontroller, you could sense temperature and adjust the drive to roughly compensate (to the extent that your sensor tracks the coil temperature). Where such was required in the old days, we would sometimes use an NTC thermistor in series with the coil, for example for a millivoltmeter sans amplifier.
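The microcontroller compensation above can be sketched in a few lines. This is just an illustration of the arithmetic, assuming a hypothetical 100 Ω movement and the +3950 ppm/K copper tempco; the function names are made up:

```python
def coil_resistance(r25, temp_c, tempco_ppm=3950):
    """Copper coil resistance at temp_c, given its value r25 at 25 C."""
    return r25 * (1 + tempco_ppm * 1e-6 * (temp_c - 25))

def compensated_drive_mv(target_ma, r25, temp_c):
    """Drive voltage (mV) for target_ma through the coil at temp_c,
    scaled with measured temperature so the deflection stays constant."""
    return target_ma * coil_resistance(r25, temp_c)

# A 100-ohm movement needing 1 mA for full scale:
print(round(compensated_drive_mv(1.0, 100.0, 25), 2))  # 100.0 mV at 25 C
print(round(compensated_drive_mv(1.0, 100.0, 50), 2))  # 109.88 mV at 50 C
```

Note the size of the effect: a 25 K rise changes the coil resistance by nearly 10%, which is exactly the error an uncompensated constant-voltage drive would show on the needle.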
I suppose they could have wound the coil with Manganin or Constantan resistance wire rather than copper, but I have never seen that.
In the above, I'm referring to moving coil meters, as opposed to moving iron types.