Electronic – Is analog signal arithmetic faster than digital arithmetic?

analog, computer-architecture, digital-logic, floating-point, signal-processing

Would it be theoretically possible to speed up modern processors by using analog signal arithmetic (at the cost of accuracy and precision) instead of digital FPUs (CPU -> DAC -> analog FPU -> ADC -> CPU)?

Is analog signal division possible (as FPU multiplication often takes one CPU cycle anyway)?

Best Answer

Fundamentally, all circuits are analog. The problem with performing calculations on analog voltages or currents is a combination of noise and distortion. Analog circuits are subject to noise, and it is very hard to make them linear over many orders of magnitude. Each stage of an analog circuit adds noise and/or distortion to the signal. This can be controlled, but it cannot be eliminated.
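As a rough illustration of that accumulation, here is a minimal numerical sketch (not from the answer; the stage count and per-stage noise level are arbitrary assumptions) comparing an analog chain, where every stage adds its own noise, with a digital chain that re-slices the level against a threshold at every stage:

```python
# Illustrative sketch only: assumed values, additive Gaussian noise per stage.
import numpy as np

rng = np.random.default_rng(0)
n_stages = 10
noise_rms = 0.01          # assumed additive noise per stage, in "volts"

# Analog chain: each stage passes the value along and adds its own noise.
analog = np.full(10_000, 0.5)                 # ideal signal value
for _ in range(n_stages):
    analog = analog + rng.normal(0.0, noise_rms, analog.shape)
# Noise power adds stage by stage, so the error grows ~sqrt(n_stages) * noise_rms.
print("analog error RMS after 10 stages:", np.std(analog - 0.5))

# Digital chain: each stage sees the same noise, but then slices against a
# threshold, regenerating a clean 0/1 level before the next stage.
digital = np.full(10_000, 1.0)                # logic "1"
for _ in range(n_stages):
    digital = digital + rng.normal(0.0, noise_rms, digital.shape)
    digital = (digital > 0.5).astype(float)   # comparator / inverter acting open-loop
print("digital bit errors after 10 stages:", int(np.sum(digital != 1.0)))
```

With these assumed numbers the analog error keeps growing while the digital chain comes through error-free, which is the point of the regeneration argument below.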

Digital circuits (namely CMOS) basically side-step this whole issue by using only two levels to represent information, with each stage regenerating the signal. Who cares if the output is off by 10%? It only has to be above or below a threshold. Who cares if the output is distorted by 10%? Again, it only has to be above or below a threshold. At each threshold comparison the signal is essentially regenerated, and noise, nonlinearity, and similar issues are stripped out. This is done by amplifying and clipping the input signal: a CMOS inverter is just a very simple amplifier made with two transistors, operated open-loop as a comparator. If noise pushes the level across the threshold, you get a bit error. Processors are generally designed to have bit error rates on the order of 10^-20, IIRC.

Because of this, digital circuits are incredibly robust. They can operate over a very wide range of conditions because linearity and noise are basically non-issues. It's almost trivial to work with 64-bit numbers digitally: 64 bits represents 385 dB of dynamic range, or 19 orders of magnitude. There is no way in hell you are going to get anywhere near that with analog circuits. If your resolution is 1 picovolt (10^-12 V, which would be swamped instantly by thermal noise), then you have to support a maximum value on the order of 10^7 volts. That is 10 megavolts. There is absolutely no way to operate over that kind of dynamic range in analog; it's simply impossible.

Another important trade-off in analog circuitry is between bandwidth/speed/response time and noise/dynamic range. Narrow-bandwidth circuits average out noise and perform well over a wide dynamic range, but they are slow. Wide-bandwidth circuits are fast, but noise is a larger problem, so the dynamic range is limited. With digital, you can throw bits at the problem to increase dynamic range, do things in parallel to increase speed, or both.
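To make those numbers concrete, here is a quick back-of-the-envelope check (assuming ideal quantization and the usual 20*log10 voltage-dB convention) of the dynamic range of a 64-bit word and the full-scale voltage an analog circuit would need at 1 pV resolution:

```python
# Back-of-the-envelope check of the dynamic-range figures quoted above.
import math

bits = 64
levels = 2 ** bits
db = 20 * math.log10(levels)      # voltage dynamic range in dB
decades = math.log10(levels)      # orders of magnitude
print(f"{db:.0f} dB, {decades:.1f} orders of magnitude")   # ~385 dB, ~19.3 decades

# With 1 pV resolution, the full-scale value an analog circuit would need:
resolution = 1e-12                # 1 picovolt
full_scale = resolution * levels
print(f"full scale = {full_scale:.2e} V")  # ~1.8e7 V, i.e. on the order of 10 MV
```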

However, for some operations analog has advantages: it can be faster, simpler, and lower power. Digital has to be quantized in level and in time; analog is continuous in both. One example where analog wins is the radio receiver in your Wi-Fi card. The input signal comes in at 2.4 GHz. A fully digital receiver would need an ADC running at a minimum of 5 gigasamples per second, which would consume a huge amount of power, and that's not even considering the processing after the ADC. Right now, ADCs of that speed are really only used in very high performance baseband communication systems (e.g. high symbol rate coherent optical modulation) and in test equipment. However, a handful of transistors and passives can downconvert the 2.4 GHz signal to something in the MHz range that can be handled by an ADC in the 100 MSa/s range, which is much more reasonable to work with.
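A small sketch of the Nyquist arithmetic behind that example (the intermediate frequency and the 20 MHz channel bandwidth here are assumptions for illustration, not figures from the answer):

```python
# Idealized Nyquist reasoning: sample rate must exceed twice the highest frequency.
carrier_hz = 2.4e9        # Wi-Fi carrier
if_hz = 20e6              # assumed intermediate frequency after the analog mixer
bandwidth_hz = 20e6       # assumed channel bandwidth

direct_rate = 2 * (carrier_hz + bandwidth_hz / 2)   # digitize the RF carrier directly
if_rate = 2 * (if_hz + bandwidth_hz / 2)            # digitize only the downconverted IF

print(f"direct sampling needs >= {direct_rate / 1e9:.2f} GSa/s")    # ~4.8 GSa/s
print(f"after analog downconversion: >= {if_rate / 1e6:.0f} MSa/s") # ~60 MSa/s
```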

The bottom line is that there are advantages and disadvantages to analog and digital computation. If you can tolerate noise, distortion, low dynamic range, and/or low precision, use analog. If you cannot tolerate noise or distortion and/or you need high dynamic range and high precision, then use digital. You can always throw more bits at the problem to get more precision. There is no analog equivalent of this, however.