How long does analog-to-digital conversion take?

signal processing

I am analyzing voltage signals from many channels which have passed through a multiplexer for analog-to-digital conversion before reaching the computer. I am wondering how long the conversion takes, and whether a correction is applied, i.e. interpolation of the digital signal to give exact correspondence in time across signals.

Perhaps the answer is device-specific, although I would have thought that someone might know roughly how long it takes per signal…

Best Answer

There are a number of different types of ADC available on the market, using vastly different technologies.

How long a conversion takes depends on the technology used.

The three most common types you find these days are:

  • Successive Approximation (SAR)

These are the most common type found in microcontrollers and similar devices. The speed is governed by a clock - often the incoming communication clock for external SPI or I2C controlled ADC chips - so their speed is usually limited by the speed of the communication bus. A conversion usually takes about as many clock cycles as there are bits of resolution, so a 10-bit ADC would take roughly 10 clock cycles per sample. Typical rates are in the hundreds of ksps, up to 1.1 Msps in some MCUs. A rough timing calculation is sketched after this list.

  • Sigma Delta

These are higher resolution than the SAR ADCs above, but slower. They are usually used in audio applications, such as the line-in on your sound card. They take many samples and average them (over-sampling) to create a higher-resolution value than any single raw sample provides; a small averaging example follows at the end of this answer.

  • Flash or Direct Conversion

These are the big boys. They grab a value and convert it directly into a binary value in a single clock cycle. They are fast - very fast - and consequently also expensive. They are the kind of ADC used in things like oscilloscopes, video processing systems, etc. They can sample in the multi-Gsps range.
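
To put some rough numbers on the SAR case above, here is a minimal Python sketch. The clock rate, overhead cycles, and channel count are assumed example figures, not taken from any particular datasheet; it just applies the "about one clock cycle per bit" rule and shows the resulting time skew when several channels share one converter through a multiplexer.

```python
def sar_conversion_time(clock_hz, bits, overhead_cycles=2):
    """Approximate time for one SAR conversion, in seconds."""
    cycles = bits + overhead_cycles  # ~1 clock cycle per bit, plus assumed acquisition overhead
    return cycles / clock_hz

clock_hz = 1_000_000   # assumed 1 MHz ADC/bus clock
bits = 10              # 10-bit resolution, matching the example above

t_conv = sar_conversion_time(clock_hz, bits)
print(f"One conversion takes about {t_conv * 1e6:.0f} µs")           # ~12 µs
print(f"Maximum sample rate is about {1 / t_conv / 1e3:.0f} ksps")   # ~83 ksps

# With several channels behind a multiplexer sampled one after another,
# consecutive channels are captured roughly t_conv apart in time.
channels = 8
skew = (channels - 1) * t_conv
print(f"First-to-last channel skew: about {skew * 1e6:.0f} µs")      # ~84 µs
```

That channel skew is the time offset the question asks about: whether it gets corrected (by interpolation or by timestamping each channel) depends on the acquisition software, not on the ADC itself.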

So how fast your ADC is depends on what type you are using.
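
To illustrate the over-sampling idea from the Sigma Delta bullet, here is a small Python sketch. It only shows the simple average-and-rescale picture described above; a real sigma-delta converter adds noise shaping and a decimation filter, and the signal level and noise model here are made up for illustration.

```python
import random

def oversample(read_adc, extra_bits):
    """Average 4**extra_bits raw readings to gain roughly extra_bits of resolution."""
    n = 4 ** extra_bits
    total = sum(read_adc() for _ in range(n))
    # Divide by 2**extra_bits (not by n) so the extra bits stay in the result.
    return total // (2 ** extra_bits)

def fake_10bit_adc():
    """Simulated noisy 10-bit ADC reading of a steady 2.5 V signal on a 5 V scale."""
    ideal_code = 2.5 / 5.0 * 1023
    return round(ideal_code + random.gauss(0, 2))

print("Single 10-bit reading:", fake_10bit_adc())                    # codes 0..1023
print("Oversampled 12-bit reading:", oversample(fake_10bit_adc, 2))  # codes 0..~4095
```

The trade-off is visible in the loop: gaining 2 bits costs 16 conversions, which is exactly why sigma-delta parts are higher resolution but slower than SAR parts.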