ADC acquisition time


I need to take one sample of a fast-changing signal at an exact time with high precision. Slow-sample-rate ADCs have better accuracy, but I wonder about the actual acquisition time, i.e. the interval over which the ADC performs its sample-and-hold. I guess it effectively averages the signal between two points in time. Can a slow (say 10 ksps) ADC take a fast snapshot, say less than 1 µs? Or is that time always tied to the samples per second? A 1 Msps ADC obviously does its sample-and-hold in less than 1 µs, but what about a 100 ksps or 10 ksps ADC? After looking at some datasheets, I'm still confused.

Best Answer

The parameter I think you want is in the datasheets of high-performance ADCs: it's called "Input Bandwidth". You may also want to consider the "Aperture Delay" and "Aperture Jitter", which define how closely and how consistently the sample is taken relative to the clock edge. Aperture jitter effectively adds noise.

[Datasheet excerpt: input bandwidth and aperture specifications]
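To put a number on the jitter point: the usual rule of thumb for a full-scale sine input is SNR = -20·log10(2π·f_in·t_jitter). A minimal sketch, assuming illustrative values (1 ps RMS jitter, 100 MHz input; neither is taken from the excerpt above):

```python
import math

def jitter_limited_snr_db(f_in_hz: float, jitter_rms_s: float) -> float:
    """SNR ceiling imposed by aperture jitter on a full-scale sine input.

    Standard approximation: SNR = -20*log10(2*pi*f_in*t_jitter).
    """
    return -20.0 * math.log10(2.0 * math.pi * f_in_hz * jitter_rms_s)

# Illustrative: 1 ps RMS aperture jitter while sampling a 100 MHz input
print(f"{jitter_limited_snr_db(100e6, 1e-12):.1f} dB")   # ~64 dB, regardless of resolution
```

The point is that at high input frequencies the jitter, not the number of bits, can set the noise floor.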

It becomes very important when you are doing undersampling. If you are interested in a relatively narrow bandwidth within a relatively high-frequency signal, you can undersample and bandpass filter. Say you have a 100 MHz signal and are interested in a 50 kHz bandwidth: you can sample the 100 MHz signal at 100 kHz and Nyquist is still satisfied, because the sample rate only has to exceed twice the bandwidth of interest, not twice the carrier frequency. Of course you'd want to bandpass filter the signal before doing this. It cannot work if the samples are smeared out over time, so the input bandwidth has to be better than 100 MHz for it to work well.
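A small sketch of where a narrowband signal near 100 MHz lands after sampling at 100 kHz (the specific offsets are just illustrative):

```python
def alias_frequency(f_signal_hz: float, f_sample_hz: float) -> float:
    """Frequency at which f_signal appears after sampling at f_sample,
    folded into the first Nyquist zone (0 .. fs/2)."""
    f = f_signal_hz % f_sample_hz
    return f if f <= f_sample_hz / 2 else f_sample_hz - f

# A carrier near 100 MHz, sampled at 100 kHz
for f in (100.000e6, 100.020e6, 100.049e6):
    print(f"{f/1e6:.3f} MHz -> {alias_frequency(f, 100e3)/1e3:.1f} kHz")
# 100.000 MHz -> 0.0 kHz, 100.020 MHz -> 20.0 kHz, 100.049 MHz -> 49.0 kHz
```

As long as the band of interest stays within one Nyquist zone (and everything else is filtered off beforehand), it folds down intact.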

Here is another example, this one from a 50 ksps to 200 ksps SAR converter:

[Datasheet excerpt: full-power bandwidth specification of the 50 ksps to 200 ksps SAR converter]

Here they refer to the full-power bandwidth. Note that an inexpensive converter that can handle at most a 100 kHz input bandwidth without aliasing has a full-power bandwidth of as much as 11 MHz, so the track-and-hold itself is much faster than the sample rate alone would suggest.
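That 11 MHz figure can be turned into a rough "snapshot" time. Assuming a simple single-pole track-and-hold response (a coarse approximation, not a datasheet-specified model), the time constant and 10-90% rise time work out to:

```python
import math

def snapshot_estimates(full_power_bw_hz: float):
    """Rough single-pole estimates of how short a 'snapshot' the
    track-and-hold takes, given its full-power bandwidth."""
    tau = 1.0 / (2.0 * math.pi * full_power_bw_hz)   # RC time constant
    t_rise = 0.35 / full_power_bw_hz                 # 10-90% rise time
    return tau, t_rise

tau, t_rise = snapshot_estimates(11e6)   # the 11 MHz full-power bandwidth above
print(f"tau ~ {tau*1e9:.1f} ns, rise time ~ {t_rise*1e9:.1f} ns")
# ~14.5 ns and ~31.8 ns: tens of nanoseconds, far shorter than the 5-20 us sample period
```

So even a slow SAR converter can take a snapshot well under 1 µs, which is essentially what the question is asking.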

Delta-sigma converters (the other most common type) behave quite differently: they have very long latency and group delay, because they are effectively oversampling converters followed by digital decimation filters.
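To give a feel for the scale, a cascaded sinc (boxcar) decimation filter of order N and decimation ratio M has a group delay of roughly N·(M-1)/2 modulator clocks. A sketch with illustrative numbers (not from any particular part):

```python
def sinc_filter_group_delay(order: int, decimation: int, f_mod_hz: float) -> float:
    """Approximate group delay of an order-N sinc decimation filter:
    N*(M-1)/2 modulator clock periods."""
    return order * (decimation - 1) / 2.0 / f_mod_hz

# Illustrative: sinc^3 filter, decimate by 256, 2.56 MHz modulator clock
# (output rate 10 ksps, matching the question's example)
print(f"{sinc_filter_group_delay(3, 256, 2.56e6)*1e6:.0f} us")   # ~149 us
```

That delay is fine for measuring a waveform, but it makes a delta-sigma converter a poor choice for grabbing one precisely timed sample.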