Electronic – Calculating latency for ADC chips

I am building a real-time audio processing system that needs to sample 48 kHz, 16-bit audio, preferably with <0.5 ms latency (the lower, the better), measured from the analog signal to the point where the sampled data is ready to be transferred to the host. I will not need to do any DSP processing.

The question I have is this: how do I go about calculating the latency introduced by an ADC circuit?

I've looked through several datasheets and don't see any figure for the total delay of the ADC (which makes sense, since it largely depends on the sampling frequency). I'm not considering the latency of the host processor; that's already been accounted for.

Here's the approach I was thinking of… it could be completely wrong.

  1. Intrinsic delay of the circuit (typically negligible, usually in ns)
  2. Group delay of the decimation filter used for sampling
  3. Total 'size' (in time) of the buffer where samples are stored.

For example:

  • Circuit delay: negligible
  • Decimation filter group delay: 200 µs
  • Buffer size: 30 samples
  • Sample frequency: 48 kHz (period of 1/48000 s ≈ 20.8 µs)

Assuming the decimation filter's group delay is 200 µs, would the total delay be 200 µs + (30 samples × 20.8 µs) ≈ 825 µs?
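A quick sanity check of that arithmetic as a sketch (the 200 µs group delay and 30-sample buffer are just the example figures above, not values taken from any datasheet):

```python
def adc_latency_s(group_delay_s, buffer_samples, sample_rate_hz):
    """Total latency = filter group delay + time to fill the sample buffer."""
    sample_period_s = 1.0 / sample_rate_hz
    return group_delay_s + buffer_samples * sample_period_s

# Example figures from the question: 200 us group delay, 30-sample buffer, 48 kHz
total = adc_latency_s(200e-6, 30, 48000)
print(f"{total * 1e6:.1f} us")  # 825.0 us
```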

Here is the datasheet for an ADC I was looking at:
TI TLV320ADC3101

Please feel free to share any additional information or advice, or to correct me where I'm wrong.

Best Answer

In all likelihood, the ADC will:

  1. Sample the signal
  2. Convert the signal
  3. Feed out the digital representation of the signal
  4. Repeat

This means the latency is approximately the time between consecutive samples. More sophisticated ADCs can sample the input signal while feeding out the digital value of the previous sample, but given that the sampling itself is usually quite short, I'd say you can use the inter-sample time as the latency.
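The rule of thumb above can be sketched as follows (the `pipeline_stages` parameter is an assumption of mine to model converters that overlap sampling with output of the previous result; actual stage counts vary by part and must come from the datasheet):

```python
def inter_sample_latency_s(sample_rate_hz, pipeline_stages=0):
    """Rule-of-thumb ADC latency: one sample period, plus one extra
    period per pipeline stage if the converter overlaps sampling
    with feeding out the previous conversion result."""
    return (1 + pipeline_stages) / sample_rate_hz

# Simple (non-pipelined) ADC at 48 kHz: one sample period, ~20.8 us
print(f"{inter_sample_latency_s(48000) * 1e6:.1f} us")
```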