Electronics – ADC conversion clock meaning

adc

When I read that the conversion clock of an ADC is, for example, 2 MHz, does it mean that the ADC performs an n-bit analog-to-digital conversion every T = 1/(2 MHz) = 0.5 μs?

This is what is written in the text: With a 2-MHz conversion clock, the ADC can perform an 8-bit single conversion in 6 μs or a 10-bit single conversion in 7 μs.

I do not understand how these microsecond times are derived.

Best Answer

In the case of a successive-approximation converter, it performs one conversion step per clock cycle, and each step resolves one bit of the result. Thus an 8-bit conversion takes at least 8 clock cycles (4 μs) and a 10-bit conversion takes at least 10 cycles (5 μs) with a 2 MHz conversion clock (0.5 μs period).

In the first step it performs the MSB conversion, asking "is the value more than half the reference voltage?", setting the MSB to '1' if yes, and subtracting either half the reference or zero (accordingly) from the input value.

In the next step it converts the next bit, asking "is the remaining value more than 1/4 the reference voltage?" and so on for each bit.
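A minimal Python sketch of this subtract-the-weight procedure, one bit per step (per clock cycle). It is illustrative only; the function and variable names (`sar_convert`, `vin`, `vref`) are mine, not from any datasheet:

```python
def sar_convert(vin, vref, bits=8):
    """Return the digital code for vin via successive approximation."""
    code = 0
    remaining = vin
    for step in range(bits):
        # Thresholds are Vref/2, Vref/4, Vref/8, ... one per clock cycle.
        threshold = vref / (2 ** (step + 1))
        if remaining >= threshold:
            code |= 1 << (bits - 1 - step)  # set this bit to '1'
            remaining -= threshold          # subtract the bit's weight
        # otherwise the bit stays '0' and nothing is subtracted
    return code

print(sar_convert(1.65, 3.3, bits=8))    # mid-scale input   -> 128
print(sar_convert(0.825, 3.3, bits=10))  # quarter-scale     -> 256
```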

There is also some overhead, for tasks like storing the final result from the conversion into the output register, resetting the internal circuitry, and "freezing" the analog value for the next conversion (in a "sample and hold" circuit) to keep it stable during the actual conversion.

The designers of your example converter apparently decided to spend 4 additional clock cycles on these tasks, giving 8 + 4 = 12 cycles (6 μs) and 10 + 4 = 14 cycles (7 μs) total per conversion.
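A quick check of that arithmetic, assuming the 4 overhead cycles inferred above (the datasheet would confirm the exact figure):

```python
clock_hz = 2_000_000
period_us = 1e6 / clock_hz  # 0.5 us per conversion clock cycle
overhead_cycles = 4         # assumed, per the reasoning above

for bits in (8, 10):
    total_cycles = bits + overhead_cycles
    print(f"{bits}-bit: {total_cycles} cycles -> {total_cycles * period_us} us")
# 8-bit:  12 cycles -> 6.0 us
# 10-bit: 14 cycles -> 7.0 us
```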

There are also ADCs that can perform an entire conversion in a single clock cycle; these are known as "flash" converters.