ADC conversion timing

adc · microcontroller · sample-and-hold

Why is the conversion time of an ADC calculated as (number of bits of resolution + number of programmed sample clocks + 2) / (ADC clock frequency)? As far as I know, for a successive approximation A/D converter one clock cycle is the time it takes to resolve one bit, so the number of bits of resolution accounts for n cycles. Why are 2 extra clocks added, and why is the whole thing divided by the clock frequency? The microcontroller is an HCS12 and uses a successive approximation A/D converter.

Best Answer

The initial 2 clocks (the + 2 in the formula) are used to sample the input signal onto a capacitor.

Then there is a programmable time (the programmed sample clocks) during which the sample is transferred to the capacitors that form the A/D.

Lastly, the successive approximation algorithm takes one clock per bit, as you say.
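
To illustrate that last phase, here is a minimal sketch of the successive approximation loop in C. This is not actual HCS12 register code; the compare callback and the simulated input value are assumptions for demonstration. Each loop iteration plays the role of one ADC clock and resolves one result bit, which is why an n-bit conversion needs n clocks in this phase.

    #include <stdint.h>
    #include <stdio.h>

    /* Simulated input level; a real converter compares an analog voltage. */
    static uint16_t vin_code = 0x2A7;

    /* Hypothetical comparator: nonzero when the input is at or above
     * the DAC level implied by dac_code. */
    static int compare(uint16_t dac_code)
    {
        return vin_code >= dac_code;
    }

    uint16_t sar_convert(int (*cmp)(uint16_t), int n_bits)
    {
        uint16_t result = 0;
        /* One iteration per bit, most significant first: trial-set the
         * bit, keep it only if the input is still at or above the DAC
         * level that the trial code implies. */
        for (int bit = n_bits - 1; bit >= 0; bit--) {
            result |= (uint16_t)(1u << bit);
            if (!cmp(result))
                result &= (uint16_t)~(1u << bit);
        }
        return result;
    }

    int main(void)
    {
        printf("0x%03X\n", sar_convert(compare, 10)); /* prints 0x2A7 */
        return 0;
    }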

Total conversion time = (2 + programmed sample clocks + number of bits) / (ADC clock frequency)

The division by the ADC clock frequency converts the result from a number of clocks into a time (in µs if you express the frequency in MHz).
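
As a worked example (the numbers here, 10-bit resolution, 4 programmed sample clocks and a 2 MHz ADC clock, are illustrative values rather than a specific HCS12 configuration), the formula gives (10 + 4 + 2) clocks / 2 MHz = 8 µs per conversion:

    #include <stdio.h>

    /* Conversion time per the formula above; clocks divided by MHz
     * yields microseconds. */
    static double conversion_time_us(int n_bits, int sample_clocks,
                                     double f_clk_mhz)
    {
        return (n_bits + sample_clocks + 2) / f_clk_mhz;
    }

    int main(void)
    {
        /* (10 + 4 + 2) clocks / 2 MHz = 8 us */
        printf("%.2f us\n", conversion_time_us(10, 4, 2.0));
        return 0;
    }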

The following application note has the details: AN2428