Find settling time of ADC

In an assignment, I need to find the settling time of an ADC input circuit: the time for the input to settle to within 1/4 of an LSB.

The ADC has 12 bits. I also know that C = 12 pF and R = 10 kohm. I found a formula that I used, which worked:

t = -ln(1/(2^N * 4)) * τ, where the 4 comes from the 1/4-LSB target.
N is the word length: N = 12.
τ = RC = 10*10^3 * 12*10^-12 = 1.2*10^-7 s
=> t ≈ 1164.5 ns
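
As a sanity check, here is a minimal Python sketch of that calculation (the function name is my own; it assumes ideal single-pole RC settling toward the final value):

```python
import math

def settling_time(n_bits, r_ohms, c_farads, fraction=0.25):
    # The remaining error exp(-t/tau), as a fraction of the input step,
    # must fall to fraction/2**n_bits of full scale, so:
    # t = tau * ln(2**n_bits / fraction)
    tau = r_ohms * c_farads
    return tau * math.log(2**n_bits / fraction)

t12 = settling_time(12, 10e3, 12e-12)   # tau = 120 ns
print(f"t(N=12) = {t12*1e9:.1f} ns")    # -> 1164.5 ns
```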

Now the task is to use only the 6 most significant bits, still settling to 1/4 of the LSB, with the same C and R. My original idea was to use the same formula, but now with N = 6:
t = -ln(1/(2^6 * 4)) * 1.2*10^-7 = 665.421 ns

According to an automated grading system this answer is wrong, and I don't have access to the correct one.

What am I doing wrong here? How do I correctly find the settling time to within 1/4 of the LSB when only the 6 most significant bits of a 12-bit ADC are used?

I see that the ADC is still a 12-bit system, so maybe that is where I am going wrong: setting N = 6 effectively turns it into a 6-bit system. But I still don't know how to set up the time calculation so that it targets only the 6 most significant bits of the 12.

Best Answer

The problem is that every bit, from most significant to least significant, must be located with the same analog resolution, so the input settling-time requirement is unchanged regardless of how many bits are being acquired.

As an example, let's say you are using a 12-bit ADC with a 10 V input range. 1 LSB is 10 V/4095, or 2.442 mV, and the transition of the MSB occurs at a nominal 5.00122 V. That MSB transition, at a count of 7FFH to 800H, must be located to the same resolution as the LSB. If it is not, the accuracy of the entire 12-bit word suffers.
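
To make those numbers concrete, here is a small illustrative sketch (my own, not from the original answer); it checks the LSB size and MSB transition level, and shows why the 1/4-LSB settling time is unchanged when only the 6 MSBs are read:

```python
import math

FS = 10.0              # full-scale input range, volts
N = 12                 # ADC word length
tau = 10e3 * 12e-12    # R*C = 120 ns

lsb = FS / (2**N - 1)           # 10 V / 4095 ~= 2.442 mV
msb_edge = 2048 * lsb           # 7FFH -> 800H transition ~= 5.00122 V
print(f"1 LSB = {lsb*1e3:.3f} mV, MSB transition = {msb_edge:.5f} V")

# The 1/4-LSB target is set by the 12-bit resolution, not by how many
# bits are read out, so the settling time is the same in both tasks:
t = tau * math.log(4 * 2**N)    # ~= 1164.5 ns
print(f"settling time to 1/4 LSB = {t*1e9:.1f} ns")
```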

So acquiring a signal with a 1-bit ADC (such things do exist and are extremely useful; look up "sigma-delta ADC") to 12-bit precision requires exactly the same input accuracy as acquiring all 12 bits. It's quicker, but it needs to be just as precise.