Electronic – Quick Test of an ADC


I have an 8-bit MAX160 analog-to-digital converter that I am using in several boards. I was wondering if there is a way to quickly test the accuracy and output range of the ADC to catch stuck bits and similar errors.

My idea was to step the incoming analog signal up by one bit position in magnitude at each iteration of a loop:

ADC Input range 0 – 4.00 VDC

Calculated LSB: 0.015625 V/count (15.625 mV/count)


Analog Input (VDC)      Expected Output

0.00            00000000
0.015625        00000001
0.03125         00000010
0.0625          00000100
0.125           00001000
0.25            00010000
0.50            00100000
1.00            01000000
2.00            10000000
4.00            11111111

Test method:

Start at 0 VDC to see if any "stuck" bits/lines are present.

Do a "walking 1" through each successive magnitude bit position.

Then apply the 4.0 V maximum input for all 1's.

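The test plan above can be sketched in software. The snippet below is illustrative only (the names are not from any MAX160 driver); it computes the input voltage for each single-bit code and the code an ideal 8-bit, 0–4.00 V ADC should return, matching the table in the question.

```python
# Walking-1 test plan for an ideal 8-bit, 0-4.00 V ADC.
# All names here are illustrative, not part of any real MAX160 interface.

FULL_SCALE_V = 4.00
BITS = 8
LSB_V = FULL_SCALE_V / (1 << BITS)   # 0.015625 V per count

def expected_code(v_in):
    """Ideal transfer function: quantize to one LSB and clamp to 8 bits."""
    return min(int(v_in / LSB_V), (1 << BITS) - 1)

def walking_one_points():
    """Test vector: 0 V (stuck bits), one voltage per magnitude bit, full scale."""
    points = [0.0]                                     # all bits should be 0
    points += [LSB_V * (1 << b) for b in range(BITS)]  # walking 1
    points.append(FULL_SCALE_V)                        # saturates to all 1's
    return points

for v in walking_one_points():
    print(f"{v:8.6f} V -> {expected_code(v):08b}")
```

Note that the 4.00 V point sits one count above the top code, so an ideal converter clamps it to 11111111 rather than rolling over.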
My question is, would this be a valid "quick" test to run an ADC through, instead of doing every possible input value from 0 to 255?

Best Answer

Expecting a test input signal to produce a bit-accurate ADC output seems unreasonable. Instead, you could create an exponential input voltage by switching on the charging of a capacitor through a resistor. After taking many periodic samples over the expected rise time, it shouldn't be hard to compare the data against the easily predicted curve and catch a gross ADC malfunction. At the very least the readings should be monotonically increasing, and secondarily the delta between sample N and sample N+1 should decrease as N increases.