I managed to increase the effective resolution of a 10-bit ADC by several bits by repeatedly sampling a noisy input and averaging over many samples.
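For reference, the averaging I'm doing looks roughly like this sketch (a simulated noisy ADC stands in for my real read routine, since the exact hardware call isn't the point):

```c
#include <stdint.h>
#include <stdlib.h>

/* Simulated noisy 10-bit ADC read: a fixed "true" value of 512 plus
   roughly +/-2 LSB of uniform noise. On real hardware this would be
   the actual ADC read routine. */
static uint16_t read_adc(void)
{
    int noisy = 512 + (rand() % 5) - 2;
    if (noisy < 0) noisy = 0;
    if (noisy > 1023) noisy = 1023;
    return (uint16_t)noisy;
}

/* Oversample-and-decimate: sum 4^k samples, then shift right by k.
   The sum of 4^k 10-bit samples fits in 10 + 2k bits; shifting by k
   leaves a 10 + k bit result, i.e. roughly k extra bits of effective
   resolution -- provided the noise spans at least about 1 LSB. */
static uint16_t oversample(unsigned k)
{
    uint32_t sum = 0;
    uint32_t n = 1u << (2 * k);      /* 4^k samples */
    for (uint32_t i = 0; i < n; i++)
        sum += read_adc();
    return (uint16_t)(sum >> k);     /* 10 + k bit result */
}
```

With `k = 3` this averages 64 samples and returns a 13-bit value, so a true input of 512 LSB comes back as roughly 4096 on the expanded scale.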
I'm wondering what techniques there are for deliberately generating this sort of "noise" to achieve the same effect. Obviously, the simpler the circuit the better. There's probably a name for this technique that I'm not aware of?
I have some spare digital output pins on the microprocessor that I could somehow use to generate the "noise". In case it matters, I'm using a 10-bit ADC with a 0–5 V full scale to measure the voltage across a thermistor (50k nominal) in series with a 10k resistor, where the thermistor resistance will vary from about 7k to 160k.
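In case the numbers help: since the thermistor and the 10k resistor form a simple series divider across the 5 V supply, the drop across the thermistor is 5·Rt/(Rt + 10k), which for my 7k–160k range works out as below (the `divider_v` helper is just my illustration, not real code I'm running):

```c
/* Voltage across a thermistor of resistance rt_ohms, in series with
   a 10k resistor across a 5 V supply. */
static double divider_v(double rt_ohms)
{
    return 5.0 * rt_ohms / (rt_ohms + 10e3);
}

/* divider_v(7e3)   -> ~2.06 V  (thermistor at its minimum, 7k)   */
/* divider_v(160e3) -> ~4.71 V  (thermistor at its maximum, 160k) */
```

So the signal only spans roughly 2.06–4.71 V of the 0–5 V input range, which is part of why I'm keen to squeeze out extra effective bits.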