Electronics – Maximising entropy generation rate via an ADC's reference voltage


I have a circuit that generates random noise, which I've measured with a 10-bit ADC. The following is the sample distribution:

[Figure: sample histogram at Vref = 2.5 V]

You will notice that it's a log-normal distribution, characteristic of a Zener diode in avalanche breakdown. The maximum reading is 1023 analogue units. You will also notice that not all of the horizontal scale is used, as the reference voltage for the ADC was 2.5 V, so ADC reading 0 = 0 V and ADC reading 1023 = 2.5 V. I can alter this reference voltage, and as I do, the histogram widens and narrows proportionately. The second histogram is for a reference voltage of 1.1 V. You can see that approximately 14,000 samples were clipped at 1023, i.e. the input voltage was at or above Vref. Both histograms show 10 million samples each. (I think there might be weirdness with the gnuplot tally.)

[Figure: sample histogram at Vref = 1.1 V]

As the samples are taken, they are effectively a source of random entropy. This gives a Shannon entropy rate per sample, in bits/sample. For example, at Vref = 2.5 V, entropy was generated at 0.98 bits/sample. To maximise the efficiency of this entropy generator, I wish to maximise the entropy rate it produces by altering the reference voltage.
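For reference, a figure like this comes straight from the histogram tally via Shannon's formula, H = -Σ p_i log2(p_i); min-entropy, mentioned further down, is H_min = -log2(max p_i). A minimal Python sketch, where the uniform placeholder data merely stands in for the real samples:

```python
import numpy as np

def shannon_entropy_bits(counts):
    """Shannon entropy H = -sum(p * log2(p)) in bits per sample."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()     # drop empty bins, normalise to probabilities
    return -(p * np.log2(p)).sum()

def min_entropy_bits(counts):
    """Min-entropy H_min = -log2(max p), the conservative figure for crypto."""
    p = np.asarray(counts, dtype=float)
    return -np.log2(p.max() / p.sum())

# Placeholder data only -- replace with the real 10-million-sample tally.
codes = np.random.default_rng(0).integers(0, 1024, 1_000_000)
counts = np.bincount(codes, minlength=1024)
print(shannon_entropy_bits(counts), min_entropy_bits(counts))
```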

Following a comment: in the first histogram, one standard deviation of results (68%, the majority) is spread over perhaps 100 analogue units. In the second, it's spread over maybe 250. That means more entropy, but some readings are clipped at 1023. I think there is a sweet spot whereby the distribution can be scaled (by reducing Vref) to maximise entropy, beyond which entropy falls as Vref decreases towards 0 V.

Q. What reference voltage will maximise the entropy generation rate in bits/sample?

Note: I am not asking how to build a random number generator. I am not asking how to build a random number generator. I write it twice so that answers do not consist of brain dumps on how to build random number generators, on debiasing, or on why a pseudo-random number generator would be better. I am looking for a maximisation of Shannon's information entropy formula (or min-entropy, which would be better for cryptographic purposes) specific to this question. No commercial TRNG creates its final output from hardware alone; all use software processing and randomness extraction in a distribution-whitening phase. I am asking about the hardware phase, to maximise entropy. It is important to distinguish between entropy and uniformly distributed random numbers. They are not the same. Why am I having to state this? This is the 3rd time I've asked this question across 3 different forums without receiving a relevant answer. This is a mathematics-and-voltage question to which the answer will consist of just one singular number of volts. Could it be that multi-disciplinary questions are unsuitable for SE?

Best Answer

To answer your original question, you can find the optimum value numerically by Monte Carlo simulation. Generate samples from your distribution, clipped at 1023, for different Vref values, and use an appropriate optimisation algorithm (I'd recommend golden-section search for its robustness) to find the Vref that maximises the entropy. A sketch of this is shown below.
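Here is a minimal sketch of that procedure, assuming a hypothetical log-normal noise model: MU and SIGMA are placeholders you would fit to the measured 2.5 V histogram. One fixed set of simulated voltages is reused for every Vref evaluation (common random numbers), so the objective is deterministic and safe to hand to golden-section search:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
MU, SIGMA = np.log(0.7), 0.35   # hypothetical: fit these to the 2.5 V histogram
N = 1_000_000

# Draw the underlying noise voltages once and reuse them for every Vref,
# so repeated evaluations of the objective are consistent.
volts = rng.lognormal(MU, SIGMA, N)

def entropy_bits(vref):
    """Shannon entropy, in bits/sample, of the clipped 10-bit ADC codes."""
    codes = np.clip((volts / vref * 1024).astype(np.int64), 0, 1023)
    p = np.bincount(codes, minlength=1024) / N
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# Golden-section search minimises, so hand it the negated entropy.
# The (low, mid, high) bracket must satisfy f(mid) < f(low) and
# f(mid) < f(high); widen it if SciPy complains.
res = minimize_scalar(lambda v: -entropy_bits(v),
                      bracket=(0.5, 1.2, 2.5), method='golden',
                      options={'xtol': 1e-3})
print(f"optimum Vref = {res.x:.3f} V, entropy = {-res.fun:.3f} bits/sample")
```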

However, I believe you can get even better results if you improve your signal at its source instead of being stuck with the log-normal distribution you have. I'm too lazy to do an in-depth analysis for you, but it looks as though you can get a much better distribution if you run your signal through a log amp, perhaps with an offset to discard low-voltage samples (a quick simulated sanity check follows the schematic):

[Schematic: log amp with input offset, created using CircuitLab]
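As a quick simulated sanity check of the log-amp idea, under the same hypothetical noise model as above: the log of a log-normal voltage is Gaussian, which spreads the samples more evenly across the ADC codes instead of piling them into a skewed peak.

```python
import numpy as np

rng = np.random.default_rng(0)
volts = rng.lognormal(np.log(0.7), 0.35, 1_000_000)  # same hypothetical model

def entropy_bits(signal, vref):
    """Shannon entropy of the clipped 10-bit codes for a given full-scale vref."""
    codes = np.clip((signal / vref * 1024).astype(np.int64), 0, 1023)
    p = np.bincount(codes, minlength=1024) / len(signal)
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# The log amp output is the log of a log-normal variable, i.e. Gaussian;
# the offset stage shifts it into the ADC's 0..Vref window. Scale each
# signal to its own full range for a like-for-like comparison.
logged = np.log(volts)
logged -= logged.min()
print("raw log-normal:", entropy_bits(volts, volts.max()))
print("after log amp: ", entropy_bits(logged, logged.max()))
```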