Timing of delay for autocorrelation processing of a signal

signal processing

When using autocorrelation to remove noise from a periodic signal, a time delay (lag) is required. Is there an algorithm that a circuit can "use" to figure out how large this delay needs to be? Does such a circuit exist, that is, one that can determine the required delay for accurate autocorrelation processing? Is there a canonical example? Or does the delay need to be hardwired into the circuit and adjusted by trial and error?
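In software terms, what I am imagining is something like the following sketch (Python purely for illustration; `estimate_period_lag` and the frequency bounds are names I made up for the example), which picks the lag at the strongest autocorrelation peak:

```python
import numpy as np

def estimate_period_lag(x, fs, min_freq=20.0, max_freq=2000.0):
    """Estimate the period of x, in samples, from the strongest
    autocorrelation peak. min_freq/max_freq bound the lag search range."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                        # remove DC offset
    acf = np.correlate(x, x, mode="full")   # full autocorrelation
    acf = acf[len(acf) // 2:]               # keep non-negative lags only
    lo = int(fs / max_freq)                 # smallest plausible lag
    hi = int(fs / min_freq)                 # largest plausible lag
    lag = lo + int(np.argmax(acf[lo:hi]))   # strongest peak in that range
    return lag                              # delay in samples

# Example: noisy 100 Hz sine sampled at 10 kHz -> lag of about 100 samples
fs = 10_000
t = np.arange(0, 0.1, 1 / fs)
sig = np.sin(2 * np.pi * 100 * t) + 0.3 * np.random.randn(t.size)
print(estimate_period_lag(sig, fs))
```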

Best Answer

There's more than one way to do this. The simplest is to detect a rising-edge zero crossing, much like the trigger input on an oscilloscope: the time between successive crossings is the signal's period, which is the delay you want. Use a low-pass filter ahead of the detector to keep it from triggering on glitches, and tweak the threshold level if that helps.
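As a rough software analogue of that idea (assuming sampled data rather than an actual trigger circuit; the cutoff and filter order are arbitrary example values), you could low-pass filter the signal and take the spacing between rising-edge zero crossings as the period:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def period_from_zero_crossings(x, fs, cutoff_hz=500.0):
    """Low-pass filter x, then return the median spacing (in samples)
    between rising-edge zero crossings as a period estimate."""
    x = np.asarray(x, dtype=float)
    b, a = butter(4, cutoff_hz / (fs / 2))              # 4th-order low-pass
    y = filtfilt(b, a, x - x.mean())                    # zero-phase filtering
    rising = np.where((y[:-1] < 0) & (y[1:] >= 0))[0]   # rising-edge crossings
    if len(rising) < 2:
        raise ValueError("not enough zero crossings to estimate a period")
    return float(np.median(np.diff(rising)))            # robust to the odd glitch
```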

If you know that the signal stays at roughly the same period, you could use a phase-locked loop (PLL) to generate a clean signal that tunes itself to the same frequency as your input. A PLL has a variable-frequency oscillator whose output is compared with the input signal (with an XOR gate, for example) to measure how far out of phase the two are. That phase error is low-pass filtered and fed back into the oscillator's frequency-control input.
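Here is a very simplified software sketch of that loop, assuming square-ish or sign-clipped signals so an XOR-style phase detector applies; the gain `kp` and smoothing factor `alpha` are arbitrary illustrative values, not tuned design numbers:

```python
import numpy as np

def xor_pll(x, fs, f0, kp=50.0, alpha=0.01):
    """Track the frequency of x with a crude XOR-style PLL.
    f0 is the initial oscillator frequency in Hz, kp the loop gain,
    alpha the smoothing factor of a one-pole loop filter."""
    phase = 0.0
    freq = f0
    err_filt = 0.0
    freqs = np.empty(len(x))
    for n, sample in enumerate(x):
        osc = 1.0 if (phase % 1.0) < 0.5 else -1.0   # square-wave oscillator
        ref = 1.0 if sample >= 0 else -1.0           # sign of the input
        err = -osc * ref                             # XOR-like phase detector
        err_filt += alpha * (err - err_filt)         # one-pole loop filter
        freq = f0 + kp * err_filt                    # steer oscillator frequency
        phase += freq / fs                           # advance oscillator phase
        freqs[n] = freq                              # record tracked frequency
    return freqs
```

Once the loop settles, `fs / freqs[-1]` gives the period in samples, i.e. the delay to use for the autocorrelation.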