I have a circuit with a digital square-wave input (generated by a PLD, 1.8 V peak) and a sine-wave output (swinging 0.5–3.5 V). Both signals are at 100 kHz, but their phase differs.
What is a good way to detect the phase difference between these two signals? The phase detectors I've seen so far are for either all-digital or all-analog signals. Is there one for a mixed-signal circuit like mine?
Knowing the phase difference to within 1 degree is sufficient for my application. The frequencies are always locked to each other and never change. The square wave drives the analog electronics, and the analog section produces the sine wave, which carries an AM-modulated signal. The amplitude of that signal is, however, very low compared to the amplitude of the carrier. Due to production variability, the analog parts (including some hand-wound inductors) show high unit-to-unit variation in phase, and I am trying to come up with an auto-tuning method for the DSP that processes the output sine wave.
Phase detection is easiest with digital signals; at its simplest it's an XOR gate, whose time-averaged output is proportional to the phase difference. I would convert the sine to a square wave: feed a comparator with the sine on one input and the averaged sine (via a low-pass filter) on the other, so the comparator puts out a 50 % duty-cycle square wave. Then use a digital phase detector.
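To see why this works, here is a minimal simulation sketch of the comparator-plus-XOR scheme. The sample rate, cycle count, and the `phase_deg` test value are hypothetical; the signal levels and 100 kHz frequency come from the question. Thresholding the sine against its own mean stands in for the comparator with the LPF'd reference, and the mean of the XOR output maps linearly to the phase difference:

```python
import numpy as np

fs = 100e6        # sample rate in Hz (assumed for the simulation)
f = 100e3         # signal frequency from the question
phase_deg = 37.0  # true phase offset to recover (hypothetical test value)

# 50 full carrier cycles
t = np.arange(int(fs / f) * 50) / fs

# Digital reference: 1.8 Vp square wave from the PLD (logic levels suffice here)
square = np.sin(2 * np.pi * f * t) > 0

# Analog output: sine swinging 0.5-3.5 V, lagging by phase_deg
sine = 2.0 + 1.5 * np.sin(2 * np.pi * f * t - np.deg2rad(phase_deg))

# Comparator: sine vs. its low-pass-filtered (averaged) level
sine_sq = sine > sine.mean()

# XOR phase detector: duty cycle of the XOR output is |phase| / 180 deg
xor_out = np.logical_xor(square, sine_sq)
est_deg = xor_out.mean() * 180.0
print(f"estimated phase: {est_deg:.1f} deg")
```

Note one limitation of a plain XOR detector: its averaged output is proportional to the magnitude of the phase difference and folds at 180°, so it cannot tell lead from lag. If the sign matters for your auto-tuning, a flip-flop-based phase-frequency detector (or comparing edge timestamps in the DSP) resolves the ambiguity.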