I'm looking for an analog method of measuring the phase difference between two signals at frequencies in the range 0–20 MHz. Is there an IC that does this, or a specific circuit that converts the phase difference into a voltage signal?
Thank you very much
With more specifics on the input and output voltage ranges, a better answer can be provided.
To measure phase, as Steven says, assuming the signals have equal amplitude and the system is linear, you can subtract them; but the result is a time-varying signal, not a DC phase output, so you might follow it with a peak detector to rectify that signal into a DC voltage representing the phase difference.
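As a sanity check on the subtract-then-peak-detect idea, here is a small numerical sketch (plain Python, not tied to any particular circuit): for two equal-amplitude unit sinusoids, the peak of their difference is 2·sin(φ/2), so a peak detector on the subtracted signal produces a DC level that encodes the phase difference.

```python
import math

def diff_peak(phase_deg, samples=10000):
    """Peak of the difference of two unit-amplitude sinusoids offset
    by phase_deg. Analytically this equals 2*sin(phase/2), which is
    what an ideal peak detector on the subtracted signal would hold."""
    phi = math.radians(phase_deg)
    return max(abs(math.sin(2 * math.pi * i / samples) -
                   math.sin(2 * math.pi * i / samples + phi))
               for i in range(samples))

print(round(diff_peak(90), 3))   # -> 1.414, i.e. 2*sin(45 deg)
print(round(diff_peak(180), 3))  # -> 2.0, maximum at 180 deg
```

Note the 2·sin(φ/2) relationship is nonlinear, so the DC output would need linearization if you want to read degrees directly.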
The amplitudes need to be normalized (made the same), so linear slicers or limiters are used, as well as XOR gates (a logic gate that also works here as a mixer/phase detector for logic-level signals).
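To illustrate the XOR detector on logic-level signals: the XOR of two equal-frequency square waves has a duty cycle that rises linearly with phase from 0° to 180°, so an RC low-pass filter on the XOR output gives a DC voltage proportional to phase. A minimal simulation (the helper name is mine, not from any library):

```python
def xor_detector_avg(phase_deg, samples=3600):
    """Average XOR output (0..1) for two square waves offset by phase_deg.
    For a Type I (XOR) detector the average rises linearly from 0 at 0 deg
    to 1 (i.e. Vdd after low-pass filtering) at 180 deg."""
    total = 0
    for i in range(samples):
        t = i / samples  # one full period, normalized to 0..1
        a = 1 if (t % 1.0) < 0.5 else 0
        b = 1 if ((t - phase_deg / 360.0) % 1.0) < 0.5 else 0
        total += a ^ b
    return total / samples

# Duty cycle of the XOR output -> after an RC low-pass, a DC voltage
print(round(xor_detector_avg(90), 2))   # -> 0.5 (Vdd/2 at 90 deg)
print(round(xor_detector_avg(180), 2))  # -> 1.0 (Vdd at 180 deg)
```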
There are many other ways too, such as edge detection, a sample-and-hold driven by a sawtooth clock, and time-interval counters.
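The time-interval-counter approach reduces to one formula: measure the delay between corresponding zero crossings of the two signals and scale by the period. A trivial sketch of that conversion:

```python
def phase_from_interval(delta_t_s, freq_hz):
    """Convert a measured zero-crossing delay (seconds) at a known
    frequency (Hz) into a phase difference in degrees."""
    return (delta_t_s * freq_hz * 360.0) % 360.0

# e.g. a 12.5 ns delay at 10 MHz corresponds to 45 degrees
print(round(phase_from_interval(12.5e-9, 10e6), 6))  # -> 45.0
```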
A better way that I suggest is the 4046 PLL chip.
Do you want 0~180 deg mapped to 0 to Vdd? Then use the Type I "XOR gate" phase comparator; for 0~360 deg, use the Type II edge-triggered phase comparator.
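Idealized transfer curves for the two comparator types, following the ranges stated above (Vdd = 5 V and the purely linear mapping are assumptions of this sketch, not datasheet values; a real Type II comparator is a tri-state phase-frequency detector):

```python
def type1_avg(phase_deg, vdd=5.0):
    """Type I (XOR) comparator: average output is triangular in phase,
    0 V at 0 deg, Vdd at 180 deg, back to 0 V at 360 deg (ambiguous)."""
    p = phase_deg % 360.0
    return vdd * (p / 180.0 if p <= 180.0 else (360.0 - p) / 180.0)

def type2_avg(phase_deg, vdd=5.0):
    """Type II (edge-triggered) comparator, idealized: average output
    ramps from 0 to Vdd over a full 0..360 deg, so it is unambiguous."""
    return vdd * ((phase_deg % 360.0) / 360.0)

print(type1_avg(90), type1_avg(270))  # both 2.5 V: Type I is ambiguous
print(type2_avg(90), type2_avg(270))  # 1.25 V vs 3.75 V: Type II is not
```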
The CMOS 4046 PLL chip is very easy to use and has been around since the mid-70s, when I first used it.