Phase noise measurement with an IQ mixer

Tags: microwave, noise, noise-spectral-density, signal

I have a question regarding a phase noise measurement scheme that I'm trying to implement. The idea is that I have two identical signal generators (I actually do), each generating a sinusoidal voltage signal, say

$$V(t) = V_a(t) \cos(2 \pi \nu t + \phi(t))$$

where \$V_a(t)\$ is the amplitude, \$\nu\$ the frequency (which I'll assume is implemented perfectly for now) and \$\phi(t)\$ the phase; the time dependencies are due to noise. Ideally both the amplitude and the phase would of course be static values, but in practice they are not. In general this noise has a vanishing mean, so I could for example write

$$V_a(t) = V_a + \delta V_a(t)$$
$$\phi(t) = \phi + \delta \phi(t)$$

where \$\langle V_a(t) \rangle = V_a\$ and \$\langle \phi(t) \rangle = \phi\$. With this, the signals produced by the generators become

$$V(t) = (V_a + \delta V_a(t)) \cos(2\pi \nu t + \phi + \delta \phi(t))$$

Now, the purpose of my investigation is to characterize \$\delta \phi(t)\$: I am trying to find its power spectral density. The way I do this is by measuring the voltage signal as a time series and using (discrete) Fourier transforms to find the voltage spectral density and, from that, the power spectral density.
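For the estimation step itself, here is a minimal sketch in Python; the sample rate `fs`, the synthetic trace `v`, and the segment length are placeholders standing in for the actual recording:

```python
import numpy as np
from scipy.signal import welch

fs = 1e6                                  # sample rate in Hz (placeholder)
rng = np.random.default_rng(0)
v = rng.normal(scale=1e-3, size=2**20)    # stand-in for the recorded voltage trace

# Welch's method averages the periodograms of overlapping windowed segments,
# trading frequency resolution for a lower-variance PSD estimate compared
# to a single DFT of the whole record.
f, S_v = welch(v, fs=fs, nperseg=2**14)   # one-sided PSD, units V^2/Hz
```

(With the quadrature calibration discussed below, dividing `S_v` by the square of the mixer slope converts the voltage PSD into a phase PSD.)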

However, in general the noise is of course quite hard to distinguish when it competes with the much larger oscillation at \$\nu\$. To remove this carrier, we use an IQ mixer with the LO and RF signals at the same frequency, and we use the down-converted signal at their difference frequency (\$=0\$) to get a DC signal, which is actually slightly time varying due to the phase noise.
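To spell out the down-conversion step (a sketch; mixer conversion loss and filter gain are absorbed into the prefactors): multiplying the RF and LO signals at the same frequency \$\nu\$ gives

$$\cos(2\pi\nu t + \phi_{\mathrm{RF}})\cos(2\pi\nu t + \phi_{\mathrm{LO}}) = \tfrac{1}{2}\cos(\phi_{\mathrm{RF}} - \phi_{\mathrm{LO}}) + \tfrac{1}{2}\cos(4\pi\nu t + \phi_{\mathrm{RF}} + \phi_{\mathrm{LO}}),$$

and low-pass filtering removes the \$2\nu\$ term, leaving a DC output that depends on the phase difference alone.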

So then (I think) we would have a signal of the type (setting \$\phi = 0\$ for convenience)

$$V(t) = (V_a+\delta V_a(t)) \sin(\delta \phi(t))$$

But here is where my question comes in. We only want to measure the phase noise, not the amplitude noise, and I don't understand how this is achieved. The scheme we use is depicted here

[Schematic: SG1 and SG2, sharing a reference clock, drive the LO and RF ports of an IQ mixer; the Q output is terminated and the I output is read out with a voltmeter.]

It uses an IQ mixer (inputs LO and RF) with outputs I and Q, but we terminate Q and only use the output from I. According to my supervisor, this somehow gives a signal in which the amplitude fluctuations are not relevant. Personally, I don't see this; I feel like the amplitude dependence is still going to be carried through.

However, the argument my supervisor gives (which I don't follow) is that we tune our signal in the I-Q plane such that its mean lies along Q, so that a measurement of I is sensitive to the phase but insensitive to the amplitude. We do this using the voltmeter: while the two signal generators share the same clock, their phases are still not identical, so we set a relative phase on one of the two until the voltmeter shows (nearly) 0.

Could someone help me understand (preferably with an equation or two) why this is indeed not sensitive to the amplitude (modulation) noise?

Best Answer

The mixer output should not be sensitive to small variations in amplitude on the LO path, because the LO port is driven hard enough that the mixer operates in compression. So any amplitude error in the measurement will be dominated by the amplitude error from source SG2 on the RF path. In general, the amplitude error of a good signal generator is very low, so the error you measure is almost entirely a combination of the phase errors of the two sources.
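To connect this to the quadrature-tuning step in the question (a sketch, using the question's notation and a small-angle expansion): once the relative phase is tuned so the I reading averages to zero, the measured port output is

$$V_I(t) \propto (V_a + \delta V_a(t))\sin(\delta\phi(t)) \approx V_a\,\delta\phi(t) + \delta V_a(t)\,\delta\phi(t).$$

The first term is the phase noise scaled by the mean amplitude; the amplitude noise enters only multiplied by \$\delta\phi(t)\$, i.e. at second order in the fluctuations. Had the port instead been tuned to a maximum, the output would be \$\propto (V_a + \delta V_a(t))\cos(\delta\phi(t)) \approx (V_a + \delta V_a(t))(1 - \delta\phi(t)^2/2)\$: first-order sensitive to amplitude noise and only second-order sensitive to phase noise. The measured voltage PSD then converts to a phase PSD as \$S_\phi(f) \approx S_V(f)/V_a^2\$.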

As a check, you can measure the amplitude error by itself using a single signal generator. Split the output and apply the signal to both the LO and RF inputs.
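Why this isolates the amplitude noise (again a sketch in the question's notation): with one source split two ways, both ports carry (nearly) the same \$\phi(t)\$, so the phase noise cancels in the difference and the DC output depends only on the fixed path-length phase offset \$\theta\$ and the amplitude,

$$V_{\mathrm{out}}(t) \propto (V_a + \delta V_a(t))\cos\theta,$$

treating the LO port as amplitude-insensitive per the argument above. The fluctuating part is then first order in \$\delta V_a\$ alone.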