How is the 1PPS signal generated in GPS receivers?

gps

The question might seem naive, but the time of flight from the satellite is unknown and constantly changing, the Doppler effect skews the received frequency (and is itself not constant, because of the relative trajectory), and satellites are constantly coming into and dropping out of view, forcing continual changes of the reference signal.

Moreover, the 1PPS is meant to have low jitter but also to be aligned with UTC (though I don't think it is claimed to be particularly accurate in that respect; it is mostly known for low jitter).

A secondary question: is the 1PPS phase tied to the carrier wave or to the demodulated signal?

Best Answer

The receiver maintains its own internal timebase, and some of the unknowns that it needs to solve for are the frequency and phase offsets between that local timebase and "GPS system time" as inferred from the received signals.

Once the receiver has those values, the 1PPS output is generated from that timebase. There is no "direct connection" to either the carrier or the modulation of any of the satellite signals — there is significant Doppler shift on those signals anyway.
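
To make that concrete, here is a rough Python sketch (not any actual receiver firmware) of the navigation least-squares solve: the receiver clock bias shows up as a fourth unknown alongside position, and it is exactly this solved-for offset between the local timebase and GPS system time that the 1PPS generation relies on. The satellite positions, true position, and clock bias below are made-up illustrative numbers.

```python
import numpy as np

def solve_position_and_clock(sat_pos, pseudoranges, iterations=10):
    """Iterative least squares for [x, y, z, c*dt] from four or more pseudoranges."""
    est = np.zeros(4)                          # position (m) and clock bias (m)
    for _ in range(iterations):
        vecs = sat_pos - est[:3]               # receiver-to-satellite vectors
        ranges = np.linalg.norm(vecs, axis=1)
        residuals = pseudoranges - (ranges + est[3])
        # Design matrix: negated unit line-of-sight components plus 1 for the clock term
        H = np.hstack([-vecs / ranges[:, None], np.ones((len(ranges), 1))])
        delta, *_ = np.linalg.lstsq(H, residuals, rcond=None)
        est += delta
    return est[:3], est[3]

# Made-up example: four satellites at plausible ECEF positions (metres)
sat_pos = np.array([
    [15600e3,  7540e3, 20140e3],
    [18760e3,  2750e3, 18610e3],
    [17610e3, 14630e3, 13480e3],
    [19170e3,  6100e3, 18390e3],
])
true_pos = np.array([1113e3, 6001e3, 1000e3])
true_bias_m = 85e3                             # ~283 us of receiver clock error, in metres
pseudoranges = np.linalg.norm(sat_pos - true_pos, axis=1) + true_bias_m

pos, bias_m = solve_position_and_clock(sat_pos, pseudoranges)
print(pos)                                     # recovers true_pos
print(bias_m / 299792458.0)                    # recovers the clock bias in seconds
```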

Low-end receivers use an inexpensive TCXO (temperature-compensated crystal oscillator) to drive the local timebase, but no attempt is made to frequency-lock that oscillator to GPS time. As a result, the 1PPS output may have some jitter in it, with the peak-to-peak value related to the period of the oscillator (usually on the order of 20-50 ns peak-to-peak).
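
As a toy illustration of where that figure comes from, the sketch below (assuming a 26 MHz TCXO and a made-up frequency error; real parts and clock rates vary) places each 1PPS edge on the first TCXO tick after the computed second boundary, so the edge error is quantized to somewhere within one clock period:

```python
import numpy as np

F_TCXO = 26e6                 # assumed 26 MHz TCXO: one period is ~38.5 ns
T_TICK_NS = 1e9 / F_TCXO
PPM_ERROR = 1.23              # assumed free-running error, within a typical +/-2 ppm spec

rng = np.random.default_rng(0)
frac = rng.uniform(0.0, 1.0)  # where the first second boundary falls within a tick period
errors_ns = []
for _ in range(1000):
    # The receiver knows when the UTC second occurs, but the output edge can only
    # be placed on the next TCXO tick, i.e. (1 - frac) of a period late.
    errors_ns.append(((1.0 - frac) % 1.0) * T_TICK_NS)
    # Each second the free-running TCXO slips by a non-integer number of ticks,
    # so the next boundary lands at a different point within the tick period.
    frac = (frac + F_TCXO * PPM_ERROR * 1e-6) % 1.0

errors_ns = np.array(errors_ns)
print("peak-to-peak 1PPS jitter: %.1f ns" % (errors_ns.max() - errors_ns.min()))
```

Some timing-grade receivers report this residual quantization error in a timing message (often called a sawtooth correction) so that downstream equipment can subtract it.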

Fancier high-precision receivers have the option of frequency-locking the local oscillator to GPS time, eliminating the jitter.

In my applications, I usually use low- to mid-range receivers that do not frequency-lock their timebases. In some of my designs, I don't care about the low-level jitter. In other designs, I have used a second PLL as a jitter filter to create my own low-jitter timebase.
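
As a rough sketch of that jitter-filter idea (the gains, the hypothetical VCXO, and the uniform jitter model are all illustrative assumptions, not a specific design), a narrow-bandwidth PI loop steering a local oscillator tracks the noisy GPS 1PPS on average while averaging most of the nanosecond-scale jitter out:

```python
import numpy as np

rng = np.random.default_rng(42)

KP = 0.02       # proportional gain: ppb of frequency correction per ns of phase error
KI = 0.0005     # integral gain: ppb per ns, accumulated once per second

phase_ns = 50.0         # initial offset of the cleaned 1PPS vs. true UTC, in ns
integ_ppb = 0.0
raw, cleaned = [], []

for second in range(2000):
    gps_edge_ns = rng.uniform(-20.0, 20.0)   # raw GPS 1PPS: ~40 ns p-p quantization jitter
    err_ns = gps_edge_ns - phase_ns          # phase detector: GPS edge minus local edge
    integ_ppb += KI * err_ns
    freq_ppb = KP * err_ns + integ_ppb       # PI controller steers the VCXO frequency
    phase_ns += freq_ppb                     # 1 ppb held for one second moves the edge by 1 ns
    raw.append(gps_edge_ns)
    cleaned.append(phase_ns)

raw, settled = np.array(raw[500:]), np.array(cleaned[500:])   # skip the settling transient
print("raw 1PPS jitter:     %.1f ns peak-to-peak" % (raw.max() - raw.min()))
print("cleaned 1PPS wander: %.1f ns peak-to-peak" % (settled.max() - settled.min()))
```

This is essentially what a GPS-disciplined oscillator does: the loop bandwidth is a trade-off between how quickly the local oscillator follows GPS and how much of the receiver's jitter leaks through.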