Electronic – Clock of GPS receiver


As I understand it, GPS is supposed to work like this:

  • You have 4 (or more, but let's say 4) satellites whose clocks are very tightly synchronized (thanks to onboard atomic clocks)

  • You have a receiver that receives 4 pings / messages from these satellites, each containing a timestamp of its time of emission with respect to "GPS time"

  • Supposing the receiver time / clock only has a bias with respect to GPS time (i.e., a fixed, constant error), you can, from these 4 signals and the map of satellite positions, deduce your 3 Cartesian coordinates + the time bias of your receiver clock with respect to GPS time (a numerical sketch of this solve follows the list).
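
To make that concrete, here is a minimal numeric sketch of the solve in Python. The satellite positions, receiver position and clock bias are invented for illustration, and real-world corrections (satellite clock, ionosphere, troposphere, relativity) are ignored:

```python
# Minimal sketch: solve 4 pseudoranges for 3 coordinates + receiver clock bias.
import numpy as np

C = 299_792_458.0  # speed of light, m/s

# Hypothetical satellite positions in an ECEF-like frame, metres.
sats = np.array([
    [15_600e3,  7_540e3, 20_140e3],
    [18_760e3,  2_750e3, 18_610e3],
    [17_610e3, 14_630e3, 13_480e3],
    [19_170e3,  6_100e3, 18_390e3],
])

# True receiver position and clock bias (what we pretend not to know).
true_pos  = np.array([1_112e3, 2_223e3, 3_334e3])
true_bias = 85e-9  # 85 ns receiver clock bias

# Pseudoranges = geometric range + c * (receiver clock bias).
rho = np.linalg.norm(sats - true_pos, axis=1) + C * true_bias

# Gauss-Newton iteration; state x = [x, y, z, c*bias], starting at the origin.
x = np.zeros(4)
for _ in range(10):
    ranges = np.linalg.norm(sats - x[:3], axis=1)
    predicted = ranges + x[3]
    # Geometry matrix: unit vectors from satellites to receiver,
    # plus a column of ones for the clock term.
    G = np.hstack([(x[:3] - sats) / ranges[:, None], np.ones((4, 1))])
    dx = np.linalg.lstsq(G, rho - predicted, rcond=None)[0]
    x += dx

print("position error [m]:", x[:3] - true_pos)
print("clock bias estimate [ns]:", x[3] / C * 1e9)
```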

That's all fine in theory, but in practice:

  • The timing accuracy we are looking for is in nanoseconds (since c is involved and the goal is metre-level accuracy on pseudoranges; see the quick conversion after this list)

  • So is it technologically "easy" to ensure that the actual "sending" of the signal by the GPS satellites happens at the actual timestamp of the atomic clocks? I mean, it's going through an electronic system, then a real-world, physical, analog antenna. I'm not talking about adding a known delay (which is not a problem, as it can be accounted for as long as it is known), but what about unknown delay / jitter? For example, is it "easy" to ensure that the analog antenna's emission date is within requirements / specs?

  • Same question with the antenna / circuitry of the receiver?

  • Additionally, on the receiving end, there is also the problem of jitter / repeatability of the "time bias" with respect to the GPS clock. Since we're actually looking for nanoseconds, this does not seem obvious. I guess the receiver has a well-known, stable, low-frequency quartz (or similar) clock and a jittering VCO for nanosecond granularity? And the jitter of this VCO is low enough that it can be ignored when solving the GPS equation system under the fixed-bias assumption?
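
To put the nanosecond requirement from the first bullet into perspective, here is a quick conversion between timing error and pseudorange error:

```python
# Timing error expressed as pseudorange error.
c = 299_792_458.0  # speed of light, m/s
for dt_ns in (1, 3, 10, 100):
    print(f"{dt_ns:>4} ns of timing error -> {c * dt_ns * 1e-9:6.2f} m of range error")
```

So roughly 3 ns of unmodelled timing error already uses up a metre-level error budget.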

Of course, feel free to move the question elsewhere if Electrical Engineering is not the right community (in the end, I think the question is more about pure circuitry than aeronautics, so I chose to ask it here, but I may be wrong).

Best Answer

So is it technologically "easy" to ensure that the actual "sending" of the signal by the GPS satellites happens at the actual timestamp of the atomic clocks?

Delays in the onboard electronics are no concern, as long as they are constant. The clock offset of the space vehicle's Z-count against GPS time is compensated for as a whole. Variable delays, such as jitter, are a concern, but atomic clocks have excellent phase noise performance.
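
That per-satellite clock offset is broadcast as polynomial coefficients in the navigation message and removed by the receiver. A minimal sketch of applying such a correction follows; the polynomial form mirrors the GPS interface specification (IS-GPS-200), but the coefficient values are invented and the relativistic term is omitted:

```python
# Sketch: remove the broadcast satellite clock offset from a transmit time.
def sv_clock_correction(t, toc, af0, af1, af2):
    """Satellite clock offset from GPS time, in seconds, at GPS time t."""
    dt = t - toc                       # seconds since the clock reference epoch
    return af0 + af1 * dt + af2 * dt * dt

# Hypothetical broadcast coefficients.
af0, af1, af2 = 2.5e-5, 1.1e-12, 0.0   # s, s/s, s/s^2
toc = 345_600.0                        # clock data reference time, seconds of week

t_tx = 346_200.0                       # nominal transmit time, seconds of week
dt_sv = sv_clock_correction(t_tx, toc, af0, af1, af2)
print(f"SV clock offset: {dt_sv * 1e6:.3f} us -> corrected transmit time {t_tx - dt_sv:.6f} s")
```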

Phase center stability of the antenna is also a concern; the signal should (appear to) originate from a well-defined point in the space vehicle. This is by no means easy, e.g. the DoD got it wrong when they attached an L5 demonstration payload to SVN-49. This equipment introduced an elevation-dependent phase shift on the signal, rendering the whole space vehicle unusable for navigation purposes (read the story at InsideGNSS).

Same question with the antenna / circuitry of the receiver?

The receiver uses one antenna, one LNA, one VCO, one mixer, one filter and one ADC for all the signals. Dispersion (delay depending on frequency) is no concern, since all the signals occupy the same frequency. Any delay will affect all the signals the same way, so the relative timing of the signals is not affected. Delays will only result in a local clock error, not in a position error (a small numeric illustration follows).
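
Here is a small numeric illustration of that claim: because the linearised measurement model contains a column of ones for the clock state, a delay common to all signals maps one-for-one into the clock estimate and leaves the position untouched. The geometry matrix values below are made up:

```python
# Common delay vs. position: the clock column absorbs anything shared by all signals.
import numpy as np

# Hypothetical linearised geometry matrix: unit line-of-sight vectors plus a
# column of ones for the clock state.
G = np.array([
    [ 0.60,  0.45,  0.66, 1.0],
    [-0.30,  0.80,  0.52, 1.0],
    [ 0.10, -0.55,  0.83, 1.0],
    [-0.70, -0.20,  0.69, 1.0],
])

residuals = np.array([3.2, -1.1, 0.4, 2.0])   # metres, arbitrary misclosures
common_delay_m = 12.0                          # e.g. ~40 ns of cable/LNA delay times c

sol_a = np.linalg.solve(G, residuals)
sol_b = np.linalg.solve(G, residuals + common_delay_m)

print("position shift [m]:", sol_b[:3] - sol_a[:3])   # ~zero
print("clock shift    [m]:", sol_b[3] - sol_a[3])      # ~12.0
```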

Local oscillator phase noise will also result in local clock error and will not affect positioning. It can seriously restrict the receiver's ability to track the signal, but relatively cheap temperature-compensated crystal oscillators (TCXOs) are OK.

The receiver does not take timestamps; rather, it evaluates the relative phase of the signals. In order to cope with such a high frequency, L1 (1575.42 MHz) is downconverted to an IF of, let's say, 4.096 MHz and sampled with a 10 MHz clock (one sample every 100 ns). It is important to understand that this downconversion does not affect position accuracy, as a one-cycle phase shift of L1 translates into one cycle at the IF.
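
A scaled-down sketch of that point: mixing the carrier with a local oscillator shifts the frequency to the IF but carries the carrier's phase offset through unchanged. Frequencies are divided by 1000 here so the arrays stay small; the principle is the same at 1575.42 MHz:

```python
# Downconversion preserves the carrier phase offset at the IF.
import numpy as np

fs    = 10e6            # sample rate, 10 MHz as in the text
f_rf  = 1.57542e6       # stand-in for the L1 carrier (scaled down by 1000)
f_lo  = f_rf - 4.096e3  # LO chosen to give a 4.096 kHz "IF" (scaled down by 1000)
phi   = 0.7             # phase offset we want to see survive the mixer, radians

t  = np.arange(0, 2e-3, 1 / fs)                 # 2 ms of samples
rf = np.exp(1j * (2 * np.pi * f_rf * t + phi))  # received carrier (analytic form)
lo = np.exp(-1j * 2 * np.pi * f_lo * t)         # local oscillator
if_sig = rf * lo                                # mixer output at the IF

# Demodulate the IF and average: the result is exp(j*phi) if the phase survived.
f_if = f_rf - f_lo
recovered = np.angle(np.mean(if_sig * np.exp(-1j * 2 * np.pi * f_if * t)))
print(f"injected phase: {phi:.3f} rad, recovered at IF: {recovered:.3f} rad")
```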

This way, the receiver can easily detect a phase shift of one cycle, which corresponds to 19 cm along the line of sight. (So why do receivers not achieve 19 cm accuracy? The answer does not fit here.)
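
The 19 cm figure is simply the L1 carrier wavelength:

```python
# One L1 carrier cycle expressed as line-of-sight distance.
print(299_792_458.0 / 1575.42e6)   # ~0.1903 m
```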

Direction-dependent delay in the antenna is a concern for precision GPS, as is the reception of reflected signals (multipath). Precision receivers use choke-ring antennas or even fractal-element antennas to mitigate these effects.

I guess the receiver has a well-known, stable, low-frequency quartz (or similar) clock and a jittering VCO for nanosecond granularity?

Phase noise (power spectral density) of a TCXO is around -100 dBc/Hz at 100 Hz offset from the carrier. The PLL/VCO will add a few dB. This phase noise affects the SNR in a complicated way. As long as you do not need to track extremely weak signals or fast receiver dynamics (like rocket guidance), this performance is OK.
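
As a back-of-the-envelope sketch of what such a figure means as time jitter, assume (unrealistically) a flat -100 dBc/Hz across the integration band and a 10 MHz oscillator; both assumptions are only illustrative, real phase noise curves are not flat:

```python
# Rough conversion of a single-sideband phase noise figure into RMS jitter.
import math

L_dbc_hz = -100.0              # SSB phase noise, dBc/Hz (figure from the text)
f_low, f_high = 100.0, 100e3   # assumed integration band, Hz
f_osc = 10e6                   # assumed oscillator frequency, Hz

L_lin = 10 ** (L_dbc_hz / 10)           # linear power ratio per Hz
area  = L_lin * (f_high - f_low)        # integrated SSB noise power (flat assumption)
phase_jitter_rms = math.sqrt(2 * area)  # radians (factor 2 for both sidebands)
time_jitter_rms  = phase_jitter_rms / (2 * math.pi * f_osc)

print(f"RMS phase jitter: {phase_jitter_rms * 1e3:.2f} mrad")
print(f"RMS time jitter:  {time_jitter_rms * 1e12:.0f} ps")
```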

More precise oscillators can be used to enhance performance in different ways, for example by using narrower (digital) filters.

And the jitter of this VCO is low enough that it can be ignored when solving the GPS equation system under the fixed-bias assumption?

See above: you need not consider jitter for position errors, as it affects all signals by the same amount.