Electrical – Measuring the precision of a clock


I have been working on a project using high-precision clocks for some time now, but I have yet to find a clear description of how manufacturers and scientists confirm and verify the precision of clocks like OCXOs.

My approach has been to use time and test two clocks relative to each other. As time passes, the error between the two clocks accumulates and creates a larger uncertainty.
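To make that concrete, here is a rough sketch in Python of the kind of comparison I mean; `read_clock_a` and `read_clock_b` are hypothetical placeholders for whatever timestamp source each clock exposes, and the loop simply logs how the offset between the two accumulates:

```python
import time

# Hypothetical placeholders: in a real setup these would return a timestamp
# (in seconds) read from each of the two clocks under comparison.
def read_clock_a():
    return time.time()

def read_clock_b():
    return time.time()

def compare_clocks(interval_s=1.0, samples=10):
    """Log the accumulated time difference between two clocks.

    If clock B runs fast or slow relative to clock A by a fractional
    frequency offset y, the difference grows roughly linearly with time:
        delta(t) ~= delta(0) + y * t
    so differencing the first and last logged offsets gives an estimate
    of y -- the *relative* accuracy of the pair, not of either clock alone.
    """
    t0_a, t0_b = read_clock_a(), read_clock_b()
    offsets = []
    for _ in range(samples):
        time.sleep(interval_s)
        a, b = read_clock_a(), read_clock_b()
        # Time accumulated on B minus time accumulated on A since the start
        offsets.append((b - t0_b) - (a - t0_a))
    elapsed = (samples - 1) * interval_s
    y_estimate = (offsets[-1] - offsets[0]) / elapsed
    return offsets, y_estimate

if __name__ == "__main__":
    _, y = compare_clocks()
    print(f"estimated fractional frequency offset: {y:+.3e}")
```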

But how is this done with cutting-edge atomic clocks to NIST standards?

Best Answer

As with anything that is adjustable, when comparing it and calibrating it against something else that is adjustable, how do you know what is "right"? What is considered "more accurate" if everything is just compared against something else that is compared against something else? There has to be something at the end of the chain where you say "This is exact". Some base value that all others are, ultimately, compared to.

And that is the Caesium atom.

I found a good description of how they work online:

Inside a cesium atomic clock, cesium atoms are funneled down a tube where they pass through radio waves. If this frequency is just right (9,192,631,770 cycles per second), then the cesium atoms "resonate" and change their energy state.

A detector at the end of the tube keeps track of the number of cesium atoms reaching it that have changed their energy states. The more finely tuned the radio wave frequency is to 9,192,631,770 cycles per second, the more cesium atoms reach the detector.

The detector feeds information back into the radio wave generator. It synchronizes the frequency of the radio waves with the peak number of cesium atoms striking it.
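To make the shape of that feedback loop concrete, here is a toy sketch in Python. The line width, noise level, servo gain, and starting offset are all made-up numbers, not real beam-tube parameters, but the structure of the loop is the point:

```python
import random

random.seed(0)

F_CS = 9_192_631_770.0  # defined caesium hyperfine transition frequency, Hz
LINEWIDTH = 100.0       # toy resonance width, Hz (not a real beam-tube figure)

def detector_counts(f_rf):
    """Toy detector model: the closer the interrogating radio frequency is
    to the caesium transition, the more atoms change state and reach the
    detector (Lorentzian line shape plus a little noise)."""
    signal = 1.0 / (1.0 + ((f_rf - F_CS) / LINEWIDTH) ** 2)
    return signal + random.gauss(0.0, 0.001)

def lock_to_caesium(f_start, steps=300, dither=10.0, gain=100.0):
    """Crude dither servo: probe either side of the current frequency and
    steer toward whichever side gives more counts.  Real clocks use a far
    more refined modulation/demodulation servo, but the principle is the
    same: the atoms, not the oscillator, define what 'correct' means."""
    f = f_start
    for _ in range(steps):
        error = detector_counts(f + dither) - detector_counts(f - dither)
        f += gain * error  # positive error => the resonance lies above f
    return f

if __name__ == "__main__":
    locked = lock_to_caesium(F_CS + 200.0)  # start 200 Hz off
    print(f"locked at {locked:.1f} Hz (residual offset {locked - F_CS:+.2f} Hz)")
```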

A good caesium clock has a precision on the order of \$\frac{1}{3\times10^{15}}\$, which is far better than any crystal, OCXO, TCXO, or otherwise.
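To put a number like that in perspective, a quick back-of-the-envelope calculation (the OCXO figure of 1e-8 below is an assumed, typical order of magnitude, not a spec for any particular part):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def drift_per_year(fractional_accuracy):
    """Worst-case time error accumulated over one year by a clock whose
    fractional frequency error is `fractional_accuracy`."""
    return fractional_accuracy * SECONDS_PER_YEAR

# Caesium standard: ~1 part in 3e15 (the figure quoted above)
print(f"caesium: {drift_per_year(1 / 3e15) * 1e9:6.1f} ns/year")

# Illustrative OCXO figure of ~1e-8 (assumed order of magnitude, not a spec)
print(f"OCXO:    {drift_per_year(1e-8) * 1e3:6.1f} ms/year")
```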

So, having that as a baseline, you can now calibrate other systems against it, and further systems against those. The higher the accuracy you need from your nominal frequency, the "closer" to that source you want to get.

But as has already been mentioned in the comments, that's only half the story. The whole purpose of an OCXO or TCXO is not to make a crystal oscillate more closely to that precise "source" reference frequency, but to keep it oscillating at a fixed frequency. A crystal's resonant frequency drifts and changes depending on temperature. By either controlling the temperature (OCXO) or compensating for the changes in temperature (TCXO) you can either reduce or negate that drift.
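As a rough sketch of the TCXO idea (the frequency-versus-temperature coefficients below are invented for the illustration, not taken from any datasheet):

```python
F_NOM = 10_000_000.0   # nominal crystal frequency, Hz
T_REF = 25.0           # reference temperature, deg C

def crystal_ppm(temp_c):
    """Illustrative AT-cut-style frequency-vs-temperature curve, roughly
    cubic about the reference temperature.  The coefficients are invented
    for this example, not taken from any datasheet."""
    dt = temp_c - T_REF
    return 0.04 * dt - 0.0001 * dt ** 3   # deviation in parts per million

def uncompensated(temp_c):
    """Bare crystal: output frequency wanders with temperature."""
    return F_NOM * (1 + crystal_ppm(temp_c) * 1e-6)

def tcxo(temp_c):
    """TCXO idea: measure the temperature and apply the inverse of the
    known curve so the output stays near F_NOM even though the crystal
    itself is drifting."""
    correction_ppm = -crystal_ppm(temp_c)
    return uncompensated(temp_c) * (1 + correction_ppm * 1e-6)

for t in (-20, 0, 25, 50, 70):
    print(f"{t:4d} degC   bare {uncompensated(t):14.3f} Hz   "
          f"TCXO {tcxo(t):14.3f} Hz")
```

An OCXO attacks the same curve from the other side: instead of computing a correction, the oven holds the crystal at one fixed point on the curve so there is (ideally) nothing left to correct.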

Very often it doesn't matter one jot if you are a few Hz out when dealing with MHz or GHz frequencies. What matters is that you stay that same few Hz out and don't drift. It doesn't matter (for instance) that everyone would have to tune their TV to 512.000038MHz (that's what "fine tune" is for), but people would get annoyed if they had to keep re-tuning between 511.999381MHz and 512.000482MHz all the time depending on the weather.