Electronic – How to Calibrate 32.768kHz crystal for PIC24 RTCC


I'm trying to figure out the best method for PIC24 RTCC crystal calibration.
Their application note states two methods: using a lookup table and using a reference system clock.

According to them the reference system clock method is best, but they recommend a system oscillator that is a multiple of the RTCC crystal oscillator, like 16.777MHz.

Has anyone actually tried this RTCC crystal calibration process for a PIC24?
I would appreciate some practical guidelines.
I'm using PIC24FJ128GA006.

Best Answer

Calibrating against the mains frequency, as Tony suggests, is a bad idea. Long-term accuracy may be good, but short-term accuracy isn't.

Tony is dismissive about my reference, but that's no problem; there are other sources which confirm this. (Note that he does use my reference to show an absolute accuracy of 10 mHz/50 Hz = 0.1 ppm (sic). It looks like he is so preoccupied with his 10\$^{-10}\$ that he doesn't see a factor-of-a-thousand error.) Maybe he accepts the authority of ENTSO-E, the "European Network of Transmission System Operators for Electricity". They should know. From this document:

Activation of PRIMARY CONTROL. PRIMARY CONTROL activation is triggered before the FREQUENCY DEVIATION towards the nominal frequency exceeds \$\pm\$20 mHz.

Maximum Permissible Quasi-Steady-State Frequency Deviation after Reference Incident. A quasi-steady-state FREQUENCY DEVIATION of \$\pm\$180 mHz away from the nominal frequency is permitted as a maximum value in the UCTE SYNCHRONOUS AREA after occurrence of a reference incident after a period of initially undisturbed operation. When assuming that the effect of self-regulation of the load is absent, the maximum permissible quasi-steady-state deviation would be \$\pm\$200 mHz.

This site gives you a real-time view of the deviation.

Even if we ignore the 200 mHz incidents, there are still the 20 mHz deviations. We're talking about 400 ppm, more than an order of magnitude larger than the error of the uncalibrated crystal, or 4000 ppm and two orders of magnitude if we take the reference incidents into account. So the conclusion remains the same: the line frequency's short-term accuracy is by no means good enough to calibrate a crystal.

The graph shows that a 50 Hz mains frequency continuously fluctuates between 49.9 Hz and 50.1 Hz; that's a 0.2 % error, or 2000 ppm. An uncalibrated watch crystal is 20 ppm accurate. (Horizontal scale is days.)
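To put those mains numbers side by side, here's a quick back-of-the-envelope sketch (the deviations are the ones quoted above; nothing PIC-specific):

```python
def ppm(deviation_hz, nominal_hz=50.0):
    """Convert a frequency deviation into parts per million of nominal."""
    return deviation_hz / nominal_hz * 1e6

print(ppm(0.020))  # primary-control threshold, +/-20 mHz  -> ~400 ppm
print(ppm(0.200))  # quasi-steady-state limit, +/-200 mHz  -> ~4000 ppm
print(ppm(0.100))  # everyday fluctuation, 49.9..50.1 Hz   -> ~2000 ppm
# versus ~20 ppm for an uncalibrated watch crystal
```

Every one of those figures dwarfs the 20 ppm you're trying to calibrate away.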

This device may be of help:

(image: Chip Scale Atomic Clock module)

It's a Chip Scale Atomic Clock which outputs a 10 MHz square wave with 1.5 \$\times\$ 10\$^{-10}\$ accuracy, several orders of magnitude better than a TCXO (Temperature Compensated Crystal Oscillator). Tune your oscillator so that you get 10 000 000 pulses from the CSAC over 32 768 cycles of your crystal (i.e. over one nominal second).
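The arithmetic behind that count, as a sketch (the sign convention in the helper is my own assumption, not from any datasheet):

```python
CSAC_HZ = 10_000_000   # 10 MHz reference from the CSAC
CRYSTAL_HZ = 32_768    # nominal watch-crystal frequency
GATE_CYCLES = 32_768   # count reference pulses over this many crystal cycles

# With a perfect crystal the gate is exactly one second:
expected = CSAC_HZ * GATE_CYCLES // CRYSTAL_HZ
print(expected)  # 10000000

def gate_error_ppm(measured_pulses):
    """Relative gate-time error in ppm.

    Positive means the gate was too long, i.e. the crystal runs slow."""
    return (measured_pulses - expected) / expected * 1e6

print(gate_error_ppm(10_000_200))  # 200 extra pulses -> crystal ~20 ppm slow
```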

Only 1500 dollars, which sounds like a bargain to me. (Your own fault, you should have mentioned a budget :-))

Cheaper? OK, this OCXO (Oven Controlled Crystal Oscillator) has 5 ppb (0.005 ppm) frequency stability and less than 0.1 ppm aging per year, for about 150 dollars. It's available in 16.384 MHz, which is a multiple of 32.768 kHz (500×). You mentioned wanting a multiple in your question, though there's really no need for one.
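The 500× ratio is easy to verify, and the same count-and-compare arithmetic works for any reference frequency, integer multiple or not (a sketch; the helper name is mine):

```python
OCXO_HZ = 16_384_000
CRYSTAL_HZ = 32_768

print(OCXO_HZ / CRYSTAL_HZ)  # 500.0 -- an exact multiple, but not required

def error_ppm(ref_counts, crystal_cycles, ref_hz=OCXO_HZ, xtal_hz=CRYSTAL_HZ):
    """Crystal error in ppm from counting reference cycles over a
    crystal-derived gate; works for any (non-integer) frequency ratio."""
    expected = ref_hz * crystal_cycles / xtal_hz
    return (ref_counts - expected) / expected * 1e6
```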

Some GPS receivers have a 1 PPS (Pulse Per Second) output, which should have high accuracy as well. You would have to count cycles of your own 32.768 kHz clock over at least 30 seconds to get to 1 ppm accuracy; even in the ideal case a single second only gives you 32 768 counts \$\pm\$1 count, which is about 30 ppm resolution.
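The gate-time figures follow directly from the \$\pm\$1-count quantization; a minimal sketch, assuming the counter granularity is the only error source:

```python
CRYSTAL_HZ = 32_768

def resolution_ppm(gate_seconds):
    """+/-1-count resolution when counting 32.768 kHz cycles between PPS edges."""
    return 1e6 / (CRYSTAL_HZ * gate_seconds)

def gate_for(target_ppm):
    """Gate time in seconds needed for a target +/-1-count resolution."""
    return 1e6 / (CRYSTAL_HZ * target_ppm)

print(resolution_ppm(1))   # ~30.5 ppm with a 1 s gate
print(resolution_ppm(31))  # just under 1 ppm
print(gate_for(1.0))       # ~30.5 s needed for 1 ppm
```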