Electronic – Selecting the equipment with the optimal 10 MHz reference

clock · measurement · reference · test-equipment

In a larger test bench I have 5 synchronized instruments (signal generators, ARBs, VSAs, etc.). I am trying to decide which instrument to use as the "master". I have ruled out the older/cheaper units (Tektronix AFG3253 and HP 8648C) and am choosing between the following high-end devices:

  1. Rohde & Schwarz SMW200A ARB (with B22 Enhanced Phase Noise)
  2. Rohde & Schwarz SMF100A Signal Generator (with B22 Enhanced Phase Noise)
  3. Rohde & Schwarz FSW (with B4 OCXO Precision Reference Frequency)

As far as 10 MHz reference distribution goes (daisy chaining vs. BNC tees vs. something else), the choice does not seem to matter much. But I would still like to pick the optimal instrument.

I have been told I should use the FSW because it has "the best" reference, but without further reasoning this sounds like yet another gut feeling.

I have two general questions:

  1. What does it mean to have a "clean" 10 MHz reference in the first place? Low phase noise/jitter, or a super-stable 10 MHz with respect to temperature/aging, etc.?
  2. If it is the latter: Why would it matter? Take aging: if my setup is stable over short measurement timespans (say, hours), why would I care whether the reference is 10.00000 MHz or 10.00001 MHz? All instruments are synchronized anyway, so the exact value should not matter much.

Now I looked at the datasheets of the three devices and found that they only show "static" accuracy, temperature, aging, etc., but do not state jitter at all. That is counterintuitive, because I assumed jitter would be my most important criterion for the reasons discussed above.

The datasheet specs are shown here:

[datasheet excerpt: 10 MHz reference specifications of the three instruments]

I assume aging is not an issue, since I do not care about the exact value of the 10 MHz. Furthermore, I assume temperature is not an issue because all instruments are running continuously. Looking at the "Achievable initial calibration accuracy" of the FSW versus the frequency error of the SMW and the aging/temperature figures for the SMF (the only specs given), I would go for the FSW (5e-9), then the SMW as second choice, and surprisingly the SMF last. Jitter is not taken into consideration at all.
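
For perspective, a fractional spec like 5e-9 translates into absolute frequency error as follows (a rough sketch; the 1 GHz carrier is an arbitrary assumption, not from the datasheets):

```python
# Rough conversion of a fractional frequency spec into an absolute error.
# The 5e-9 figure is the FSW "achievable initial calibration accuracy"
# quoted above; the 1 GHz carrier is an arbitrary example value.

fractional_error = 5e-9   # dimensionless (delta-f / f)
f_ref = 10e6              # 10 MHz reference, Hz
f_carrier = 1e9           # hypothetical RF carrier, Hz

print(f"Error at reference: {fractional_error * f_ref:.4f} Hz")     # 0.0500 Hz
print(f"Error at carrier:   {fractional_error * f_carrier:.1f} Hz")  # 5.0 Hz
```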

Best Answer

What does it mean to have a "clean" 10 MHz reference in the first place? Low phase noise/jitter, or a super-stable 10 MHz with respect to temperature/aging, etc.?

Phase noise is usually not the primary concern, because the usual use of an instrument's 10 MHz reference input is as the reference for a PLL that generates some other frequency. And this PLL will tend to dramatically attenuate any reference jitter at offset frequencies above a few kHz.
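
As a rough illustration (a minimal first-order model; the 100 Hz loop bandwidth is an assumed round number, not taken from any of these instruments), the reference-to-output phase transfer of a PLL is low-pass, so reference jitter well above the loop bandwidth is strongly suppressed:

```python
import math

# First-order low-pass model of the reference-to-output phase transfer of a
# PLL. Real loops are higher order and also multiply phase noise by 20*log(N);
# this sketch only shows the roll-off relative to the passband.

LOOP_BW = 100.0  # assumed PLL loop bandwidth in Hz (placeholder value)

def ref_noise_suppression_db(offset_hz, bw_hz=LOOP_BW):
    """Suppression of reference phase noise at a given offset frequency."""
    mag = 1.0 / math.sqrt(1.0 + (offset_hz / bw_hz) ** 2)
    return -20.0 * math.log10(mag)

for offset in (10, 100, 1e3, 10e3, 100e3):
    print(f"{offset:>8.0f} Hz offset: {ref_noise_suppression_db(offset):5.1f} dB suppressed")
```

With these assumptions, reference noise at a 100 kHz offset is suppressed by roughly 60 dB, while noise inside the loop bandwidth passes through essentially unchanged.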

Stability against aging and temperature thus does tend to be the critical parameter.

If it is the latter: Why would it matter? Take aging: if my setup is stable over short measurement timespans (say, hours), why would I care whether the reference is 10.00000 MHz or 10.00001 MHz?

In your measurement it might not matter.

If you want to reproduce your result a year later, it might matter.

If you have a requirement for a particular frequency accuracy, and it's been more than a few hours or days since your instrument was calibrated, it might matter.
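
As a sketch of why time since calibration matters (assuming a constant aging rate; the 1e-7/year figure is a placeholder, not from the datasheets):

```python
# How frequency error accumulates after calibration, assuming a constant
# aging rate. The 1e-7/year rate is a placeholder; take the real number
# from the instrument's datasheet.

aging_per_year = 1e-7   # assumed fractional aging rate (delta-f / f per year)
f_ref = 10e6            # 10 MHz reference, Hz

for days_since_cal in (1, 30, 180, 365):
    drift = aging_per_year * (days_since_cal / 365.0)
    print(f"{days_since_cal:>3d} days: {drift:.2e} fractional "
          f"-> {drift * f_ref:.3f} Hz at 10 MHz")
```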

I assume aging is not an issue, since I do not care about the exact value of the 10 MHz.

If your measurement is not sensitive to small errors in the reference frequency, then the accuracy of the frequency reference might not be critical to you.

Looking at the "Achievable initial calibration accuracy" of the FSW versus the frequency error of the SMW and the aging/temperature figures for the SMF (the only specs given), I would go for the FSW (5e-9)

If you haven't recently sent your instrument for calibration according to the manufacturer's recommendations for achieving this spec, this spec is irrelevant.

All else being equal, I might pick the instrument that was calibrated most recently to use as the reference.

But since you implied that frequency accuracy is not critical to your measurement, the whole question is probably moot.
