What Is Meant by the Stability of an Onboard Clock?
Hardware: Counter/Timers (TIO)
The accuracy specification of a clock describes the maximum deviation between the specified (nominal) clock frequency and the actual frequency. The stability specification gives a measure of how much the frequency varies over time.
If you run a clock at 20 MHz and the accuracy is 0.01% (0.0001 as a fraction), you can calculate the absolute minimum and maximum frequency as follows:
Maximum Frequency = 20,000,000 + (20,000,000 * 0.0001) = 20,002,000 Hz
Minimum Frequency = 20,000,000 - (20,000,000 * 0.0001) = 19,998,000 Hz
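The calculation above can be sketched for any clock as a small helper function (the function name and parameters are illustrative, not part of any NI API):

```python
def frequency_bounds(nominal_hz, accuracy_percent):
    """Return the (minimum, maximum) possible frequency in Hz for a
    clock with the given nominal frequency and accuracy in percent."""
    # Convert the percentage to a fraction, e.g. 0.01% -> 0.0001
    delta_hz = nominal_hz * (accuracy_percent / 100.0)
    return nominal_hz - delta_hz, nominal_hz + delta_hz

# 20 MHz clock with 0.01% accuracy
low, high = frequency_bounds(20_000_000, 0.01)
print(low, high)  # 19998000.0 20002000.0
```

Any frequency measurement made with this clock carries the same relative uncertainty, so the bounds scale directly with the nominal frequency.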
Anytime you run the clock, its frequency falls somewhere between that minimum and maximum. Suppose you want to measure the time that elapses between two events. If you use the 20 MHz clock, you know that your measurement is accurate to 0.01%. However, what if you measure the time between those same two events one year later? Can you compare the two measurements? What you need to know then is the stability of the clock, which is typically specified in percent per unit time (for example, 0.0001%/year) or parts per million per unit time (for example, 100 ppm/year).
In other words, you might turn on the clock and measure the time between two events on 100 separate occasions. If the clock drifts between the minimum and maximum -- that is, it is not very stable over time -- you cannot meaningfully compare the separate measurements. If the clock does not drift at all -- that is, it is absolutely stable over time -- you can compare the measurements directly.
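A stability specification in ppm per year translates into a worst-case frequency drift you can budget for. A minimal sketch of that conversion (the function name and parameters are hypothetical, chosen for illustration):

```python
def worst_case_drift_hz(nominal_hz, stability_ppm_per_year, years):
    """Worst-case frequency change in Hz after the given number of
    years, for a clock whose stability is given in ppm/year."""
    # 1 ppm = 1e-6 of the nominal frequency
    return nominal_hz * stability_ppm_per_year * 1e-6 * years

# 20 MHz clock rated at 100 ppm/year, measured again one year later
drift = worst_case_drift_hz(20_000_000, 100, 1)
print(drift)  # 2000.0 Hz of possible additional error
```

This drift adds to the initial accuracy error, so two measurements taken a year apart can differ by more than the accuracy specification alone suggests.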
KnowledgeBase 2MGEUR2H: Determining the Accuracy of the Sample Clock or the Accuracy of a Measured Frequency
White Paper: What Clock Error Means to Your Measurement System
Report Date: 11/06/1998
Last Updated: 05/11/2016
Document ID: 1F5DD5KD