As previously demonstrated, timing noise/jitter is a very complicated phenomenon, and in order to understand how jitter can impact a system it needs to be measured and quantified. Historically, clock specifications have defined jitter from two different perspectives: measurements in the time domain, and measurements in the frequency domain.

**Time Domain Jitter Measurements**

There are three types of jitter that are measured in the time domain: period jitter, cycle-to-cycle jitter, and time interval error (TIE) jitter.

- Period jitter represents the deviation of an oscillator’s period over a specified number of clock cycles. Given a number of cycles (typically 10,000 or more), each period is measured in order to calculate the mean, standard deviation, and peak-to-peak value. Because this measures the clock period’s deviations over time, there is no reference to an ideal value. Any difference between the mean and the expected period can be treated as a static offset and easily designed around.
- Cycle-to-cycle jitter represents how much the oscillator’s period changes between any two adjacent cycles. Like period jitter, cycle-to-cycle jitter is not measured against an ideal value. It is reported as a peak value: the maximum change between the periods of any two consecutive clock cycles. For example, if one period is 20 ps below the average and an adjacent period is 15 ps above the average, the cycle-to-cycle jitter between them is 35 ps. Some measurement instruments calculate cycle-to-cycle jitter by applying a first-order difference operation to the period jitter data.
- TIE jitter measures how far each edge of the clock deviates from its ideal position. Unlike period and cycle-to-cycle jitter, TIE measurements require the tester to know where the ideal edges should be, which makes TIE difficult to observe in real time with an oscilloscope. In most cases the tester must capture and post-process the data. TIE may also be obtained by integrating the period jitter, after first subtracting the nominal (ideal) clock period from each measured period. TIE is important because it shows the cumulative effect that even a small amount of period jitter can have over time.
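The three definitions above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production measurement routine; the function name and the choice of statistics follow the descriptions in the bullets (RMS and peak-to-peak period jitter, peak cycle-to-cycle jitter, and TIE as the running sum of period errors):

```python
import numpy as np

def jitter_from_periods(periods, t_nominal):
    """Compute time-domain jitter metrics from a list of measured periods.

    periods:   measured clock periods (seconds), one per cycle
    t_nominal: the ideal (nominal) clock period (seconds)
    """
    periods = np.asarray(periods, dtype=float)

    # Period jitter: statistics of the period distribution itself;
    # no reference to the ideal value is needed.
    period_mean = periods.mean()
    period_rms = periods.std()                    # RMS period jitter
    period_pk_pk = periods.max() - periods.min()  # peak-to-peak

    # Cycle-to-cycle jitter: first-order difference of the periods,
    # reported as the peak absolute change between adjacent cycles.
    c2c_peak = np.abs(np.diff(periods)).max()

    # TIE: cumulative sum of each period's deviation from the nominal
    # period -- how far each edge has drifted from its ideal position.
    tie = np.cumsum(periods - t_nominal)

    return period_mean, period_rms, period_pk_pk, c2c_peak, tie
```

Note how a constant small period error produces a TIE that grows linearly with every cycle, which is the cumulative effect described above.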

**Phase Noise Jitter Measurements**

Phase noise is an important measure of spectral purity, which is the inherent stability of a timing signal. Consider a sinusoidal oscillator that outputs a sine wave at frequency Fc. An ideal oscillator has perfect spectral purity, with all of its power concentrated at Fc. Jitter causes variations in frequency, spreading power into adjacent frequencies and creating sidebands. By measuring the amount of power in the sidebands relative to the carrier, we can derive a jitter value.

Phase jitter can be calculated by measuring the phase noise spectral density of the clock signal and integrating over a specific frequency band of interest. The area under the spectrum plot represents the power of the phase-modulating (jitter-producing) noise, and that power is proportional to the square of the RMS phase jitter. In most cases the upper integration frequency can be specified up to the Nyquist frequency of the system. An application may only be interested in a portion of the phase noise plot and will typically specify that range, or apply a mask identifying the band of interest and the maximum dBc values permitted. This mask is often called a jitter mask.
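The integration described above can be sketched as follows. This is a simplified illustration, assuming a single-sideband phase noise curve L(f) in dBc/Hz sampled at a set of offset frequencies; it uses the standard conversion in which the integrated (double-sideband) noise power gives the RMS phase deviation in radians, which is then divided by 2πFc to express the jitter in seconds:

```python
import numpy as np

def rms_phase_jitter(offset_hz, L_dbc_hz, f_carrier, f_lo, f_hi):
    """Integrate an SSB phase noise curve L(f) over [f_lo, f_hi]
    and convert the result to RMS phase jitter in seconds.

    offset_hz: offset frequencies from the carrier (Hz), ascending
    L_dbc_hz:  phase noise values in dBc/Hz at those offsets
    f_carrier: clock (carrier) frequency in Hz
    """
    f = np.asarray(offset_hz, dtype=float)
    L = np.asarray(L_dbc_hz, dtype=float)

    # Keep only the band of interest (the jitter mask range).
    band = (f >= f_lo) & (f <= f_hi)
    f, L = f[band], L[band]

    # Convert dBc/Hz to linear power and integrate (trapezoidal rule).
    # The factor of 2 accounts for both sidebands.
    noise_power = np.trapz(10.0 ** (L / 10.0), f)
    phi_rms = np.sqrt(2.0 * noise_power)   # RMS phase deviation, radians

    # Convert radians of phase into seconds of jitter.
    return phi_rms / (2.0 * np.pi * f_carrier)
```

As a sanity check, a flat -120 dBc/Hz floor integrated from 12 kHz to 20 MHz on a 100 MHz carrier works out to roughly 10 ps RMS.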

In order to specify a jitter mask, the system’s transfer function must be analyzed. The mask specifies clock performance in terms of how it will actually affect the system, with the goal of avoiding over-specification. Today, many jitter masks are predetermined by industry protocols, the most common being the SONET requirement (12 kHz to 20 MHz) and the related telecom standard for the OC-48 optical carrier network. This jitter mask has become an unofficial industry standard, regardless of whether the clock will be used in a SONET application.

This can be problematic, because the 12 kHz to 20 MHz jitter bandwidth cannot always be measured, or even make sense, if the carrier (clock) frequency is too low. Practical systems have digital sampling, or equivalent mixing and filtering, limitations that require the minimum carrier frequency to be roughly 2× or more the maximum offset frequency. Even if that were not the case, one cannot directly measure a clock with a carrier frequency below 20 MHz at offsets out to 20 MHz. For this reason, designers of both the oscillator and the system must have a firm understanding of exactly what they are asking for in a phase jitter specification.

We have now defined several methods of assigning a value to jitter, which leads to the next question: we know WHAT we want to measure, but HOW do we measure it?