Jitter

Jitter in a digital system is a random or deterministic timing deviation from the required periodicity of a reference sample clock.
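
As a simple illustration of this definition, jitter can be quantified from measured clock-edge timestamps as the deviation of each clock period from its nominal value. The Python sketch below is purely illustrative: the 48kHz nominal clock and the simulated timestamps are assumptions, not measurements from any particular device.

    import numpy as np

    # Hypothetical clock-edge timestamps (seconds) for a nominal 48 kHz sample clock.
    nominal_period = 1.0 / 48_000
    rng = np.random.default_rng(0)
    ideal_edges = np.arange(1000) * nominal_period
    edge_times = ideal_edges + rng.normal(0.0, 100e-12, size=ideal_edges.size)  # 100 ps RMS edge noise

    # Period jitter: deviation of each measured clock period from the nominal period.
    periods = np.diff(edge_times)
    period_jitter_rms = np.std(periods - nominal_period)

    print(f"RMS period jitter: {period_jitter_rms * 1e12:.1f} ps")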

If jitter affects the clocking of an A-D converter, it causes the analogue signal to be sampled at slightly incorrect moments and thus generates amplitude errors in the digital data which cannot (easily) be removed. If the jitter is random, this typically results in a slight increase in high-frequency noise and distortion.
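
To see how timing errors become amplitude errors, the sketch below simulates random sampling jitter on a sine wave and estimates the resulting noise. The 10kHz test tone, 48kHz sample rate and 1ns RMS jitter are illustrative assumptions, not figures from the text.

    import numpy as np

    fs = 48_000           # sample rate (Hz), assumed for illustration
    f_in = 10_000         # test tone frequency (Hz)
    t_jitter = 1e-9       # 1 ns RMS random jitter
    n = 1 << 16

    rng = np.random.default_rng(1)
    t_ideal = np.arange(n) / fs
    t_actual = t_ideal + rng.normal(0.0, t_jitter, size=n)

    clean = np.sin(2 * np.pi * f_in * t_ideal)      # what an ideal ADC would capture
    jittered = np.sin(2 * np.pi * f_in * t_actual)  # samples taken at slightly wrong instants

    error = jittered - clean
    snr_db = 10 * np.log10(np.mean(clean**2) / np.mean(error**2))
    print(f"Jitter-limited SNR: {snr_db:.1f} dB")   # close to -20*log10(2*pi*f_in*t_jitter)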

If jitter affects the clocking of a D-A converter, the converter reproduces accurate digital samples at slightly incorrect times, so the reconstructed analogue waveform again contains amplitude errors. As before, if the jitter is random this typically results in a slight increase in high-frequency noise.
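
The size of the amplitude error is set by how fast the signal is changing: error is roughly the signal's slew rate multiplied by the timing error. For a full-scale sine of frequency f the peak slew rate is 2πf (in full-scale units per second), so a timing error of t seconds gives a worst-case error of about 2πf times t of full scale. The short calculation below uses a 20kHz tone and 1ns of jitter purely as an example.

    import math

    # Worst-case amplitude error from a timing error: error ~ slew rate * timing error.
    f = 20_000       # signal frequency (Hz), illustrative
    t_err = 1e-9     # 1 ns timing error, illustrative

    peak_error = 2 * math.pi * f * t_err            # fraction of full scale
    print(f"Peak amplitude error: {peak_error:.2e} of full scale "
          f"({20 * math.log10(peak_error):.0f} dBFS)")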

More problematic is when the jitter has a deterministic character, often having a fixed relationship to the system clocks or interfacing arrangements. This form of jitter creates unwanted spectral or tonal elements, and their typically anharmonic nature often makes them audible even at extremely low amplitudes.
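
As a sketch of why deterministic jitter sounds tonal, the example below applies a sinusoidal timing modulation to the sampling instants and inspects the spectrum: sidebands appear either side of the tone, offset by the jitter frequency. The 1kHz tone, 200Hz jitter frequency and 2ns deviation are arbitrary illustrative values.

    import numpy as np

    fs = 48_000
    f_sig = 1_000        # audio tone (Hz)
    f_jit = 200          # deterministic jitter frequency (Hz), illustrative
    a_jit = 2e-9         # 2 ns peak sinusoidal timing deviation, illustrative
    n = 1 << 16

    t_ideal = np.arange(n) / fs
    t_actual = t_ideal + a_jit * np.sin(2 * np.pi * f_jit * t_ideal)

    x = np.sin(2 * np.pi * f_sig * t_actual)         # tone sampled with periodic jitter
    spectrum = np.abs(np.fft.rfft(x * np.hanning(n)))
    spectrum_db = 20 * np.log10(spectrum / spectrum.max() + 1e-12)

    freqs = np.fft.rfftfreq(n, 1 / fs)
    for f in (f_sig - f_jit, f_sig, f_sig + f_jit):
        k = np.argmin(np.abs(freqs - f))
        print(f"{freqs[k]:7.1f} Hz: {spectrum_db[k]:6.1f} dB")  # sidebands at f_sig +/- f_jit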

There are a variety of technical means of reducing and eradicating jitter and, ideally, in a high-quality digital audio system jitter should be in the low picoseconds range. Some digital clocking systems manage to get jitter down into the femtosecond range!

Many digital interfaces, including S/PDIF and AES3, introduce some jitter (called interface jitter) as an inherent artefact of waveshape distortion caused by cable capacitance or fibre optic dispersion. 