Synchronization in telecommunications
Many services running on modern digital telecommunications networks require accurate synchronization for correct operation. For example, if telephone exchanges are not synchronized, then bit slips will occur and degrade performance. Telecommunication networks rely on the use of highly accurate primary reference clocks which are distributed network-wide using synchronization links and synchronization supply units.
Ideally, the clocks in a telecommunications network are synchronous: they are controlled to run at identical rates, or at the same mean rate with a fixed relative phase displacement that stays within a specified limited range. In practice, however, they may be mesochronous, and in common usage mesochronous networks are often described simply as synchronous.
History
Synchronization in communications was a difficult problem for Alexander Bain in the development of the teleprinter.[1] Thomas Edison achieved synchronization in his stock ticker with a crude but effective unison mechanism that resynchronized the printers periodically.[2] In the teleprinter world, Howard Krum devised a practical decoding mechanism for asynchronous start-stop signals around 1912.[3]
Synchronization remained a problem well into the electronic era. A general solution arrived with the phase-locked loop; once it became widely available, analog televisions, modems, tape drives, VCRs, and other common devices synchronized consistently.
Components
Primary reference clock (PRC)
Modern telecommunications networks use highly accurate primary master clocks that must meet the international standard's requirement for long-term frequency accuracy of better than 1 part in 10¹¹.[4] To achieve this performance, atomic clocks or GPS-disciplined oscillators are normally used.
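A rough sense of what this accuracy limit means in practice can be had from simple arithmetic. The sketch below (Python, purely illustrative and not part of any standard) estimates the time error accumulated per day by a clock at the 1-part-in-10¹¹ limit and how often a one-frame slip buffer between two such clocks would overflow; the 125 µs threshold corresponds to one 8 kHz PCM frame and is an assumption chosen for the example.

```python
# Illustrative arithmetic only: how a fractional frequency offset at the
# G.811 long-term accuracy limit translates into accumulated time error
# and controlled frame slips. The 125 us threshold (one 8 kHz PCM frame)
# is an assumption chosen for this example.

FREQ_OFFSET = 1e-11        # fractional frequency accuracy limit
SECONDS_PER_DAY = 86_400
FRAME_PERIOD = 125e-6      # one 8 kHz PCM frame, in seconds

# Time error accumulated per day by a single clock at the limit.
drift_per_day = FREQ_OFFSET * SECONDS_PER_DAY          # about 864 ns/day

# Two independent clocks can diverge at up to twice that rate, so a slip
# buffer one frame deep overflows roughly once every:
days_between_slips = FRAME_PERIOD / (2 * FREQ_OFFSET * SECONDS_PER_DAY)

print(f"drift per clock: {drift_per_day * 1e9:.0f} ns/day")          # 864 ns/day
print(f"worst-case slip interval: {days_between_slips:.0f} days")    # about 72 days
```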
Synchronization supply unit
Synchronization supply units (SSUs) are used to ensure reliable synchronization distribution. They have a number of key functions:
- They filter the synchronization signal they receive to remove higher-frequency phase noise.
- They distribute the signal through a scalable number of outputs used to synchronize other local equipment.
- They can continue to produce a high-quality output even when their input reference is lost; this is referred to as holdover mode (illustrated in the sketch following this list).
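A minimal sketch of two of these behaviors, low-pass filtering of phase noise and holdover after loss of the reference, is given below. The filter bandwidth, sampling period, and the SimpleSSU class and its methods are illustrative assumptions rather than anything defined by the standards.

```python
# Minimal sketch of an SSU's filtering and holdover behavior.
# All parameter values and names here are illustrative assumptions.
import math

class SimpleSSU:
    def __init__(self, bandwidth_hz=1e-3, sample_period_s=1.0):
        # First-order low-pass filter coefficient; a small value means
        # heavy smoothing of incoming phase noise.
        self.alpha = 2 * math.pi * bandwidth_hz * sample_period_s
        self.sample_period = sample_period_s
        self.filtered_phase = 0.0   # smoothed phase error, in seconds
        self.freq_estimate = 0.0    # last tracked fractional frequency offset
        self.in_holdover = False

    def update(self, phase_error):
        """Process one phase-error sample from the reference input,
        or None if the reference has been lost."""
        if phase_error is None:
            # Holdover: extrapolate the output using the last frequency estimate.
            self.in_holdover = True
            self.filtered_phase += self.freq_estimate * self.sample_period
        else:
            self.in_holdover = False
            previous = self.filtered_phase
            # Low-pass filtering removes higher-frequency phase noise.
            self.filtered_phase += self.alpha * (phase_error - previous)
            # Track the slow drift rate for later use in holdover.
            self.freq_estimate = (self.filtered_phase - previous) / self.sample_period
        return self.filtered_phase

# Example: track a noisy reference for an hour, then lose it for ten minutes.
ssu = SimpleSSU()
for k in range(3600):
    ssu.update(1e-8 + 1e-9 * math.sin(k))   # reference with phase noise
for k in range(600):
    ssu.update(None)                        # reference lost: holdover
print(ssu.in_holdover, ssu.filtered_phase)
```

A real SSU uses a far more sophisticated oscillator model and frequency estimator, but the structure is the same: filter while locked to the reference, extrapolate while in holdover.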
Quality metrics
In telecommunications networks, two key parameters are used to measure synchronization performance. They are defined by the International Telecommunication Union in its recommendation G.811, by the European Telecommunications Standards Institute in its standard EN 300 462-1-1, by the ANSI Synchronization Interface Standard T1.101 (which also defines profiles for clock accuracy at each stratum level), and by the Telcordia/Bellcore standards GR-253[5] and GR-1244.[6]
- Maximum time interval error (MTIE) is a measure of the worst-case phase variation of a signal with respect to a perfect signal over a given period of time.
- Time deviation (TDEV) is a statistical measure of the phase stability of a signal over a given period of time (both metrics are illustrated computationally in the sketch after this list).
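The sketch below shows how both metrics can be computed from a series of time interval error (TIE) samples taken at a fixed sampling period. It is a brute-force illustration under assumed test data, not a measurement-grade implementation, and the function names are chosen for the example.

```python
# Brute-force illustration of MTIE and TDEV computed from time interval
# error (TIE) samples x[0..N-1] taken at a fixed sampling period tau0.
import math

def mtie(x, n):
    """MTIE over a window of n sampling intervals: the worst-case
    peak-to-peak TIE seen in any window of n + 1 consecutive samples."""
    return max(max(x[k:k + n + 1]) - min(x[k:k + n + 1])
               for k in range(len(x) - n))

def tdev(x, n):
    """TDEV at tau = n * tau0, computed via the time variance (TVAR)
    estimator applied to the TIE samples x."""
    big_n = len(x)
    terms = []
    for j in range(big_n - 3 * n + 1):
        s = sum(x[i + 2 * n] - 2 * x[i + n] + x[i] for i in range(j, j + n))
        terms.append(s * s)
    return math.sqrt(sum(terms) / (6.0 * n * n * len(terms)))

# Example: a constant frequency offset produces a linearly growing TIE ramp.
# MTIE grows with the window length, while TDEV ignores a pure offset.
tie = [1e-9 * k for k in range(1000)]   # 1 ns of additional error per sample
print(mtie(tie, 10))   # about 1e-8 (10 ns over a 10-interval window)
print(tdev(tie, 10))   # 0.0 for a pure frequency offset
```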
See also
- PDH, SDH and SONET
- Caesium standard
- Synchronous network
- Isochronous signal
- Mesochronous network
- Plesiochronous system
- Asynchronous communication
- Phase-locked loop
References
- ^ Steven Roberts. "Distant Writing – Bain".
- ^ "Stock Ticker", Thomas A. Edison Papers, Rutgers, retrieved 2024-09-20
- ^ US Patent 1286351, issued December 1918
- ^ ITU-T Recommendation G.811 (09/97), Timing characteristics of primary reference clocks. ITU-T.
- ^ GR-253, Synchronous Optical Network (SONET). Telcordia.
- ^ GR-1244, Clocks for the Synchronized Network. Telcordia.
- This article incorporates public domain material from Federal Standard 1037C. General Services Administration. Archived from the original on 2022-01-22. (in support of MIL-STD-188).
- Bregni, Stefano (2002). Synchronization of Digital Telecommunications Networks. Wiley. ISBN 0-471-61550-1.