Free-space optical communication (FSOC) holds unmatched potential for high-bandwidth, secure communications while minimizing size, weight, and power (SWaP). However, atmospheric scintillation degrades the optical signal-to-noise ratio (OSNR) and Q-factor of high-bandwidth signals, limiting data-link performance. A critical contributor to this degradation is timing jitter, which may arise from timing deviations in the data signal itself or from amplitude variations in the data bit stream as it propagates through free space. These effects become more significant as the data bandwidth increases: a small timing deviation that would be tolerable, or remain within the receiver sensitivity, at a lower data rate becomes intolerable at higher data rates, where the same deviation consumes a larger fraction of the shorter bit period. Total jitter (TJ) can be further decomposed into deterministic jitter (DJ) and random jitter (RJ); this decomposition helps reveal signal behavior and the root cause of degradation in an FSOC link, or in any data communication link. Thus, for a system to achieve a desired bit-error rate (BER), an in-depth jitter analysis that investigates each subclass of DJ and RJ is extremely helpful and enhances the robustness of the link. In this paper, we report an in-depth jitter analysis of a 10 Gbps FSOC data link operating at 1550 nm.
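The section does not specify how TJ is combined from its DJ and RJ components; a widely used convention, offered here only as an illustration, is the dual-Dirac model, in which the peak-to-peak TJ at a target BER is TJ(BER) = DJ(pp) + 2·Q(BER)·RJ(rms), with Q(BER) the Gaussian quantile whose tail probability equals the BER. The sketch below assumes hypothetical DJ and RJ values; the 100 ps unit interval follows from the paper's 10 Gbps line rate.

```python
from statistics import NormalDist

def total_jitter(dj_pp_ps: float, rj_rms_ps: float, ber: float = 1e-12) -> float:
    """Dual-Dirac estimate of peak-to-peak total jitter at a target BER.

    TJ(BER) = DJ(pp) + 2 * Q(BER) * RJ(rms), where Q(BER) is the standard
    normal quantile at which the one-sided tail probability equals the BER.
    """
    q = NormalDist().inv_cdf(1.0 - ber)  # ~7.03 for BER = 1e-12
    return dj_pp_ps + 2.0 * q * rj_rms_ps

# Hypothetical example values for a 10 Gbps link (unit interval = 100 ps)
ui_ps = 100.0
tj = total_jitter(dj_pp_ps=12.0, rj_rms_ps=1.5)
print(f"TJ at BER 1e-12: {tj:.1f} ps ({tj / ui_ps:.1%} of the unit interval)")
```

The example makes the rate-dependence argument concrete: the same absolute TJ that occupies a third of a 100 ps unit interval at 10 Gbps would exceed the entire 25 ps unit interval at 40 Gbps.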