Signal fading is a widely observed phenomenon in communication and sensing applications, resulting in spatially and temporally varying degradations in received signal power. Specifically, in distributed acoustic sensing (DAS) applications based on phase-sensitive optical time-domain reflectometry (phase-OTDR), optical signal fading manifests as random, dramatic fluctuations in signal power, which in turn cause substantial variations in threat detection sensitivity. In this paper, we study optical signal fading in phase-OTDR based DAS from a signal processing perspective and analyze its undesired effects on threat detection performance. Using a detailed phase-OTDR signal model, we analyze how internal system parameters and external vibration source characteristics affect optical fading. Based on these analyses, we define the conditions under which optical fading can manifest itself as a dramatic variation in threat detection performance.
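The fading mechanism described above can be illustrated with a minimal sketch (not the paper's detailed signal model): the backscattered field at one fiber location is the coherent sum of many Rayleigh scatterer contributions within the pulse width, so the received power follows an exponential distribution and deep fades occur regularly. The scatterer count, amplitude distribution, and trial count below are hypothetical illustration parameters.

```python
# Minimal fading sketch (assumed simplified model, not the paper's):
# the field at one fiber location is a coherent sum of many
# random-phase backscatter contributions within the pulse width.
import numpy as np

rng = np.random.default_rng(0)

def backscatter_power(n_scatterers=100, n_trials=10000):
    """Received power when n_scatterers random-phase reflections interfere."""
    amp = rng.rayleigh(scale=1.0, size=(n_trials, n_scatterers))
    phase = rng.uniform(0.0, 2.0 * np.pi, size=(n_trials, n_scatterers))
    field = np.sum(amp * np.exp(1j * phase), axis=1)  # coherent summation
    return np.abs(field) ** 2

p = backscatter_power()
# Power is (approximately) exponentially distributed, so deep fades
# (power far below the mean) occur with non-negligible probability.
fade_fraction = np.mean(p < 0.1 * p.mean())
```

For an exponential power distribution, the expected fraction of samples below 10% of the mean is 1 - exp(-0.1), roughly 9.5%, which is why detection sensitivity at a fixed location can collapse intermittently.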
The extinction ratio (ER) is an inherent limiting factor with a direct effect on the detection performance of phase-OTDR based distributed acoustic sensing systems. In this work, we present a model-based analysis of Rayleigh scattering to simulate the effects of the ER on the received signal under varying signal acquisition scenarios and system parameters. These scenarios are constructed to represent typically observed cases, such as multiple interfering vibration sources clustered around the target vibration source to be detected, continuous-wave light sources with center frequency drift, varying fiber optic cable lengths, and varying ADC bit resolutions. The results show that an insufficient ER can produce a high optical noise floor and effectively mask the benefits of otherwise elaborate system improvement efforts.
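The ER-induced noise floor can be sketched under a strongly simplified assumption (not the paper's full model): with finite ER, light leaks through the pulse modulator between pulses, so backscatter from the entire fiber interferes with backscatter from the pulse-illuminated section. The section lengths, trial counts, and the specific ER values compared below are hypothetical illustration parameters.

```python
# Hedged sketch of the ER noise floor (assumed simplified model):
# leakage light at -ER dB illuminates the whole fiber continuously,
# and its backscatter adds coherently to the pulse backscatter.
import numpy as np

rng = np.random.default_rng(1)

def leakage_to_signal_ratio(er_db, n_pulse=50, n_fiber=5000, n_trials=2000):
    """Mean leakage-backscatter power relative to pulse-backscatter power."""
    leak = 10.0 ** (-er_db / 20.0)  # leakage field amplitude vs. pulse
    phases_p = rng.uniform(0.0, 2.0 * np.pi, (n_trials, n_pulse))
    phases_f = rng.uniform(0.0, 2.0 * np.pi, (n_trials, n_fiber))
    sig = np.sum(np.exp(1j * phases_p), axis=1)          # pulse section
    noise = leak * np.sum(np.exp(1j * phases_f), axis=1)  # whole fiber
    return np.mean(np.abs(noise) ** 2) / np.mean(np.abs(sig) ** 2)

low_er = leakage_to_signal_ratio(20.0)   # poor extinction ratio
high_er = leakage_to_signal_ratio(50.0)  # good extinction ratio
```

Because the leakage illuminates a fiber section far longer than the pulse width, even a seemingly small leakage amplitude can integrate into a noise floor comparable to the signal itself, which is why improving other subsystems yields little benefit when the ER is insufficient.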