Data fusion for improved camera-based detection of respiration in neonates
20 February 2018
Monitoring respiration during neonatal sleep is notoriously difficult due to the nonstationary nature of the signals and the presence of spurious noise. Current approaches rely on adhesive sensors, which can damage the fragile skin of premature infants. Recently, non-contact methods using low-cost RGB cameras have been proposed to acquire this vital sign from (a) motion or (b) photoplethysmographic signals extracted from the video recordings. Recent developments in deep learning have yielded robust methods for subject detection in video data. Here, we present a novel technique for combining respiratory information from high-level visual descriptors provided by a multi-task convolutional neural network. Using blind source separation, we find the combination of signals which best suppresses pulse and motion distortions and subsequently use this to extract a respiratory signal. Evaluation results were obtained from recordings of 5 neonatal patients nursed in the Neonatal Intensive Care Unit (NICU) at the John Radcliffe Hospital, Oxford, UK. We compared respiratory rates derived from this fused breathing signal against those measured using the gold standard provided by the attending clinical staff. We show that respiratory rate (RR) can be accurately estimated over the entire range of respiratory frequencies.
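The fusion step described above can be illustrated with a minimal, hypothetical sketch. This is not the authors' implementation: it uses PCA whitening as a simple stand-in for the full blind source separation stage, assumes a set of candidate signals already extracted from video, and selects the component whose spectral power is most concentrated in an assumed respiratory band before estimating RR from the dominant in-band frequency.

```python
import numpy as np

def fuse_and_estimate_rr(signals, fs, band=(0.5, 1.5)):
    """Sketch of signal fusion for respiration (hypothetical, not the paper's code).

    signals : list of 1-D candidate respiratory signals (e.g. motion, PPG traces)
    fs      : sampling rate in Hz
    band    : assumed respiratory band in Hz (0.5-1.5 Hz is an illustrative choice)
    Returns the selected fused component and an RR estimate in breaths/min.
    """
    X = np.asarray(signals, dtype=float)           # shape: (n_signals, n_samples)
    X = X - X.mean(axis=1, keepdims=True)          # remove DC offsets

    # PCA whitening: a simple stand-in for the blind source separation step
    cov = np.cov(X)
    vals, vecs = np.linalg.eigh(cov)
    W = vecs / np.sqrt(vals + 1e-12)
    S = W.T @ X                                    # decorrelated components

    # Score each component by the fraction of power in the respiratory band
    freqs = np.fft.rfftfreq(S.shape[1], d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    spectra = np.abs(np.fft.rfft(S, axis=1)) ** 2
    ratios = spectra[:, in_band].sum(axis=1) / (spectra.sum(axis=1) + 1e-12)
    best = int(np.argmax(ratios))

    # RR estimate: dominant in-band frequency of the best component, in breaths/min
    peak_hz = freqs[in_band][np.argmax(spectra[best][in_band])]
    return S[best], 60.0 * peak_hz
```

In this sketch, pulse distortions (typically above ~2 Hz in neonates) fall outside the assumed band and so are penalized by the band-power score, which loosely mirrors the paper's goal of suppressing pulse and motion interference in the fused breathing signal.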
© (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
João Jorge, Mauricio Villarroel, Sitthichok Chaichulee, Kenny McCormick, Lionel Tarassenko, "Data fusion for improved camera-based detection of respiration in neonates", Proc. SPIE 10501, Optical Diagnostics and Sensing XVIII: Toward Point-of-Care Diagnostics, 1050112 (20 February 2018); https://doi.org/10.1117/12.2290139
