Presentation
27 January 2017 Towards real-time tracking of hidden objects in real-life scenarios (Conference Presentation)
Susan Chan, Ryan E. Warburton, Genevieve Gariepy, Yoann Altmann, Steve McLaughlin, Jonathan Leach, Daniele Faccio
Proceedings Volume 9992, Emerging Imaging and Sensing Technologies; 99920P (2017) https://doi.org/10.1117/12.2241094
Event: SPIE Security + Defence, 2016, Edinburgh, United Kingdom
Abstract
Advances in LIDAR-based methods have enabled the detection and image reconstruction of static objects hidden from the direct line-of-sight [1, 2]. One drawback of the technology used in these demonstrations is the long acquisition time required. More recently, Gariepy et al. have shown that it is possible to detect and track a moving hidden object, albeit with no information about the object’s form [3]. Applications include, but are not limited to, search and rescue and hazard detection.

We present a real-time tracking system that detects moving objects outside the direct line-of-sight. Our active imaging system is a single-pixel variant of the technology reported by Gariepy et al.: it replaces the 1024-pixel single-photon avalanche diode (SPAD) camera with a small number of individual SPAD detectors that collect light back-scattered from the hidden object. The flexibility of single-pixel detectors provides an increased field of view, allowing us to detect and track simultaneously with better precision than a SPAD array, and they also offer high detection efficiency. We perform two proof-of-concept experiments using three pixels and a single pulsed laser to interrogate a “room” for a hidden object. In the first experiment, we demonstrate that we can accurately locate the position of a hidden object; in the second, we use the same system to accurately track the motion of a hidden object in real time.

The “room” is a purpose-built box measuring 102×102×77 cm, with optical access provided by a 28×12 cm window. The target object is a 15×15 cm textured viewing screen that we move along a designated ground track outside the line-of-sight of our system. In our experiments, we send a train of light pulses through the window to the back of the room. The pulses scatter off the wall as a spherical wavefront that propagates in all directions; some of this light reaches the hidden object and is scattered back towards the rear wall, onto which we image our three SPAD pixels. The SPAD detectors are capable of picosecond temporal resolution, and our time-correlated single-photon counting system measures the photon arrival times (64 ps resolution) for the signal returning to each detector. A histogram is built up over 80 million pulses in one second of acquisition time, and we use this temporal information to retrieve the position of the hidden target. In a seven-minute experiment, we place the object at 11 positions in turn and localise each one. We then perform real-time tracking, moving the object around the hidden scene for approximately one minute and processing the target position retrieval every 1.5 s.
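The abstract describes retrieving the hidden object’s position from the photon arrival times recorded by the three SPAD pixels. The sketch below is not the authors’ code; it is a minimal illustration of how such a time-of-flight retrieval could work under simplifying assumptions: a 2-D geometry, idealised noise-free histograms, hypothetical positions for the laser spot and the three observed wall points, and fixed instrument delays (laser to wall spot, wall point to detector) already subtracted, so that each histogram peak maps directly to the path length laser spot → object → observed wall point. Each such measurement confines the object to an ellipse with foci at the laser spot and the wall point; three detectors pin the position down, here via a simple grid search. The authors’ actual retrieval algorithm may differ (e.g. a statistical fit); this is only to make the geometry concrete.

```python
import numpy as np

C = 0.299792458        # speed of light in metres per nanosecond
BIN_WIDTH_NS = 0.064   # 64 ps TCSPC bin width (from the abstract)


def peak_delay_ns(histogram, bin_width_ns=BIN_WIDTH_NS):
    """Arrival time (ns) of the strongest return in a TCSPC histogram."""
    return np.argmax(histogram) * bin_width_ns


def locate_target(histograms, laser_spot, wall_points, room_size=(1.02, 1.02)):
    """Grid-search the 2-D object position that best explains three measured delays.

    histograms  : three 1-D photon-count arrays, one per SPAD pixel
    laser_spot  : (x, y) of the laser spot on the rear wall [m] (assumed known)
    wall_points : three (x, y) wall points observed by the SPAD pixels [m]
    """
    # Convert each histogram peak into a scattered path length in metres,
    # assuming fixed system delays have already been subtracted.
    measured = np.array([peak_delay_ns(h) * C for h in histograms])

    # Coarse grid of candidate object positions inside the 1.02 x 1.02 m room.
    xs = np.linspace(0.0, room_size[0], 200)
    ys = np.linspace(0.0, room_size[1], 200)
    best, best_err = None, np.inf
    for x in xs:
        for y in ys:
            p = np.array([x, y])
            predicted = np.array([
                np.linalg.norm(p - np.asarray(laser_spot)) +
                np.linalg.norm(p - np.asarray(w))
                for w in wall_points
            ])
            err = np.sum((predicted - measured) ** 2)
            if err < best_err:
                best, best_err = (x, y), err
    return best


if __name__ == "__main__":
    # Synthetic, noise-free check: an object hidden at (0.3, 0.6) m.
    true_pos = np.array([0.3, 0.6])
    laser_spot = (0.51, 0.0)                          # hypothetical laser spot
    wall_points = [(0.2, 0.0), (0.5, 0.0), (0.8, 0.0)]  # hypothetical SPAD fields of view
    hists = []
    for w in wall_points:
        path = (np.linalg.norm(true_pos - np.asarray(laser_spot)) +
                np.linalg.norm(true_pos - np.asarray(w)))
        h = np.zeros(4096)
        h[int(round(path / C / BIN_WIDTH_NS))] = 1000   # ideal single-bin peak
        hists.append(h)
    print(locate_target(hists, laser_spot, wall_points))  # ≈ (0.3, 0.6)
```

With the 64 ps bins quoted in the abstract, one timing bin corresponds to roughly 2 cm of path length, which sets the scale of the localisation precision this kind of retrieval could achieve before any sub-bin fitting or averaging.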
© (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Susan Chan, Ryan E. Warburton, Genevieve Gariepy, Yoann Altmann, Steve McLaughlin, Jonathan Leach, and Daniele Faccio "Towards real-time tracking of hidden objects in real-life scenarios (Conference Presentation)", Proc. SPIE 9992, Emerging Imaging and Sensing Technologies, 99920P (27 January 2017); https://doi.org/10.1117/12.2241094
KEYWORDS
Sensors, Data hiding, Picosecond phenomena, Avalanche photodiodes, Cameras, Imaging systems, Pulsed laser operation