A standoff biothreat detection and identification system for scanning large areas was designed, built and tested. The sensor is based on two-wavelength ultraviolet-light-induced fluorescence (UVLIF) measured from a distance. The concept calls for multiple sensor modalities, fused to give the required overall performance. It makes use of multiple cameras, ambient light reflectance, high-optical-power wavelength-modulated UV LED illumination, and synchronized fluorescence detection. A two-step operational mode is described along with results from independent demonstrations of each step. The first step is screening of the scene to recognize the surfaces that maximize the chances of biothreat detection and classification. This step used computer vision and artificial intelligence (semantic segmentation) for automation. The material constituting the surface is identified from color images. A second, monochrome camera gives total “fluorescence” images excited with an intensity-modulated 368 nm UV illuminator. The second demonstration is scanning of slides (the “scene” in this case) from 1.2 m away, threat detection (the spots on the slides) and classification via active multispectral fluorescence imaging at two excitation wavelengths (280 and 368 nm) and ambient light reflectance, at up to 0.5 m²/min. It is primarily the surface characteristics that drive the difficulty of detecting and classifying biological warfare agents (BWAs) on surfaces, along with the amount of BWA present on the surface. This presentation details the results obtained, the lessons learned and the envisioned way ahead.
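The wavelength-modulated illumination with synchronized detection described above amounts to lock-in recovery of a weak fluorescence signal against a strong ambient background. A minimal sketch of the idea, with invented modulation frequency, sampling rate and signal levels (none taken from the actual system):

```python
import numpy as np

# Illustrative parameters only -- the system's modulation frequency,
# sampling rate and signal levels are not stated in the abstract.
fs = 10_000.0          # effective sampling rate (Hz), assumed
f_mod = 97.0           # illuminator modulation frequency (Hz), assumed
t = np.arange(0, 1.0, 1.0 / fs)

fluo_amp = 0.05        # weak fluorescence response to the modulated UV
ambient = 1.0          # strong unmodulated ambient background
noise = 0.02 * np.random.default_rng(0).standard_normal(t.size)

# Detected intensity: ambient + modulated fluorescence + noise.
signal = ambient + fluo_amp * (1 + np.sin(2 * np.pi * f_mod * t)) / 2 + noise

# Synchronous (lock-in) demodulation: mix with the reference and average.
i_comp = np.mean(signal * np.sin(2 * np.pi * f_mod * t))
q_comp = np.mean(signal * np.cos(2 * np.pi * f_mod * t))
recovered = 2 * np.hypot(i_comp, q_comp)  # amplitude of the modulated part

print(f"true modulated amplitude: {fluo_amp / 2:.4f}")
print(f"recovered amplitude:      {recovered:.4f}")
```

The unmodulated ambient term averages out in the mixing step, which is why the modulated fluorescence can be isolated even when it is far weaker than the background.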
The quest for real-time high resolution is of prime importance for surveillance applications, especially in disaster management and rescue missions. Synthetic aperture radar provides meter-range resolution images in all weather conditions; however, since such systems are often installed on satellites, the revisit time can be too long to support real-time operations on the ground.
Synthetic aperture lidar can be lightweight and offers centimeter-range resolution. Mounted onboard an airplane or unmanned air vehicle, this technology would allow for timelier reconnaissance.
INO has developed a tabletop synthetic aperture radar prototype and further used a real-time optronic processor to generate images on demand. The early positive results using both technologies are presented in this paper.
Synthetic aperture radar (SAR) is a tool of prime importance for Earth observation; it provides day and night capabilities in various weather conditions. State-of-the-art satellite SAR systems are a few meters in height and width and achieve resolutions of less than 1 m with revisit times on the order of days. Today’s Earth observation needs demand higher-resolution imaging together with timelier data collection within a compact, low-power-consumption payload. Such needs arise in Earth observation applications such as disaster management for earthquakes, landslides, forest fires, floods and others. In these applications, the availability of timely, reliable information is critical to assess the extent of the disaster and to rapidly and safely deploy rescue teams.
Synthetic aperture lidar (SAL) is based on the same basic principles as SAR. Both rely on the acquisition of multiple electromagnetic echoes to emulate a large antenna aperture, providing the ability to produce high-resolution images. However, in SAL, much shorter optical wavelengths (around 1.5 μm) are used instead of radar ones (wavelengths around 3 cm). Since resolution is related to the wavelength, several orders of magnitude of improvement can theoretically be expected. Also, the sources, detectors and components are much smaller in the optical domain than their radar counterparts. The resulting system can thus be made compact, opening the door to deployment onboard small satellites, airborne platforms and unmanned air vehicles. This has a strong impact on the time required to develop, deploy and use a payload. Moreover, in combination with airborne deployment, revisit times can be made much shorter and the information can become accessible almost in real time. Over the last decades, studies from different groups have validated the feasibility of a SAL system for 2D imagery and, more recently, for 3D static target imagery.
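The wavelength argument can be made concrete with the standard synthetic aperture cross-range relation, δ_az = λR/(2L), where R is the range to the target and L the synthetic aperture length. The geometry below (10 km range, 100 m aperture) is invented for illustration and is not taken from the paper:

```python
# Cross-range (azimuth) resolution of a synthetic aperture of length L
# observed at range R: delta_az = lambda * R / (2 * L).
def azimuth_resolution(wavelength_m, range_m, aperture_m):
    """Synthetic aperture cross-range resolution (m)."""
    return wavelength_m * range_m / (2.0 * aperture_m)

R = 10_000.0   # slant range (m), assumed for illustration
L = 100.0      # synthetic aperture length (m), assumed for illustration

res_sar = azimuth_resolution(0.03, R, L)     # radar, lambda ~ 3 cm
res_sal = azimuth_resolution(1.5e-6, R, L)   # lidar, lambda ~ 1.5 um

print(f"SAR azimuth resolution: {res_sar:.2f} m")        # 1.50 m
print(f"SAL azimuth resolution: {res_sal * 1e3:.3f} mm") # 0.075 mm
```

For the same geometry, moving from 3 cm to 1.5 μm improves the theoretical cross-range resolution by a factor of 20,000, which is the "multiple orders of magnitude" referred to above.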
In this paper, an overview of the advantages of this emerging technology is presented. As well, simulations and laboratory demonstrations of deformation mapping using a tabletop synthetic aperture lidar system operated at 1.5 μm are reviewed. The transmitter and receiver of the fiber-based system are mounted on a translation stage that moves at a constant speed relative to the target (sand) located 25 cm away. The change in the 3D profile of the target is thereafter monitored with sub-millimeter precision using the multiple-pass SAL system. Results obtained with a SAL laboratory prototype are reviewed along with the potential applications for Earth observation.
Long-range land surveillance is a critical need in numerous military and civilian security applications, such as threat detection, terrain mapping and disaster prevention. A key technology for land surveillance, synthetic aperture radar (SAR) continues to provide high-resolution radar images in all weather conditions from remote distances. State-of-the-art SAR systems based on dual-use satellites are capable of providing ground resolutions of one meter, while their airborne counterparts obtain resolutions of 10 cm. Certain land surveillance applications such as subsidence monitoring, landslide hazard prediction and tactical target tracking could benefit from improved resolution. The ultimate limitation to the achievable resolution of any imaging system is its wavelength. State-of-the-art SAR systems are approaching this limit. The natural extension to improve resolution is thus to decrease the wavelength, i.e., design a synthetic aperture system in a different wavelength regime. One such system offering the potential for vastly improved resolution is Synthetic Aperture Ladar (SAL). This system operates at infrared wavelengths, ten thousand times smaller than radar wavelengths. This paper presents a SAL platform based on the INO Master Oscillator with Programmable Amplitude Waveform (MOPAW) laser, which has a wavelength sweep of Δλ = 1.22 nm, a pulse repetition rate up to 1 kHz and up to 200 μJ per pulse. The results for SAL 2D imagery at a range of 10 m are presented, indicating a reflectance sensitivity of 8%, and ground-range and azimuth resolutions of 1.7 mm and 0.84 mm, respectively.
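The reported wavelength sweep sets the theoretical slant-range resolution through the usual swept-frequency relation ΔR = c/(2ΔF), with ΔF = cΔλ/λ². A quick check, assuming a 1.5 μm centre wavelength (the centre wavelength is not stated in this abstract), gives a value consistent in order of magnitude with the reported 1.7 mm ground-range figure:

```python
C = 299_792_458.0   # speed of light (m/s)

# Swept-wavelength slant-range resolution: dR = c / (2 * dF), where the
# optical frequency sweep is dF = c * d_lam / lam**2.
lam = 1.5e-6        # assumed centre wavelength (m), near the MOPAW band
d_lam = 1.22e-9     # wavelength sweep reported for the MOPAW laser (m)

d_freq = C * d_lam / lam**2          # optical frequency sweep (Hz)
slant_res = C / (2.0 * d_freq)       # slant-range resolution (m)

print(f"frequency sweep: {d_freq / 1e9:.1f} GHz")         # ~162.6 GHz
print(f"slant-range resolution: {slant_res * 1e3:.2f} mm") # ~0.92 mm
```

Note that the expression simplifies to ΔR = λ²/(2Δλ); the ground-range value then depends on the viewing geometry, so the tabletop figure of 1.7 mm is plausible for an oblique incidence.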
Long range surveillance of infrastructure is a critical need in numerous security applications, both civilian and military.
Synthetic aperture radar (SAR) continues to provide high resolution radar images in all weather conditions from remote
distances. As well, Interferometric SAR (InSAR) and Differential Interferometric SAR (D-InSAR) have become
powerful tools adding high-resolution elevation and change detection measurements. State-of-the-art SAR systems based on dual-use satellites are capable of providing ground resolutions of one meter, while their airborne counterparts obtain
resolutions of 10 cm. D-InSAR products based on these systems can produce cm-scale vertical resolution image products. Deformation monitoring of railways, roads, buildings, cellular antennas, and power structures (e.g., power lines, wind
turbines, dams, or nuclear plants) would benefit from improved resolution, both in the ground plane and vertical
direction. The ultimate limitation to the achievable resolution of any imaging system is its wavelength. State-of-the-art
SAR systems are approaching this limit. The natural extension to improve resolution is thus to decrease the wavelength,
i.e. design a synthetic aperture system in a different wavelength regime. One such system offering the potential for vastly
improved resolution is Synthetic Aperture Ladar (SAL). This system operates at infrared wavelengths, ten thousand
times smaller than radar wavelengths.
This paper presents a laboratory demonstration of scaled-down infrastructure deformation monitoring with an
Interferometric Synthetic Aperture Ladar (IFSAL) system operating at 1.5 μm. Results show sub-millimeter precision on
the deformation applied to the target.
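The sub-millimeter deformation precision rests on the standard interferometric relation between line-of-sight displacement and phase: a displacement d changes the two-way path by 2d, so Δφ = 4πd/λ. A minimal sketch of the inversion (the generic D-InSAR relation, not the paper's specific processing chain):

```python
import math

# Differential interferometry: a line-of-sight displacement d changes the
# two-way optical path by 2*d, giving a phase shift d_phi = 4*pi*d/lambda.
# Inverting: d = lambda * d_phi / (4 * pi).
lam = 1.5e-6   # IFSAL operating wavelength (m)

def displacement_from_phase(d_phi_rad, wavelength_m=lam):
    """Line-of-sight displacement from an interferometric phase shift."""
    return wavelength_m * d_phi_rad / (4.0 * math.pi)

# One full 2*pi phase cycle corresponds to half a wavelength of displacement:
print(displacement_from_phase(2 * math.pi))   # 7.5e-07 m = lambda / 2
# Resolving 0.1 rad of phase at 1.5 um gives roughly 12 nm of sensitivity:
print(displacement_from_phase(0.1))
```

This is why operating at 1.5 μm instead of centimeter radar wavelengths translates directly into a proportionally finer deformation sensitivity for the same phase-measurement precision.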
A software application, SIST, has been developed for the simulation of the video at the output of a thermal imager. The approach offers a more suitable representation than current identification (ID) range predictors do: the end user can
evaluate the adequacy of a virtual camera as if they were using it in real operating conditions. In particular, the ambiguity in the interpretation of ID range is eliminated. The application also allows for a cost-efficient determination of the optimal design of an imager and of its subsystems without over- or under-specification: the performance is known early in the development cycle for the targets, scenes and environmental conditions of interest. The simulated imagery also provides a powerful means of testing processing algorithms. Finally, the display, which can be a severe system limitation, is also fully considered in the system through the use of real hardware components. The application consists of MATLAB™ routines that simulate the effects of the subsystems: atmosphere, optical lens, detector, and image processing algorithms. Calls to
MODTRAN® for the atmosphere modeling and to Zemax for the optical modeling have been implemented. The realism of the simulation depends on the adequacy of the input scene for the application and on the accuracy of the subsystem
parameters. For high accuracy results, measured imager characteristics such as noise can be used with SIST instead of
less accurate models. The ID ranges of potential imagers were assessed for various targets, backgrounds and atmospheric conditions. The optimal specifications for an optical design were determined by varying the Seidel aberration coefficients to find the worst MTF that still meets the desired ID range.
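The role of the optics MTF in such a chain can be sketched by cascading subsystem MTFs, since to first order the system MTF is their product. The analytic forms and parameter values below are textbook approximations chosen for illustration; SIST's actual models come from Zemax, MODTRAN® and measured component data:

```python
import numpy as np

# System MTF as the product of subsystem MTFs (first-order imaging-chain model).
f = np.linspace(0.0, 60.0, 601)     # spatial frequency (cycles/mm)

# Diffraction-limited optics MTF for a circular aperture (assumed cutoff).
f_cutoff = 100.0                    # optics cutoff frequency (cy/mm), assumed
nu = f / f_cutoff
mtf_optics = np.where(nu < 1.0,
                      (2 / np.pi) * (np.arccos(nu) - nu * np.sqrt(1 - nu**2)),
                      0.0)

# Detector footprint MTF: |sinc| of (frequency * pixel pitch).
pitch_mm = 0.015                    # 15 um detector pitch, assumed
mtf_detector = np.abs(np.sinc(f * pitch_mm))

mtf_system = mtf_optics * mtf_detector

nyquist = 1.0 / (2 * pitch_mm)      # detector Nyquist frequency (~33 cy/mm)
mtf_at_nyquist = np.interp(nyquist, f, mtf_system)
print(f"system MTF at detector Nyquist ({nyquist:.1f} cy/mm): {mtf_at_nyquist:.2f}")
```

Degrading the optics term (e.g., by adding aberrations) lowers the product directly, which is the mechanism exploited when searching for the worst acceptable MTF.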
Wavefront sensing is one of the key elements of an Adaptive Optics System. Although Shack-Hartmann WFS are the
most commonly used, whether for astronomical or biomedical applications, the high sensitivity and large dynamic range of the pyramid WFS (P-WFS) technology are promising and need to be further investigated for proper justification in
future Extremely Large Telescopes (ELT) applications. At INO, center for applied research in optics and technology
transfer in Quebec City, Canada, we have recently set out to develop a pyramid wavefront sensor (P-WFS), an option with which no other research group in Canada had any experience. A first version was built and tested in 2013 in collaboration with NRC-HIA Victoria. Here we present a second iteration of the demonstrator with an extended spectral
range, fast modulation capability and low-noise, fast-acquisition EMCCD sensor. The system has been designed with
compactness and robustness in mind to allow on-sky testing at the Mont-Mégantic facility, in parallel with a Shack-
Hartmann sensor so as to compare both options.
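For reference, the basic P-WFS measurement reduces the four pupil images produced by the pyramid to two slope maps via normalized intensity differences. A generic textbook sketch (sign conventions depend on the pupil layout; this is not INO's actual pipeline):

```python
import numpy as np

# Generic pyramid-WFS slope estimation: the pyramid prism splits the focal
# plane into four pupil images I1..I4; normalized intensity differences
# approximate the local wavefront slopes.
n = 32                                  # pupil sampling (pixels), assumed
flat = np.ones((n, n))

def pwfs_slopes(i1, i2, i3, i4):
    """Normalized x/y slope maps from the four P-WFS pupil intensities."""
    total = i1 + i2 + i3 + i4
    sx = ((i1 + i4) - (i2 + i3)) / total
    sy = ((i1 + i2) - (i3 + i4)) / total
    return sx, sy

# A flat wavefront illuminates the four pupils equally: slopes are ~0.
sx, sy = pwfs_slopes(flat, flat, flat, flat)
print(bool(np.allclose(sx, 0)), bool(np.allclose(sy, 0)))   # True True

# A pure x-tilt biases light toward pupils 1 and 4:
sx, sy = pwfs_slopes(1.2 * flat, 0.8 * flat, 0.8 * flat, 1.2 * flat)
print(round(float(sx.mean()), 3), round(float(sy.mean()), 3))   # 0.2 0.0
```

The modulation mentioned above linearizes this response over a larger dynamic range, trading some of the sensor's raw sensitivity, which is precisely the trade-off to be characterized against the Shack-Hartmann.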
Long-range land surveillance is a critical need in numerous military and civilian security applications, such as threat detection, terrain mapping and disaster prevention. A key technology for land surveillance, synthetic aperture radar (SAR) continues to provide high-resolution radar images in all weather conditions from remote distances. Recently, Interferometric SAR (InSAR) and Differential Interferometric SAR (D-InSAR) have become powerful tools adding high-resolution elevation and change detection measurements. State-of-the-art SAR systems based on dual-use satellites are capable of providing ground resolutions of one meter, while their airborne counterparts obtain resolutions of 10 cm. D-InSAR products based on these systems can produce cm-scale vertical resolution image products. Certain land surveillance applications such as land subsidence monitoring, landslide hazard prediction and tactical target tracking could benefit from improved resolution. The ultimate limitation to the achievable resolution of any imaging system is its wavelength. State-of-the-art SAR systems are approaching this limit. The natural extension to improve resolution is thus to decrease the wavelength, i.e., design a synthetic aperture system in a different wavelength regime. One such system offering the potential for vastly improved resolution is Synthetic Aperture Ladar (SAL). This system operates at infrared wavelengths, ten thousand times smaller than radar wavelengths. This paper discusses an initial investigation into a concept for an airborne SAL specifically aimed at land surveillance. The system would operate at 1.55 μm and would integrate an optronic processor on board to allow for immediate transmission of the high-resolution images to the end-user on the ground. Estimates of the size and weight, as well as the resolution and processing time, are given.
Synthetic aperture (SA) techniques are currently employed in a variety of imaging modalities, such as radar (SAR) and
ladar (SAL). The advantage of fine resolution provided by these systems far outweighs the disadvantage of having large
amounts of raw data to process to obtain the final image. Digital processors have been the mainstay for synthetic aperture processing since the 1980s; however, the original method was optical; that is, it employed lenses and other optical elements. This paper provides a global review of a compact, lightweight optronic processor that combines optical
and digital techniques for ultra-fast generation of synthetic aperture images. The overall design of the optronic processor
is detailed, including the optical design and data control and handling. As well, its real-time capabilities are
demonstrated. Example ENVISAT/ASAR images generated optronically are also presented and compared with
ENVISAT Level 1 products. As well, the extended capabilities of optronic processing, including wavefront correction
and interferometry are discussed. Finally, a tabletop synthetic aperture ladar system is introduced and SAL images
generated using the exact optronic processor designed for SAR image generation are presented.
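For contrast with the optronic approach, digital azimuth compression is essentially a matched-filter correlation of the received phase history with the expected azimuth chirp, usually implemented with FFTs. A minimal sketch with invented parameters (not the ENVISAT/ASAR values):

```python
import numpy as np

# Digital azimuth compression: correlate the received azimuth phase history
# with the expected quadratic-phase (chirp) reference. Illustrative values
# only; this is not the ENVISAT/ASAR processing chain.
n = 1024
T = 1.0                                   # aperture (slow) time (s), assumed
t = np.linspace(-T / 2, T / 2, n, endpoint=False)
k = 800.0                                 # azimuth chirp rate (Hz/s), assumed

echo = np.exp(1j * np.pi * k * t**2)      # point-target phase history
reference = echo.copy()                   # matched-filter reference chirp

# Fast correlation via FFT; the point target compresses to a sharp peak.
spectrum = np.fft.fft(echo) * np.conj(np.fft.fft(reference))
compressed = np.abs(np.fft.fftshift(np.fft.ifft(spectrum)))

print(f"peak position: {np.argmax(compressed)} (centre = {n // 2})")
print(f"peak-to-mean ratio: {compressed.max() / compressed.mean():.1f}")
```

The optronic processor performs this same correlation in the analog optical domain, with the Fourier transforms carried out by lenses instead of FFTs, which is the source of its speed advantage.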
Synthetic Aperture Radar (SAR) is a mature technology that overcomes the diffraction limit of an imaging system’s real
aperture by taking advantage of the platform motion to coherently sample multiple sections of an aperture much larger than the physical one. Synthetic Aperture Lidar (SAL) is the extension of SAR to much shorter wavelengths (1.5 μm vs 5 cm). This new technology can offer higher-resolution images by day or night as well as in certain adverse
conditions. It could be a powerful tool for Earth monitoring (ship detection, border surveillance, ocean monitoring) from aircraft, unmanned aerial vehicles (UAVs) or space platforms. A continuous flow of high-resolution images covering large areas would, however, produce a large amount of data, involving a high cost in terms of post-processing time. This paper presents a laboratory demonstration of a SAL system complete with image reconstruction based on optronic processing. This differs from the more traditional digital approach by its real-time processing capability. The SAL system is discussed and images obtained from a non-metallic diffuse target at ranges up to 3 m are shown, these images being processed by a real-time optronic SAR processor originally designed to reconstruct SAR images from ENVISAT/ASAR data.
Remote sensing or stand-off detection using controlled light sources is a well-known and often-used technique for
atmospheric and surface spatial mapping. Today, ground based, vehicle-borne and airborne systems are able to cover
large areas with high accuracy and good reliability. This kind of detection based on LiDAR (Light Detection and
Ranging) or active Differential Optical Absorption Spectroscopy (DOAS) technologies, measures optical responses from
controlled illumination of targets. Properties that can be recorded include volume back-scattering, surface reflectivity,
molecular absorption, induced fluorescence and Raman scattering. The various elastic and inelastic backscattering
responses allow the identification or characterization of the content of the target volumes or surfaces. INO has developed instrumentation to measure distance to solid targets and to monitor particles suspended in air or in water in real time.
Our full waveform LiDAR system is designed for use in numerous applications in environmental or process monitoring
such as dust detection systems, aerosol (pesticide) drift monitoring, liquid level sensing or underwater bathymetric
LiDARs. Our gated imaging developments are used as aids in visibility enhancement or in remote sensing spectroscopy.
Furthermore, when coupled with a spectrograph having a large number of channels, the technique becomes active
multispectral/hyperspectral detection or imaging allowing measurement of ultra-violet laser induced fluorescence (UV
LIF), time resolved fluorescence (in the ns to ms range) as well as gated Raman spectroscopy. These latter techniques
make possible the stand-off detection of bio-aerosols, drugs, explosives as well as the identification of mineral content
for geological survey. This paper reviews the latest technology developments in active remote sensing at INO and
presents on-going projects conducted to address future applications in environmental monitoring.
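The full-waveform ranging principle underlying these systems is time-of-flight: digitize the return, locate each echo, and convert round-trip time to distance via range = c·t/2. A minimal sketch with invented numbers (sampling rate, target distances and echo shapes are illustrative only):

```python
import numpy as np

C = 299_792_458.0   # speed of light (m/s)

# Full-waveform lidar ranging sketch. A real system adds calibration,
# detector-response handling, etc.; all parameters here are invented.
fs = 1e9                              # 1 GS/s digitizer, assumed
t = np.arange(0, 2e-6, 1 / fs)        # 2 us record

def echo(t0, amp, width=4e-9):
    """Gaussian-shaped return centred at round-trip time t0."""
    return amp * np.exp(-((t - t0) ** 2) / (2 * width ** 2))

# Two returns: a weak scatterer (e.g. a dust cloud) at ~45 m and a hard
# target at ~120 m, plus a little detector noise.
waveform = echo(2 * 45.0 / C, 0.3) + echo(2 * 120.0 / C, 1.0)
waveform += 0.005 * np.random.default_rng(0).standard_normal(t.size)

# Threshold the waveform, find each above-threshold region, and convert the
# peak sample of each region into a range.
above = waveform > 0.15
rises = np.flatnonzero(np.diff(above.astype(int)) == 1) + 1
falls = np.flatnonzero(np.diff(above.astype(int)) == -1) + 1
ranges = [C * t[start + np.argmax(waveform[start:stop])] / 2
          for start, stop in zip(rises, falls)]

print([round(r, 1) for r in ranges])
```

Keeping the entire digitized waveform, rather than only the first threshold crossing, is what allows a single shot to resolve both the distributed scatterer and the hard target behind it.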
Military targets such as aircraft and flares do not exhibit unique infrared signatures; their emissions are dominated by
combustion products (mostly water vapor, carbon dioxide and hydrogen chloride) and hot metal greybody emissions.
An algorithm has thus been developed to categorize target signatures based on their emission source components. The
signatures are then partitioned, based on their emission components, into groups of similar emission characteristics.
Using previous trial data, seven unique flare categories were defined. A second algorithm was finally developed to
exploit this signature description and interrogate individual field measurements for target detection and categorization.
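The partitioning idea can be sketched generically: represent each signature by the fractional contribution of its emission sources and group together signatures whose component vectors are close. All numbers below are invented for illustration, and the grouping rule is a simple leader-style clustering, not the paper's algorithm; the trial data and the seven flare categories are not reproduced here:

```python
import numpy as np

# Each signature is a vector of fractional emission contributions:
# [H2O, CO2, HCl, greybody]. Values are invented for illustration.
signatures = np.array([
    [0.50, 0.30, 0.00, 0.20],   # combustion-dominated, no HCl
    [0.48, 0.32, 0.02, 0.18],
    [0.20, 0.15, 0.40, 0.25],   # HCl-rich propellant
    [0.22, 0.13, 0.38, 0.27],
    [0.05, 0.05, 0.00, 0.90],   # greybody (hot metal) dominated
])

def categorize(sigs, tol=0.15):
    """Leader clustering: a signature joins the nearest existing category
    representative if within tol, otherwise it founds a new category."""
    reps, labels = [], []
    for s in sigs:
        d = [np.linalg.norm(s - r) for r in reps]
        if d and min(d) < tol:
            labels.append(int(np.argmin(d)))
        else:
            reps.append(s)
            labels.append(len(reps) - 1)
    return labels

print(categorize(signatures))   # -> [0, 0, 1, 1, 2]
```

A field measurement can then be interrogated the same way: compute its emission-component vector and assign it to the nearest category representative, flagging it as unknown if no representative is close enough.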