Synthetic aperture lidar as a future tool for earth observation
17 November 2017
Proceedings Volume 10563, International Conference on Space Optics — ICSO 2014; 105633V (2017) https://doi.org/10.1117/12.2304256
Event: International Conference on Space Optics — ICSO 2014, 2014, Tenerife, Canary Islands, Spain
Abstract
Synthetic aperture radar (SAR) is a tool of prime importance for Earth observation; it provides day and night imaging capabilities in various weather conditions. State-of-the-art satellite SAR systems are a few meters in height and width and achieve resolutions of less than 1 m with revisit times on the order of days. Today's Earth observation needs demand higher-resolution imaging together with timelier data collection within a compact, low-power-consumption payload. Such needs arise in Earth observation applications such as disaster management of earthquakes, landslides, forest fires, floods and others. In these applications the availability of timely, reliable information is critical to assess the extent of the disaster and to rapidly and safely deploy rescue teams.

Synthetic aperture lidar (SAL) is based on the same basic principles as SAR. Both rely on the acquisition of multiple electromagnetic echoes to emulate a large antenna aperture, providing the ability to produce high-resolution images. However, SAL uses much shorter optical wavelengths (around 1.5 μm) instead of radar ones (wavelengths around 3 cm). Since resolution is related to the wavelength, several orders of magnitude of improvement can theoretically be expected. Also, the sources, detectors and components are much smaller in the optical domain than their radar counterparts. The resulting system can thus be made compact, opening the door to deployment onboard small satellites, airborne platforms and unmanned air vehicles. This has a strong impact on the time required to develop, deploy and use a payload. Moreover, in combination with airborne deployment, revisit times can be made much shorter and the information can become available almost in real time. Over the last decades, studies from different groups have validated the feasibility of a SAL system for 2D imagery and, more recently, for 3D static target imagery.

In this paper, an overview of the advantages of this emerging technology is presented. Simulations and laboratory demonstrations of deformation mapping using a tabletop synthetic aperture lidar system operating at 1.5 μm are also reviewed. The transmitter and receiver of the fiber-based system are mounted on a translation stage which moves at a constant speed relative to the target (sand) located 25 cm away. The change in the 3D profile of the target is thereafter monitored with sub-millimeter precision using the multiple-pass SAL system. Results obtained with a SAL laboratory prototype are reviewed along with the potential applications for Earth observation.
Turbide, Marchese, Terroux, and Bergeron: SYNTHETIC APERTURE LIDAR AS A FUTURE TOOL FOR EARTH OBSERVATION

I. INTRODUCTION

Synthetic aperture radar (SAR) is a tool of prime importance for Earth observation; it provides day and night imaging capabilities in various weather conditions. State-of-the-art satellite SAR systems are a few meters in height and width and achieve resolutions of less than 1 m with revisit times on the order of days. Today's Earth observation needs demand higher-resolution imaging together with timelier data collection within a compact, low-power-consumption payload. Such needs arise in Earth observation applications such as disaster management of earthquakes, landslides, forest fires, floods and others. In these applications the availability of timely, reliable information is critical to assess the extent of the disaster and to rapidly and safely deploy rescue teams.

Synthetic aperture lidar (SAL) is based on the same basic principles as SAR. Both rely on the acquisition of multiple electromagnetic echoes to emulate a large antenna aperture, providing the ability to produce high-resolution images. However, SAL uses much shorter optical wavelengths (around 1.5 μm) instead of radar ones (wavelengths around 3 cm). Since resolution is related to the wavelength, several orders of magnitude of improvement can theoretically be expected. Also, the sources, detectors and components are much smaller in the optical domain than their radar counterparts. The resulting system can thus be made compact, opening the door to deployment onboard small satellites, airborne platforms and unmanned air vehicles. This has a strong impact on the time required to develop, deploy and use a payload. Moreover, in combination with airborne deployment, revisit times can be made much shorter and the information can become available almost in real time. Over the last decades, studies from different groups have validated the feasibility of a SAL system for 2D imagery and, more recently, for 3D static target imagery.

In this paper, an overview of the advantages of this emerging technology is presented. Simulations and laboratory demonstrations of deformation mapping using a tabletop synthetic aperture lidar system operating at 1.5 μm are also reviewed. The transmitter and receiver of the fiber-based system are mounted on a translation stage which moves at a constant speed relative to the target (sand) located 25 cm away. The change in the 3D profile of the target is thereafter monitored with sub-millimeter precision using the multiple-pass SAL system. Results obtained with a SAL laboratory prototype are reviewed along with the potential applications for Earth observation.

II. SYNTHETIC APERTURE SYSTEMS

In remote sensing, conventional imagers must often sacrifice ground resolution for system compactness since in these systems the resolution is limited by the aperture size. A technique to obviate the diffraction limitation of an imaging system's real aperture is known as Synthetic Aperture (SA) and has been successfully employed at radio frequencies on both space-borne and airborne platforms for many years. These active imaging systems (Synthetic Aperture Radars or SARs) take advantage of the platform motion to coherently sample multiple sections, emulating an aperture much larger than the physical one. The backscattered data returns are then coherently reconstructed to produce the final high-resolution SAR image. Typical SAR systems operate at centimeter wavelengths, have antenna sizes close to ten meters and produce images with ground resolutions of less than ten meters. Synthetic Aperture Lidar (SAL) systems [1]-[5], operating at wavelengths one thousand times smaller than SAR systems, could thus potentially offer images with ground resolutions of tens of millimeters within a compact envelope. Such resolutions would provide critical and precise information for disaster management teams. Moreover, 3D SAL imaging is also possible for terrain mapping. In this paper the data acquired by a synthetic aperture lidar prototype are further processed with an optronic processor to illustrate a future real-time tool for Earth observation.
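The resolution scaling described above can be illustrated with a short numerical sketch. The formulas are the textbook first-order relations (real-aperture cross-range resolution λR/D and stripmap synthetic-aperture resolution D/2); the numbers are representative assumptions, not this paper's system parameters.

```python
# Illustrative comparison of real-aperture vs. synthetic-aperture azimuth
# resolution, and the wavelength ratio from SAR to SAL.
# All values below are representative assumptions.

def real_aperture_resolution(wavelength, range_m, aperture_m):
    """Diffraction-limited cross-range resolution of a real aperture: lambda*R/D."""
    return wavelength * range_m / aperture_m

def synthetic_aperture_resolution(aperture_m):
    """Classic stripmap SAR azimuth resolution: D/2, independent of range."""
    return aperture_m / 2.0

# SAR-like case: 3 cm wavelength, 10 m antenna, 800 km range
sar_real = real_aperture_resolution(0.03, 800e3, 10.0)   # 2400 m
sar_synth = synthetic_aperture_resolution(10.0)          # 5 m

# SAL uses the same synthetic-aperture principle at a ~20,000x shorter wavelength
ratio = 0.03 / 1.5e-6
print(f"real-aperture SAR: {sar_real:.0f} m, synthetic aperture: {sar_synth:.1f} m")
print(f"SAR/SAL wavelength ratio: {ratio:.0f}x")
```

The point of the sketch is the qualitative behavior: the synthetic-aperture result does not degrade with range, and shrinking the wavelength by four orders of magnitude shrinks the achievable resolution accordingly.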

Fig. 1. Illustration of a SAR image acquisition system

Fig. 2. Illustration of an early optronic SAR processing system

Typical SAR imaging consists of two distinct operations employing different technologies. SAR raw data acquisition, illustrated in Figure 1, is performed with a radar antenna that acts as both the transmitter and receiver. SAR image reconstruction is either digital, where the raw data are run through mathematical algorithms such as Range-Doppler or Chirp-Scaling on computers, or optronic (based on digital holography [6]-[8] and illustrated in Figure 2), where the raw data are coherently illuminated and lenses are inserted into the beam's path, focusing the raw data to form the image that is in turn captured by a digital camera or on film.
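As a minimal illustration of the matched-filter core shared by digital algorithms such as Range-Doppler, the sketch below compresses a simulated linear-FM echo by frequency-domain correlation with a replica chirp. All parameters are invented for illustration and unrelated to any specific system.

```python
import numpy as np

# 1D pulse-compression sketch: correlate a received linear-FM chirp with a
# matched replica. Parameters are illustrative.
fs = 100e6          # sample rate [Hz]
T = 10e-6           # chirp duration [s]
B = 20e6            # chirp bandwidth [Hz]
t = np.arange(int(T * fs)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t**2)   # baseband linear FM

# Simulated echo: the chirp delayed by 300 samples inside a longer record
echo = np.zeros(4096, dtype=complex)
echo[300:300 + chirp.size] = chirp

# Matched filtering in the frequency domain (circular correlation with replica)
n = echo.size
compressed = np.fft.ifft(np.fft.fft(echo, n) * np.conj(np.fft.fft(chirp, n)))
peak = int(np.argmax(np.abs(compressed)))
print("target located at sample", peak)   # -> 300
```

The long, low-amplitude chirp collapses into a sharp correlation peak at the target delay; 2D SAR reconstruction applies the same idea in both range and azimuth.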

Fig. 3. Photo of the optronic processor (left) and corresponding schematic (right)

A SAL system is essentially the equivalent of a SAR system in which the radar wavelength is replaced with an infrared wavelength a thousand times smaller, yielding a more compact system and eventually much higher resolution.

Figure 3 presents a photo (left) and schematic drawing (right) of a prototype optronic SAR processor designed specifically to reconstruct SAR images from ENVISAT/ASAR data [9]-[10]. In this implementation, the SAR raw data are fed into the system through two spatial light modulators (SLMs), one for the amplitude component and one for the phase component, illuminated by a coherent laser beam. The optronic processor is composed of two main sections. A SAR relay maps the amplitude information over the phase information. Once combined, both components are propagated simultaneously into the optical system. The light propagation, combined with the phase modification induced by the lenses, refocuses the raw data to generate a SAR image. Since the light propagation itself generates the image, the computation is performed at the speed of light. The actual processing capability of the processor is defined by the refresh rate of the SLMs, their dimensions, in this case 1920x1080 pixels, and the SAR parameters such as the azimuth and range compression ratios.
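A digital analogue can hint at what the lens arrangement does physically: the raw data from a point target is a two-dimensional quadratic-phase (Fresnel-zone) pattern, and a matched quadratic phase (playing the role of a lens) followed by a Fourier transform focuses it back to a point. This is a simplified sketch with illustrative chirp rates, not a model of the actual ENVISAT/ASAR processor optics.

```python
import numpy as np

# Point-target "hologram": quadratic phase in range (x) and azimuth (y).
# Chirp rates are illustrative assumptions.
N = 256
y, x = np.mgrid[-N//2:N//2, -N//2:N//2]
a_rg, a_az = 3e-3, 1e-3                      # range/azimuth chirp rates [1/sample^2]
raw = np.exp(1j * np.pi * (a_rg * x**2 + a_az * y**2))

# "Lens": the conjugate quadratic phase cancels the chirp; the FFT then
# focuses the flattened wavefront to a point, as light propagation does.
lens = np.exp(-1j * np.pi * (a_rg * x**2 + a_az * y**2))
image = np.fft.fftshift(np.fft.fft2(raw * lens))

peak = np.unravel_index(np.argmax(np.abs(image)), image.shape)
print("focused point at", peak)   # -> (128, 128), the image center
```

In the optronic processor this whole computation is carried out by free-space propagation through physical lenses, which is why the reconstruction happens effectively at the speed of light.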

III. SYNTHETIC APERTURE LIDAR SENSING AND PROCESSING

A. Fiber-optic SAL image acquisition system

A first laboratory SAL image acquisition set-up was designed and built. It is illustrated in Figure 4 and shown in Figure 5. The purpose of the prototype was to verify the feasibility of the synthetic aperture lidar concept. The system is based on an eye-safe tunable laser operating at 1.5 μm with an average output power of 8 mW. The pulse duration was set to 0.6 s and the pulse bandwidth to 1 THz. The optical fiber used was a single-mode SMF-28.

The chirped laser beam is sent onto a target, as in SAR systems, and the echo is captured through the same channel. The distance between the lens and the target is 30 cm. In parallel, part of the illumination beam is used as a reference: it is sent along a second path and reflected back to the source by a corner reflector. This reference plays a role similar to that of the local oscillator found in SAR systems. The beam collected from the target is combined with the reference beam and the interference pattern is recorded. This becomes the raw data that will be further processed with the SAR, now SAL, optronic processor. Typically the optical lengths of the reference and target paths are balanced, i.e. nearly equal.
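The detection principle can be sketched numerically: mixing the echo of a linearly chirped waveform with the reference copy yields a beat tone whose frequency is proportional to the path imbalance. The chirp rate below reuses the 1 THz over 0.6 s figures quoted for the set-up, while the path imbalance and detector sampling rate are assumptions chosen for illustration.

```python
import numpy as np

# Beat-frequency sketch for chirped heterodyne (stretch) detection.
c = 3e8
chirp_rate = 1e12 / 0.6            # [Hz/s] 1 THz swept in 0.6 s (from the set-up)
delta_range = 0.30                 # assumed one-way path imbalance [m]
tau = 2 * delta_range / c          # round-trip delay [s]
f_beat = chirp_rate * tau          # expected beat frequency [Hz]

# Ideal detected interference signal, sampled at an assumed 100 kHz
fs = 100e3
t = np.arange(int(0.1 * fs)) / fs
signal = np.cos(2 * np.pi * f_beat * t)

# Recover the beat frequency from the spectrum
freqs = np.fft.rfftfreq(t.size, 1 / fs)
f_est = freqs[np.argmax(np.abs(np.fft.rfft(signal)))]
print(f"beat frequency: {f_beat:.0f} Hz, estimated from spectrum: {f_est:.0f} Hz")
```

Because range maps to beat frequency, a slow, wide-band chirp lets a modest detector bandwidth resolve very fine range differences.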

Fig. 4. Schematic of the SAL laboratory set-up

The target, shown on the left part of Figure 6, was glued on a wood board. The SAL optical head was mounted on a translation stage controlled by a LabVIEW program. The beam width is much larger than the target since, with synthetic aperture techniques, features much smaller than the illumination spot size can be resolved.

Fig. 5. Photo of the SAL laboratory set-up

Fig. 6. Illustration of the target (left) and the SAL image (right)

B. Optronic processing of SAL data

SAL sensors will provide very high resolution but will also generate huge amounts of data. To cope with this large data generation rate, an optronic processor was used. The optronic processor exhibits real-time processing capabilities and will be a key element for any foreseen real-time application.

The data acquired by the SAL set-up are processed using the optronic SAR processor designed for ENVISAT/ASAR data. No changes were made to the optronic processor; only a straightforward scaling was performed to adapt the raw data focal lengths.

The SAL raw data are thus input on the SLMs, propagated through the processor, and the reconstruction is captured by the camera located at the exit of the optronic processor. Figure 7 shows a comparison between a theoretical SAL image (simulated image acquisition and processing, top) and the real processed SAL image (data taken with the fiber-based set-up and processed with the optronic processor, bottom). The optronically processed real image shows good agreement with the theoretical image. The three letters, I, N and O, are clearly seen, as well as details of the retro-reflective structure (diagonal lines).

Fig. 7. Theoretical SAL image (top) and real all-optical SAL image (bottom)

It can also be observed that the black diffusing wood board is imaged. This is an excellent result since diffusing materials exhibit much lower reflectivity. It is also a first step toward the use of SAL in a wide range of applications such as disaster management of earthquakes, landslides, forest fires, floods and others. Speckle can also be observed, as expected from a coherent imaging system. Quantitative analysis of the image further shows that the resolution obtained is better than 300 μm in range and 80 μm in azimuth, which compares nicely to the theoretical values of 255 μm and 57 μm respectively.

IV. SAL FOR 3D DEFORMATION

Another application for SAL is 3D mapping. Laboratory tests were performed in which small perturbations were introduced on a sand surface with a nut. The results show that SAL can detect changes on the order of hundreds of microns.
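The deformation measurement can be sketched numerically. Assuming the standard repeat-pass interferometric relation Δh = λΔφ/(4π), a 300 μm bump (on the order of the reported sand perturbations) spans many 2π fringes at λ = 1.5 μm, so the wrapped phase must be unwrapped before rescaling. The surface shape and sampling below are invented for illustration.

```python
import numpy as np

lam = 1.5e-6                                   # wavelength [m]
x = np.linspace(0.0, 1.0, 20001)               # normalized surface coordinate
# Assumed deformation: a 300 um Gaussian bump (e.g. from a buried nut)
deformation = 300e-6 * np.exp(-((x - 0.5) / 0.1) ** 2)

# Wrapped repeat-pass phase: dphi = 4*pi*dh/lambda, observed modulo 2*pi
phase = np.angle(np.exp(1j * 4 * np.pi * deformation / lam))

# Unwrap along the surface, then rescale phase back to height change
recovered = lam * np.unwrap(phase) / (4 * np.pi)

err = np.max(np.abs(recovered - deformation))
print(f"max reconstruction error: {err * 1e6:.6f} um")
```

The sketch also shows why dense spatial sampling matters: `np.unwrap` only succeeds while the phase change between neighboring samples stays below π, i.e. while the height change per sample stays below λ/4.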

Fig. 8. Sand before deformation (top) and SAL 3D image (bottom)

V. SAL FOR EARTH OBSERVATIONS

Various day and night Earth observation applications can be envisioned with SAL, such as global infrastructure monitoring, landslide monitoring and urban expansion mapping. The very high resolution level provides unique capabilities. SAL imaging from an airborne platform is possible: a flight prototype with a 1 m beam footprint was demonstrated in [11]. However, this small ground coverage is not practical for most infrastructure monitoring applications. To address Earth observation it is proposed to stretch the beam in the ground range direction to enhance ground coverage with a limited source power. A first-order evaluation study, based on the minimum requirement of one collected photon from each ground resolution element within each pulse duration, was performed (see Table 1). The resulting laser and acquisition requirements appear quite realistic given state-of-the-art lasers and analog-to-digital converters.

Table 1: Different sets of parameters fulfilling the minimum requirement of one collected photon from each ground resolution element within each pulse duration. Ground reflectance = 0.25, aircraft velocity = 55 m/s, λ = 1.55 μm, and the diameter of the reception optics is 20 cm.

Config | Q | DA [m] | DGr [m] | Flight altitude [m] | Peak laser power [W] | Mean laser power [W] | Pulse repetition frequency [kHz] | Pulse duration [ns] | Pulse bandwidth [GHz] | Samp. rate [GS/s] | Ground range res. [cm] | Az. res. [cm]
1 | 70 | 1 | 100 | 750 | 500 | 10 | 68 | 296 | 6 | 3 | 4 | 0.2
2 | 100 | 1 | 140 | 750 | 500 | 15 | 68 | 444 | 6 | 3 | 4 | 0.2
3 | 30 | 2 | 85 | 1500 | 500 | 20 | 68 | 592 | 3 | 1.2 | 10 | 0.2
4 | 140 | 1 | 200 | 1500 | 500 | 10 | 34 | 592 | 3 | 1.2 | 10 | 0.3

Q: stretch factor, DA: azimuth beam diameter, DGr: ground range beam diameter.

From these configurations it can be seen that a ground range resolution of 4 cm and an azimuth resolution of 0.2 cm are possible from an airborne platform. This confirms the outstanding imaging capabilities of a SAL sensor for Earth observation.
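The first-order photon budget behind Table 1 can be checked with a short calculation. Assuming a Lambertian target and a uniform beam footprint (our simplification; the paper does not detail its model), the sketch below estimates the photons collected per ground resolution element per pulse for a configuration like Config 1 (500 W peak, ~296 ns pulse, 1 m x 100 m footprint, 4 cm x 0.2 cm resolution element, 750 m altitude).

```python
import math

h, c = 6.626e-34, 3e8   # Planck constant [J*s], speed of light [m/s]

def photons_per_res_element(peak_w, pulse_s, wavelength, footprint_m2,
                            res_element_m2, reflectance, rx_diam, range_m):
    """Collected photons per ground resolution element per pulse (first order)."""
    pulse_energy = peak_w * pulse_s
    photons_tx = pulse_energy / (h * c / wavelength)      # transmitted photons
    on_element = photons_tx * res_element_m2 / footprint_m2
    # Lambertian return: collected fraction = rho * A_rx / (pi * R^2)
    rx_area = math.pi * (rx_diam / 2) ** 2
    return on_element * reflectance * rx_area / (math.pi * range_m ** 2)

# Values approximating Config 1 of Table 1 (pulse duration is our estimate
# from the tabulated mean/peak power and repetition rate)
n = photons_per_res_element(peak_w=500, pulse_s=296e-9, wavelength=1.55e-6,
                            footprint_m2=1.0 * 100.0, res_element_m2=0.04 * 0.002,
                            reflectance=0.25, rx_diam=0.20, range_m=750.0)
print(f"collected photons per resolution element: {n:.1f}")
```

Under these assumptions the estimate lands at a few photons per resolution element per pulse, i.e. above the one-photon minimum, consistent with the table's claim that the configurations are feasible.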

VI. CONCLUSIONS

A synthetic aperture lidar laboratory prototype was built and tested. The raw data generated by the SAL head were further processed optronically and compared with digitally processed images. The results showed good agreement between the experimental and theoretical resolutions. Images were obtained from both diffusive and retro-reflective materials, opening the door to a wide range of surveillance applications. The SAL system was used to confirm the possibility of detecting 3D changes on a sand surface. Furthermore, realistic scenarios for airborne SAL are proposed, showing the strong potential for Earth observation.

VII. ACKNOWLEDGEMENTS

INO would like to acknowledge the contribution of ESA ESTEC for their participation in the project through the loan of an Optronic SAR processor for the tests.

VIII. REFERENCES

[1] T. G. Kyle, "High resolution laser imaging system," Applied Optics, 28(13), pp. 2651–2656, 1989.

[2] M. Bashkansky, R. L. Lucke, E. Funk, L. J. Rickard, J. Reintjes, "Two-dimensional synthetic aperture imaging in the optical domain," Optics Letters, 27(22), pp. 1983–1985, 2002.

[3] R. L. Lucke, L. J. Rickard, "Photon-limited synthetic-aperture imaging for planet surface studies," Applied Optics, 41(24), pp. 5084–5095, 2002.

[4] S. M. Beck, J. R. Buck, W. F. Buell, R. Dickinson, D. Kozlowski, N. J. Marechal, T. J. Wright, "Synthetic-aperture imaging laser radar: laboratory demonstration and signal processing," Applied Optics, 44(35), pp. 7621–7629, 2005.

[5] W. Buell, N. Marechal, J. Buck, R. Dickinson, D. Kozlowski, T. Wright, S. Beck, "Demonstrations of Synthetic Aperture Imaging Ladar," Proc. SPIE 5791, pp. 152–166, 2005.

[6] L. J. Cutrona, E. N. Leith, L. J. Porcello, W. E. Vivian, "On the Application of Coherent Optical Processing Techniques to Synthetic Aperture Radar," Proceedings of the IEEE, 54(8), pp. 1026–1032, August 1966.

[7] J. C. Curlander and R. N. McDonough, Synthetic Aperture Radar: Systems and Signal Processing, John Wiley & Sons, Inc., New York, 1991.

[8] I. G. Cumming and F. H. Wong, Digital Processing of Synthetic Aperture Radar Data: Algorithms and Implementation, Artech House, Boston, 2005.

[9] L. Marchese, M. Doucet, B. Harnish, M. Suess, P. Bourqui, M. Legros, N. Desnoyers, L. Guillot, L. Mercier, M. Savard, A. Martel, F. Chateauneuf, A. Bergeron, "Ultra-Rapid Optronic Processor for Instantaneous ENVISAT/ASAR Scene Observation," IGARSS 2010, IEEE, Honolulu, HI, pp. 685–687, July 2010.

[10] A. Bergeron, M. Doucet, B. Harnish, M. Suess, L. Marchese, P. Bourqui, N. Desnoyers, M. Legros, L. Guillot, L. Mercier, F. Chateauneuf, "Satellite On-Board Real-Time SAR Processor Prototype," ICSO 2010, Rhodes, Greece, October 2010.

[11] B. W. Krause, J. Buck, C. Ryan, D. Hwang, P. Kondratko, A. Malm, A. Gleason, S. Ashby, "Synthetic Aperture Ladar Flight Demonstration," CLEO: Applications and Technology, Baltimore, Maryland, 2011.
