Preventing image information loss of imaging sensors in case of laser dazzle
Abstract
We present an optical concept for imaging sensor systems, designed to considerably reduce the sensor's image information loss in case of laser dazzle, based on the principle of complementary bands. For this purpose, the sensor system's spectral range is split into several (at least two) spectral channels, each possessing its own imaging sensor. This long-known principle is applied, for example, in high-quality three-sensor color cameras. However, in such camera systems, the spectral separation between the different spectral bands is too poor to prevent complete sensor saturation under illumination with intense laser radiation. We increased the channel separation by orders of magnitude by implementing advanced optical elements. Thus, monochromatic radiation of a dazzle laser mainly influences the dedicated transmitting spectral channel. The other (out-of-band) spectral channels are not affected or, depending on the laser power, only marginally affected. We present our system design as well as a performance evaluation of the sensor concerning laser dazzle.

1. Introduction

Lasers are widespread in modern life and are used for many purposes. However, this also opens the door to their misuse, and it seems judicious to protect oneself. In the realm of laser protection, research on the dazzling of the human eye and of sensors by high-power laser pointers is currently of particular interest.1 A lot of work has been done on the investigation of laser eye dazzle,2–4 including the introduction of the concepts of "maximum dazzle exposure" and "nominal ocular dazzle distance."5–7 Moreover, tests on the degradation of human performance in laser dazzle situations were performed.8,9 Laser dazzling of sensors has been studied intensively, experimentally and theoretically, by various groups.10–16

Laser dazzle protection seems to be a simple issue, but it faces the challenge that lasers are available at any wavelength in the visible spectral range. Classical laser protection measures, such as the absorption or interference filters used in laser eye protection goggles, cannot provide protection for all wavelengths. Current research concepts for wavelength-independent or tunable laser protection measures include liquid crystal Lyot filters,17 augmented reality headsets,18 and the use of pupil-plane phase elements.19,20 In the past years, our research on laser dazzle protection focused on the development of an active laser light suppression concept based on the use of a digital micromirror device in combination with wavelength multiplexing.21–26 For this sensor system, we were able to realize a mean attenuation of laser light of up to 45.5 dB in the visible spectral range.21 However, using quantitative methods to assess the sensor's protection performance,27–30 we found that this protection concept performed well mainly at lower dazzle levels. At larger dazzle levels, i.e., when nearly the complete field of view of the sensor was dazzled, the protection performance degraded strongly due to contrast loss and color distortions.

Some time ago, we became aware of a publication by Svensson et al.,31 which describes various methods for wavelength-independent laser protection. Among them, the concept of "complementary wavelength bands" was mentioned. This concept drew our attention and is described in more detail in Sec. 2. Beyond Svensson's publication, we could not find any information on whether this approach had already been realized and investigated for laser dazzle protection. Therefore, we built a laboratory demonstrator implementing this concept (see Secs. 3 and 4) and assessed its performance concerning laser dazzle protection (see Sec. 5).32

2. Operating Principle

The principle of complementary bands is quite simple. By means of dichroic optical elements, light entering the sensor is spectrally split into spatially separated channels, each containing a dedicated imaging sensor. Subsequently, the different spectral images are fused to reproduce the scene. Figure 1 illustrates the working principles of a standard camera using a single color imaging sensor [Fig. 1(a)] and of a two-channel system based on complementary bands [Fig. 1(b)].

Fig. 1. (a) Operating principle of a standard camera. The camera takes an image formed by the complete visible light spectrum of the scene. (b) Operating principle of a two-channel complementary bands system: a dichroic optical element splits the visible light spectrum into two channels. Two cameras take images of the separated spectral bands. The scene's spectral information is recovered in a fusion image.

The principle of complementary bands is applied, for example, in three-sensor color cameras (3-CCD or 3-CMOS cameras), where three channels are used to capture dedicated images of the red, green, and blue spectral bands to generate high-resolution color images. When monochromatic laser light enters such an optical system, it is directed into the channel whose spectral transmittance corresponds to the laser wavelength. However, the color separation of conventional three-sensor cameras is typically only on the order of 3 orders of magnitude. Thus, high-power laser light may also dazzle the other color channels (those not associated with the laser wavelength), resulting in complete sensor failure. For comparison, we measured the spectral separation of the color channels of a commercial 3-CMOS camera (JAI AP-1600T-PGE); the results are shown in Fig. 2.

Fig. 2. Spectral separation of the 3-CMOS camera JAI AP-1600T-PGE.

To measure the spectral channel separation, we used a broadband light source (Thorlabs SLS201/M) and a set of narrow bandpass filters (FWHM 10 nm), covering a wavelength range of 470 to 710 nm. The light was directed to the camera, and the signals (maximum pixel value) of the three color channels were measured (S_red, S_green, and S_blue). The plot of Fig. 2 does not show the camera's spectral responsivity but the pairwise ratio of the signals on a logarithmic scale. For example, the spectral separation of the green and red channels is calculated as SS_green,red = |10 · log10(S_green / S_red)| (in dB). Additionally, we also plotted the envelope of all three curves, which shows the maximum spectral separation of the camera system as a function of wavelength. This is the most important curve in the sense of vulnerability to laser dazzle. From the envelope curve, we can see that the spectral separation of this particular 3-CMOS camera is 30 to 40 dB, except for wavelengths above 690 nm.
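As an illustration of this analysis, the following sketch computes the pairwise channel separations and their envelope in our own NumPy formulation; the signal values are made up for illustration and are not the data of Fig. 2:

```python
import numpy as np

# Hypothetical maximum pixel values per bandpass-filter wavelength
# (illustrative numbers only).
wavelengths = np.array([470, 530, 590, 650, 710])      # nm
s_blue  = np.array([250.0, 40.0, 1.5, 1.0, 1.2])
s_green = np.array([30.0, 240.0, 200.0, 3.0, 2.5])
s_red   = np.array([2.0, 15.0, 180.0, 250.0, 240.0])

def separation_db(s_a, s_b):
    """Pairwise spectral separation: SS = |10 * log10(S_a / S_b)| in dB."""
    return np.abs(10.0 * np.log10(s_a / s_b))

ss_green_red  = separation_db(s_green, s_red)
ss_green_blue = separation_db(s_green, s_blue)
ss_red_blue   = separation_db(s_red, s_blue)

# Envelope: the maximum separation at each wavelength, i.e., the curve
# that matters most for vulnerability to laser dazzle.
envelope = np.max(np.stack([ss_green_red, ss_green_blue, ss_red_blue]), axis=0)
```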

Although the spectral separation of this 3-CMOS camera is quite good, the separation between the channels must be increased further to avoid dazzling of all three imaging sensors when the camera encounters high-power laser sources. When the spectral separation of the bands is chosen appropriately, (monochromatic) laser light will only jam the corresponding spectral transmission channel, while the fusion image still delivers scene information generated from the complementary bands.

The simplest way of generating a fusion image is to calculate the average of all channels. A drawback of this method arises under laser dazzle, since the image contrast in the dazzled part of the image would be reduced. Therefore, we implemented an improved image fusion algorithm to maintain the image contrast as well as possible. Our approach uses an image analysis algorithm that detects dazzled spots in the single-channel images and neglects these overexposed pixels in the calculation of the average. Thus, the overexposed spots do not contribute to the fusion image, and the result is a dazzle-free image.

The example shown in Fig. 1(b) represents only one specific layout, namely a two-channel complementary bands system; classical three-sensor cameras have three channels. An n-channel complementary bands system can only be dazzled completely when illuminated simultaneously with n different laser wavelengths that match the spectral passbands of the system's channels. This means that a larger number of channels makes the system less vulnerable to monochromatic laser dazzle, but at the cost of complexity and larger dimensions. For our laboratory demonstrator, we decided to implement a system with three channels.

The approach of complementary bands does not represent a protection measure in the classical sense, since a dazzle laser is still able to jam or damage one of the imaging sensors. However, since only the image quality is reduced, the output of the system still allows an observer to fulfill his or her task.

3. System Design

We describe a laboratory demonstrator based on the concept of complementary bands with three optical channels (i.e., three spectral bands) using standard opto-mechanical and optical elements. Figure 3(a) shows a photograph of the demonstrator (camera lens dismounted) and Fig. 3(b) shows a sketch of the optical layout. The incoming light first passes a Keplerian telescope formed by a camera lens (f = 35 mm, f/# = 2.0) and the internal lens L1 (f = 28 mm, f/# = 2.0). Subsequently, the light is spectrally split into three different optical channels by means of two dichroic beam splitters (DBS500 and DBS600): blue channel (400 to 500 nm), green channel (500 to 600 nm), and red channel (600 to 700 nm). In addition to these dichroic beam splitters, appropriate shortpass (SPxxx) and longpass (LPxxx) filters in each optical channel ensure that out-of-band laser radiation is effectively attenuated. Finally, the light in each channel is focused by a second internal lens L2 (f = 25 mm, f/# = 1.4) onto a dedicated imaging sensor (754 × 480 pixels, 6-μm pitch).

Fig. 3. (a) Photograph of the three-band sensor (camera lens dismounted) and (b) schematic diagram of the optical layout.

Note that, in contrast to the schematic diagram of Fig. 3(b), which represents the functional principle of the sensor system, the shortpass filter SP700 is placed in front of lens L1 in our actual system. Thus, SP700 additionally acts as an IR cut-off filter for the whole system. The longpass filter LP400 (drawn in faded color in the schematic diagram) serves only to illustrate the principle; we recognized that LP400 is not necessary due to the transmittance characteristics of the other dichroic elements.

Table 1 lists the optical elements used to build up the three-band sensor along with the transmission bands of the dichroic/filter elements. To obtain spectrally well-separated channels, the edges of the shortpass and longpass filters must be very steep, and their edge wavelengths cannot be chosen to lie exactly at 500 or 600 nm. Otherwise, the optical density at the crossover point of the respective filter curves would not be high enough to attenuate laser radiation effectively at these wavelengths. The filters were chosen such that out-of-band laser radiation is attenuated by 6 orders of magnitude.

Table 1. Optical elements used to implement the three-band sensor.

Denotation | Optical element | Transmission band
External camera lens | Schneider-Kreuznach Apo-Xenoplan 2.0/35-2001 | —
Internal lens L1 | Schneider-Kreuznach Xenoplan 2.0/28 | —
Dichroic beam splitter DBS500 | Semrock FF484-FDi01-25x36 | 492 to 950 nm (T_avg > 93%)
Dichroic beam splitter DBS600 | Semrock FF580-FDi01-25x36 | 591 to 950 nm (T_avg > 93%)
Shortpass filter SP500 | Semrock FF01-492/SP-25 | 400 to 480 nm (T_avg > 90%)
Longpass filter LP500 | Semrock BLP01-514R-25 | 529 to 900 nm (T_avg > 93%)
Shortpass filter SP600 | Semrock FF01-612/SP-25 | 509 to 591 nm (T_avg > 90%)
Longpass filter LP600 | Edmund Optics 84746 | 635 to 1650 nm
Shortpass filter SP700 | Edmund Optics 84714 | 400 to 685 nm
Internal lens L2 | Edmund Optics 59871 | —
Imaging sensor | VRmagic VRmMS-12 (using an Aptina MT9V024 CMOS imaging sensor) | —

The external camera lens and the two internal lenses L1 and L2 yield an optical system with an effective focal length of 30 mm and a field of view of 8.7 deg × 5.6 deg. The purpose of the intermediate focal plane in front of the internal lens L1 is to provide space for an (optional) element to protect the sensor system against laser damage. Such a protection element could be, for example, an optical power limiter,33,34 which has to be located in a focal plane for optimal performance.
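As a plausibility check (our own back-of-the-envelope calculation, using only values stated in the text), the field of view follows from the effective focal length and the sensor geometry:

```python
import math

f_eff = 30.0               # mm, effective focal length stated in the text
pixel_pitch = 6e-3         # mm (6-μm pitch)
n_h, n_v = 754, 480        # sensor resolution in pixels

half_width = n_h * pixel_pitch / 2.0    # 2.262 mm
half_height = n_v * pixel_pitch / 2.0   # 1.440 mm

fov_h = 2.0 * math.degrees(math.atan(half_width / f_eff))   # ~8.6 deg
fov_v = 2.0 * math.degrees(math.atan(half_height / f_eff))  # ~5.5 deg
# Consistent, within rounding, with the stated 8.7 deg x 5.6 deg.
```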

An external computer powers and controls the sensor system via a USB connection. The computer also retrieves the three sensor images and computes a fusion image (see Sec. 4).

4. Image Fusion

For the generation of the fusion image, a wide variety of methods could be used. Since our system has three channels corresponding to red, green, and blue light, an obvious possibility would be to generate an RGB image from the three single images. However, as we will see in Sec. 5.2, this method results in a loss of contrast in the fusion image when laser dazzle occurs. We therefore decided to use an image fusion algorithm that produces a monochrome output image.

The image fusion algorithm is based on a simple calculation of the mean signal value of each individual pixel across the three single images. However, prior to the calculation of the mean values, the algorithm examines whether pixels in one of the spectral bands are saturated by narrowband radiation. We define a pixel as saturated when its value exceeds the (arbitrarily chosen) threshold of 250 for 8-bit images (maximum pixel value 255). In that case, the overexposed pixels are neglected in the calculation of the mean value. This is shown in Fig. 4.
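A minimal sketch of this fusion rule in NumPy (our own formulation; the function name and array layout are ours, while the threshold of 250 and the broadband exception for pixels saturated in all three channels follow the description in this section):

```python
import numpy as np

def fuse_channels(blue, green, red, threshold=250):
    """Fuse three 8-bit channel images into one monochrome image.

    Pixels above the saturation threshold are excluded from the mean,
    unless all three channels are saturated at that pixel, which is
    attributed to a broadband source such as the sun.
    """
    stack = np.stack([blue, green, red]).astype(np.float64)  # (3, H, W)
    saturated = stack > threshold
    broadband = saturated.all(axis=0)            # saturated in all channels
    weights = np.where(saturated & ~broadband, 0.0, 1.0)
    fused = (stack * weights).sum(axis=0) / weights.sum(axis=0)
    return np.clip(fused, 0, 255).astype(np.uint8)
```

Note that the per-pixel weight sum can never be zero, since pixels saturated in all three channels are deliberately kept in the average.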

Fig. 4. Images taken by the three-band sensor while dazzled by a laser (laser wavelength λ = 640 nm, radiant exposure H = 0.02 μJ/cm² at the optics entrance): (a) blue, (b) green, and (c) red channels, and (d) fusion image.

Figures 4(a)–4(c) show the three different spectral images taken by the three-band sensor (B: blue channel, G: green channel, and R: red channel) of a test chart containing triangles. For test purposes, the sensor system was illuminated with laser radiation (wavelength λ = 640 nm; radiant exposure H = 0.02 μJ/cm² at the optics entrance), resulting in a dazzle spot in the red channel. For the fused image in Fig. 4(d), the overexposed pixels of the red channel were not taken into account in the calculation of the mean value. Thus, in the center part of the fused image, the mean value was calculated from just the two pixel values of the blue and green channels. Therefore, the triangles in the center part of the fused image are clearly visible. If corresponding pixels are saturated in all three channels, we attribute this to a broadband light source (e.g., the sun), and in this case the saturated pixels are not neglected.

For an observer, the occurrence of laser dazzle is easily perceptible in the fused image due to the differing contrast within the image. In Fig. 4(d), the mean pixel values in the center of the image comprise only two channel values, whereas the mean values of the residual area comprise three. The contrast differences result from differences in the channels' responsivity. The signal of a particular channel depends on the corresponding optics transmittance and the integration time of the dedicated sensor. Since the optical elements in the three channels cover different spectral regions, the differences in transmittance and the wavelength-dependent sensor responsivity have to be compensated by adjusting the integration times of the imaging sensors accordingly. Ideally, all channels would deliver the same signal when the sensor system views a homogeneous background. However, the integration times could not be set perfectly, especially due to different vignetting in the three channels. For the measurements shown in Fig. 4, the integration times were set to 6.5, 9.0, and 8.0 ms for the red, green, and blue channels, respectively.

5. System Performance

The performance of the three-band sensor was investigated using different methods. First, we performed theoretical calculations regarding the channel transmittance and spectral separation. These calculations were compared with the results of measurements (see Sec. 5.1). Second, the sensor system was tested in a field trial (see Sec. 5.2). Third, the three-band sensor was assessed quantitatively using various methods (see Sec. 5.3).

5.1. Optics Transmittance and Spectral Separation

To estimate the system's performance, we first calculated the spectral transmittance of the three optical channels. For the calculation, we used transmittance data of the optical elements of Table 1 as specified by the manufacturers. For data not available in digital form, we digitized the respective transmittance curves provided by the manufacturer. Figure 5 shows the calculated transmittance (ordinate presented in diabatic scale) as well as the corresponding optical density curves. The calculations were carried out for unpolarized as well as for polarized light, since the transmittance/reflection of the beam splitters is polarization dependent. The different colors (blue, green, and red) in the plots correspond to the different channels, whereas the line styles of the curves (solid, dashed, and dotted) correspond to the polarization state of the incident light (unpolarized, s-polarized, and p-polarized, respectively). Additionally, the range of values between minimum and maximum transmittance/optical density is highlighted by colored bands in the plots. From the optical density plots, we learn that unpolarized light is a reasonable approximation for the worst-case situation, i.e., the occurrence of the lowest optical densities.
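Conceptually, the calculation multiplies the spectral transmittance curves of all elements in a channel and converts the result to optical density. A minimal sketch, assuming all curves have already been interpolated onto a common wavelength grid (the function names are ours):

```python
import numpy as np

def channel_transmittance(element_curves):
    """Total channel transmittance: the product of the spectral
    transmittance curves of all optical elements in the channel.
    Each curve is a NumPy array sampled on a common wavelength grid."""
    t_total = np.ones_like(element_curves[0])
    for t in element_curves:
        t_total *= t
    return t_total

def optical_density(t_total, floor=1e-12):
    """OD = -log10(T); a small floor avoids log10(0) for fully blocked bands."""
    return -np.log10(np.maximum(t_total, floor))
```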

Fig. 5. Calculated optical properties of the three-band sensor: (a) transmittance and (b) optical density as a function of wavelength for unpolarized and polarized light. The passbands of the three channels are highlighted by a hatched, colored background in the transmittance plot.

Defining the passbands of the channels as the spectral range where the transmittance is >50% of the maximum channel transmittance, we get mean transmittance values for unpolarized light of 0.74, 0.75, and 0.73 for the blue, green, and red channels, respectively. The passbands are marked in the transmittance plot of Fig. 5 by hatched, colored backgrounds. The passbands calculated this way cover wavelength ranges of 391 to 481 nm, 525 to 576 nm, and 629 to 700 nm.
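A sketch of this passband criterion (our own formulation; it assumes a single contiguous passband sampled on a wavelength grid):

```python
import numpy as np

def passband(wavelengths, t_channel):
    """Passband per the 50% criterion: the spectral range where the
    transmittance exceeds half the channel maximum. Returns the band
    limits and the mean in-band transmittance."""
    in_band = t_channel > 0.5 * t_channel.max()
    lam = wavelengths[in_band]
    return lam.min(), lam.max(), t_channel[in_band].mean()
```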

For comparison, since all the calculations were performed using only nominal data released by the manufacturers, we also measured the transmittance of the three-band sensor's single channels as a function of wavelength. For this purpose, we used a supercontinuum light source (Koheras SuperK Extreme) equipped with an acousto-optical tunable filter (AOTF). By means of the AOTF, we generated narrowband radiation adjustable in the spectral range from 470 to 725 nm in steps of 5 nm. Since the theoretical calculations suggest optical densities of OD > 6, the AOTF output was spectrally cleaned with an appropriate bandpass filter (FWHM 10 nm) for each measurement point; otherwise, the broadband background still present in the AOTF output would prevent the measurement of such high optical densities. The output of the AOTF is horizontally polarized (p-polarization).

Additionally, we made measurements with a multiwavelength laser source (Toptica iChrome MLE-L) at four specific wavelengths: 488, 515, 561, and 640 nm. The laser output was linearly polarized and was adjusted to horizontal (p) or vertical (s) polarization using a half-wave plate.

We measured the radiation power in front of the optics entrance as well as at the position of the imaging sensor using an optical power meter and calculated the corresponding channel transmittance/optical density. Figure 6 shows the results of the calculations compared to the measurements. The data points represent the measured values for the supercontinuum (SC) source and the laser source. The theoretical data are shown only by colored bands corresponding to the range of values between maximum and minimum transmittance/optical density, just as in Fig. 5.

Fig. 6. Calculated (colored bands) and measured (data points) optical properties of the three-band sensor: (a) transmittance and (b) optical density as a function of wavelength.

From the plots, the following conclusions can be drawn:

  • The real channel transmittance is lower than the calculated one. The estimated passband transmittance obtained from the experimental data of the SC source (using the same definition as stated above) is 0.53, 0.64, and 0.68 for the blue, green, and red channels, respectively. The value for the blue channel should be treated with caution, since only three data points contribute to the estimate of 0.53.

  • The measured curves run smoother than the theoretical lines.

  • For the green and red channels, the measurements correspond roughly to the calculated values. However, for the green channel, the shoulder in the theoretical curve at wavelengths around 500 nm is not present in the measurement data.

  • In the wavelength range between 470 and 695 nm, at least one channel has an optical density >6, which means that the spectral separation is >60 dB. For the blue channel, we measured a considerably higher optical density for wavelengths outside the passband (OD_meas ≈ 8 instead of OD_calc ≈ 6) than expected from the theoretical calculations.

5.2. Field Trial

The practical applicability of the concept of complementary bands was tested in a field trial. For this, we used two laser sources at the wavelengths λ = 532 nm (Z-Laser ZM18GF024) and λ = 656 nm (Z-Laser ZM18RF058). The laser power and full-angle beam divergence (1/e²) of the two laser sources were 39 mW and 0.34 mrad, and 4.5 mW and 0.26 mrad, for the wavelengths of 532 and 656 nm, respectively. The sensor system was illuminated successively with both laser sources while observing a scene comprising a hut on a meadow.

Figure 7 shows examples of images taken with the sensor system during the test: Fig. 7(a) for the laser wavelength 532 nm and Fig. 7(b) for the laser wavelength 656 nm. The three images of the imaging sensors are labeled B, G, and R for the blue, green, and red channels, respectively. Furthermore, three fusion images based on different approaches are shown: the image labeled M shows a simple mean image of all three channels, and the image labeled FM represents a filtered mean image. The filtered mean image FM was calculated as described in Sec. 4, taking into account only nonsaturated pixel values. The image labeled RGB was generated by creating an RGB color image from the images of the red, green, and blue channels.

Fig. 7. Example images taken with the three-band sensor during a field trial: (a) laser wavelength λ = 532 nm and (b) laser wavelength λ = 656 nm. B: image of the blue channel, G: image of the green channel, R: image of the red channel, M: mean image, FM: filtered mean image, and RGB: color image.

Looking at the images of Fig. 7, we can conclude that the three-band sensor is indeed capable of preserving the scene's information in case of laser dazzle. The images show situations where one of the channels is (nearly) completely jammed; nevertheless, an operator can still interpret the scene in the fusion image. A simple mean image exhibits somewhat decreased contrast, whereas the filtered mean image shows a high potential for preserving the image contrast. While this fusion algorithm works very well for the laser wavelength of 656 nm [see Fig. 7(b)], an unexpected effect occurred for the laser wavelength of 532 nm [see Fig. 7(a)]: the filtering of saturated pixels seems to work only for about half the image. The reason is that the pixel values of the green channel's image are below the threshold value of 250 on the left-hand side of the image. Therefore, these pixels are not considered overexposed by the algorithm. This seems to be a specific problem of the imaging sensor used for the green channel and could be solved by replacing this imaging sensor or by adjusting the threshold value of the fusion algorithm appropriately.

Additionally, Fig. 7 shows RGB color images generated using the three images of the red, green, and blue channels as color components. As we can see, for a human observer the perceptible contrast of such a fusion image is lower than that of the simple mean image.

5.3. Quantitative Assessment of System Performance

In recent years, we have also focused our research on the quantitative assessment of protection measures for imaging sensors against laser dazzle.27–30 Three different methods were studied:

  • Overexposed pixel counting (OPC): This method was established by Benoist and Schleijpen.12 The diameter of the dazzle spot is estimated by counting the number of overexposed pixels in a sensor image. From the number of overexposed pixels, the diameter of a disc containing the same number of pixels is calculated. For a protected sensor, a smaller diameter is expected at a specific laser irradiance compared to an unprotected sensor.

  • Pattern recognition: Based on earlier work of Durécu et al.,13,14 we use pattern recognition methods to assess the amount of information loss in dazzled sensor images. The sensor observes a test chart containing triangular test patterns whose orientations have to be recognized [triangle orientation discrimination (TOD)]. The triangles are located on concentric rings with eccentricities of 1 deg, 2 deg, 3 deg, 4 deg, and 5 deg and have sizes ranging from 0.1 deg to 0.5 deg [see Fig. 8(a)].

  • Structural similarity (SSIM) index: Another method to quantify laser dazzle is to calculate the SSIM index.27–30 The SSIM index is a metric for measuring the quality of an image by comparing it to a distortion-free reference image.35 The metric is based on the assumption that the human visual system is designed to recognize structures in images, and it estimates to what extent two images exhibit the same structures. Usually, SSIM is used to assess the quality of image compression algorithms. In our case, we use SSIM to compare images taken with a sensor dazzled by laser light to images taken without laser dazzling. Thus, we gain a measure of how much of an image's information can be retrieved when particular protection measures are applied. For our measurements, we used a highly structured, fractal test chart according to the work of Landeau36 [see Fig. 8(b)].

Fig. 8. Layout of (a) a TOD test chart and (b) an SSIM test chart for the quantitative assessment of laser protection performance. (c) Sketch of the experimental setup.

In this paper, we will not go into further detail on these methods and refer the reader to the aforementioned references; a minimal sketch of the OPC and SSIM computations is given below. In earlier publications, we already presented various measurements to quantitatively assess the three-band sensor.27,28 We recognized that the OPC method is not applicable to the three-band sensor, since the image fusion algorithm filters out the overexposed pixels. Therefore, no overexposed pixels occur in the fusion image, which apparently results in a perfect protection performance. In this paper, we summarize the results for the three-band sensor using the TOD and SSIM methods.
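The sketch below illustrates the two metrics as we read them from the references (the helper names are ours; the SSIM computation uses scikit-image's implementation of the Wang et al.35 metric and assumes 8-bit grayscale NumPy images):

```python
import numpy as np
from skimage.metrics import structural_similarity

def opc_equivalent_diameter(image, threshold=250):
    """OPC metric: diameter (in pixels) of a disc containing the same
    number of pixels as the overexposed area, d = 2 * sqrt(N / pi)."""
    n_overexposed = int(np.count_nonzero(image > threshold))
    return 2.0 * np.sqrt(n_overexposed / np.pi)

def ssim_index(dazzled, reference):
    """SSIM of a dazzled image against a dazzle-free reference image."""
    return structural_similarity(dazzled, reference, data_range=255)
```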

A sketch of the experimental setup is shown in Fig. 8(c). Dazzling was performed using a multiwavelength laser source (Toptica iChrome MLE-L) offering four different laser wavelengths (488, 515, 561, and 640 nm). The three-band sensor observed the test chart from a distance of 5.14 m. A hole in the center of the test chart allowed illuminating the three-band sensor with laser radiation along the optical axis. Due to the spectral characteristics of our three-band sensor, only the two longer wavelengths, 561 and 640 nm, caused a noticeable effect, since the shorter wavelengths, 488 and 515 nm, lie outside the sensor's passbands. Therefore, only these two wavelengths could be used for the quantitative assessment of system performance.

For each wavelength, the three-band sensor was illuminated with different irradiances, and sensor images were taken for each irradiance level. Subsequently, we analyzed the images according to the TOD and SSIM methods.

5.3.1. Triangle orientation discrimination method

Using the TOD method, we estimated the fraction of correctly recognized triangle orientations (denoted as "fraction correct") in the dazzled sensor images. For this, we used a correlation-based template-matching algorithm to recognize the equilateral triangles in the images and to estimate their orientation. Figure 9(a) shows an example image taken by the three-band sensor, showing the test chart with triangles of size 0.1 deg. All triangles whose orientation was recognized correctly are marked by a circle, where the color of the circle indicates the orientation. Triangles whose orientation could not be recognized correctly are marked by squares. For each value of eccentricity, triangle size, and irradiance, the fraction correct was calculated from the image analysis results.

Fig. 9. (a) Image of the test chart taken with the three-band sensor. Triangles whose orientation was recognized correctly are marked with circles; triangles whose orientation was recognized incorrectly are marked with squares. The image corresponds to the data points marked in the plot by green circles/arrows. (b) Fraction correct as a function of radiant exposure for the three-band sensor at a laser wavelength of 561 nm.

Figure 9(b) shows a plot of the fraction correct for laser illumination at a wavelength of λ = 561 nm as a function of radiant exposure (irradiance integrated over exposure time). Only for the smallest triangles (sizes 0.1 deg and 0.2 deg) is a noticeable effect visible, particularly at an eccentricity of 4 deg. The results for the wavelength of 640 nm are not shown here, since they are very similar to those for the 561-nm laser wavelength.

In the plot of Fig. 9(b), some data points for the triangle size of 0.1 deg are marked by green circles/arrows. These data points represent the worst orientation-recognition performance of the three-band sensor. Figure 9(a) shows the corresponding camera image. Looking at this image, we can see that the lowest recognition performance occurs in areas where the camera image is not overexposed (no filtering active) but the sensor is still disturbed by dazzle light. Better results may be achieved by adapting the filtering algorithm appropriately. According to the TOD method, the overall system performance regarding laser dazzle is promising, since the fraction correct does not drop substantially under laser dazzle.

5.3.2. Structural similarity method

In addition to the TOD method, we also calculated the SSIM index of the sensor images for different values of laser irradiance. As the (undisturbed) reference image, we used a sensor image of the test chart taken without laser illumination. In Fig. 10, the SSIM value of the three-band sensor's fusion image is plotted as a function of irradiance (solid lines) for the laser wavelengths of 488, 515, 561, and 640 nm. The result is similar to the TOD results presented above. For the laser wavelengths of 561 and 640 nm, the value of SSIM is quite high for all values of irradiance, but it shows a drop at irradiances around 7 μW/cm². This finding reflects the results of the TOD analysis. For the laser wavelengths of 488 and 515 nm, there is no noticeable effect; the SSIM value is >0.99.

Fig. 10. Results of the SSIM analysis of the three-band sensor, a monochrome CMOS camera, and a color CMOS camera.

For purposes of comparison, we also plotted the SSIM results for two standard cameras: a monochrome CMOS camera (VRmagic VRmC-12/BW-Pro, dotted lines in the plot) and a color CMOS camera (VRmagic VRmC-12/C-Pro, dashed lines in the plot). Both CMOS cameras utilize the same imaging sensor (Aptina MT9V024) as our three-band sensor; the imaging sensor of the color camera is additionally equipped with a Bayer color filter array. For the measurements, we used the same camera lens for the two standard cameras as for our three-band sensor, resulting in a similar field of view for all systems. The differences in optics transmittance, caused by the dichroic elements and bandpass filters of the three-band sensor or by the RGB filter array of the color CMOS camera, were compensated as well as possible by adjusting the individual exposure times. Thus, we consider the comparison of the SSIM curves in Fig. 10 appropriate.

Generally, the comparison of SSIM curves of different devices has to be treated with caution, since the devices may feature different parameters (pixel size, number of pixels, field of view, optics transmittance, etc.) that influence the results of the SSIM calculation. In the case of the color CMOS camera, the SSIM value was computed independently for the three color channels; subsequently, we calculated the mean SSIM value of the three channels.

From Fig. 10, we can see that, compared to the three-band sensor, the measured SSIM values of both standard cameras drop markedly at higher irradiances. The SSIM curves decrease with increasing irradiance until the value saturates when the whole image is overexposed. For the three-band sensor, in contrast, the SSIM curves remain at a high value. For the color CMOS camera, the decrease of the SSIM value is less pronounced than for the monochrome CMOS camera, since the Bayer color filter array protects the imaging sensor against laser dazzle to a certain degree.

6. Future Work

Although the results presented here show the practical applicability of the principle of complementary bands, there is potential for improvement of the optical layout. Our three-band sensor has three color channels (red, green, and blue), which naturally lend themselves to a color fusion image. Nevertheless, we decided to generate a monochrome fusion image due to the loss of perceptible contrast of a color fusion image in case of laser dazzle. It would be desirable to enhance the principle of complementary bands such that the sensor can offer a color fusion image that is (nearly) undistorted in case of laser dazzle.

We started to develop such a sensor using multiband optical elements (beam splitters and bandpass filters) and set up a first laboratory demonstrator as a proof of concept. Since an extensive investigation of the performance remains to be done, the new sensor concept will be presented in detail in a subsequent publication.

7. Conclusions

As a means of protection/hardening against laser dazzle, we built and assessed the performance of a three-band sensor based on the concept of complementary bands. By choosing appropriate optical elements, the spectral separation of the channels is on the order of 6 orders of magnitude (60 dB) or larger, ensuring only very low cross talk between the channels and thus preventing complete overexposure or image saturation of the sensor. Overall, the concept turned out to be a very good protection measure for imaging sensors against high-power laser pointers.

In more detail, the images of the three channels are fused into a single image to be presented to the sensor operator. The image fusion is based on the calculation of the mean value of the three channels; however, overexposed pixels caused by narrowband light irradiation (laser) are identified and not taken into account in the calculation. Thus, the image presented to the operator exhibits no loss of contrast in the case of laser dazzle. However, the image fusion algorithm still has potential for improvement.

From a systems point of view, one might argue that our multichannel approach suffers from greater volume, weight, and cost compared to single-sensor cameras. This might be an issue, for example, for small unmanned aerial vehicles. Furthermore, the accurate alignment of the optics and the imaging sensors to match their fields of view is more complex. However, one must consider that we address laser dazzle protection, and a protection measure typically involves some drawback. For example, a ballistic vest can protect its wearer from bullets or fragments but burdens him/her with additional weight and reduced mobility. Thus, it has to be considered whether there is a significant laser threat that warrants the use of protection measures (and their adverse effects) or not.

Our three-band sensor has a higher spectral separation of the channels compared to standard single-sensor or three-sensor color cameras. To achieve this, the spectral bands had to be chosen narrower, which leads to some signal loss when a natural scene illuminated by broadband light (e.g., daylight) is observed. To compensate for such a signal loss, the system operator or the sensor's automatic exposure control would increase the signal, e.g., by increasing the exposure time. As a side effect of the increased exposure time, the susceptibility to laser dazzling would also increase. Nevertheless, taking all sensor channels together, our laboratory demonstrator outperforms standard camera sensors in terms of laser dazzle suppression, as demonstrated by the SSIM measurements presented in Sec. 5.3.

The advantage of the concept of complementary bands is its simplicity and the fact that it can be realized with readily available components. The concept is not a protection measure in the strict sense, since laser light will still dazzle (at least) one imaging sensor. Nevertheless, the resulting fusion image prevents image information loss in case of laser dazzle.

To circumvent the protection measure, a laser dazzler emitting three laser wavelengths corresponding to the three passbands of the sensor would be necessary. In our laboratory demonstrator, the three passbands match the blue, green, and red wavebands of the visible spectral range. Since the passbands can be chosen arbitrarily, it would not be obvious for an aggressor which laser wavelengths to select.

We investigated the applicability of the concept of complementary bands for laser protection/hardening both in field trials and in laboratory experiments; the results are promising. At the moment, the fusion image is still monochrome. We will focus our future work on providing colored fusion images to the sensor operator.

References

1. B. Eberle, D. Forster and G. Ritt, "Visible laser dazzle," Proc. SPIE 9989, 99890J (2016). https://doi.org/10.1117/12.2241041
2. C. A. Williamson, "Simple computer visualization of laser eye dazzle," J. Laser Appl. 28, 012003 (2016). https://doi.org/10.2351/1.4932620
3. C. A. Williamson et al., "Measuring the contribution of atmospheric scatter to laser eye dazzle," Appl. Opt. 54(25), 7567–7574 (2015). https://doi.org/10.1364/AO.54.007567
4. J. M. P. Coelho, J. Freitas and C. A. Williamson, "Optical eye simulator for laser dazzle events," Appl. Opt. 55(9), 2240–2251 (2016). https://doi.org/10.1364/AO.55.002240
5. C. A. Williamson and L. N. McLin, "Nominal ocular dazzle distance (NODD)," Appl. Opt. 54(7), 1564–1572 (2015). https://doi.org/10.1364/AO.54.001564
6. C. A. Williamson et al., "Wavelength and ambient luminance dependence of laser eye dazzle," Appl. Opt. 56(29), 8135–8147 (2017). https://doi.org/10.1364/AO.56.008135
7. C. A. Williamson and L. N. McLin, "Determination of a laser eye dazzle safety framework," J. Laser Appl. 30, 032010 (2018). https://doi.org/10.2351/1.5029384
8. O. Steinvall et al., "Laser dazzling impacts on car driver performance," Proc. SPIE 8898, 88980H (2013). https://doi.org/10.1117/12.2028505
9. M. Vandewal et al., "Evaluation of laser dazzling induced task performance degradation," Proc. SPIE 10797, 10797E (2018). https://doi.org/10.1117/12.2325245
10. R. H. M. A. Schleijpen et al., "Laser dazzling of focal plane array cameras," Proc. SPIE 6543, 65431B (2007). https://doi.org/10.1117/12.718602
11. R. H. M. A. Schleijpen et al., "Laser dazzling of focal plane array cameras," Proc. SPIE 6738, 67380O (2007). https://doi.org/10.1117/12.747009
12. K. W. Benoist and R. H. M. A. Schleijpen, "Modeling of the over-exposed pixel area of CCD cameras caused by laser dazzling," Proc. SPIE 9251, 92510H (2014). https://doi.org/10.1117/12.2066305
13. A. Durécu et al., "Assessment of laser-dazzling effects on TV-cameras by means of pattern recognition algorithms," Proc. SPIE 6738, 67380J (2007). https://doi.org/10.1117/12.737264
14. A. Durécu, O. Vasseur and P. Bourdon, "Quantitative assessment of laser-dazzling effects on a CCD-camera through pattern-recognition-algorithms performance measurements," Proc. SPIE 7483, 74830N (2009). https://doi.org/10.1117/12.833975
15. C. N. Santos et al., "Visible and near-infrared laser dazzling of CCD and CMOS cameras," Proc. SPIE 10797, 107970S (2018). https://doi.org/10.1117/12.2325631
16. T. Özbilgin and A. Yeniay, "Laser dazzling analysis of camera sensors," Proc. SPIE 10797, 107970Q (2018). https://doi.org/10.1117/12.2325393
17. E. I. L. Jull and H. F. Gleeson, "Tuneable and switchable liquid crystal laser protection system," Appl. Opt. 56(29), 8061–8066 (2017). https://doi.org/10.1364/AO.56.008061
18. F. Quercioli, "Beyond laser safety glasses: augmented reality in optics laboratories," Appl. Opt. 56(4), 1148–1150 (2017). https://doi.org/10.1364/AO.56.001148
19. J. H. Wirth, A. T. Watnik and G. A. Swartzlander, "Experimental observations of a laser suppression imaging system using pupil-plane phase elements," Appl. Opt. 56(33), 9205–9211 (2017). https://doi.org/10.1364/AO.56.009205
20. G. J. Ruane, A. T. Watnik and G. A. Swartzlander, "Reducing the risk of laser damage in a focal plane array using linear pupil-plane phase elements," Appl. Opt. 54(2), 210–218 (2015). https://doi.org/10.1364/AO.54.000210
21. G. Ritt and B. Eberle, "Automatic laser glare suppression in electro-optical sensors," Sensors 15(1), 792–802 (2015). https://doi.org/10.3390/s150100792
22. G. Ritt and B. Eberle, "Automatic suppression of intense monochromatic light in electro-optical sensors," Sensors 12(10), 14113–14128 (2012). https://doi.org/10.3390/s121014113
23. G. Ritt and B. Eberle, "Electro-optical sensor with automatic suppression of laser dazzling," Proc. SPIE 8541, 85410P (2012). https://doi.org/10.1117/12.971186
24. G. Ritt and B. Eberle, "Protection concepts for optronical sensors against laser radiation," Proc. SPIE 8185, 81850G (2011). https://doi.org/10.1117/12.897700
25. G. Ritt and B. Eberle, "Electro-optical sensor with spatial and spectral filtering capability," Appl. Opt. 50(21), 3847–3853 (2011). https://doi.org/10.1364/AO.50.003847
26. G. Ritt and B. Eberle, "Sensor protection against laser dazzling," Proc. SPIE 7834, 783404 (2010). https://doi.org/10.1117/12.864960
27. G. Ritt and B. Eberle, "Evaluation of protection measures against laser dazzling for imaging sensors," Opt. Eng. 56(3), 033108 (2017). https://doi.org/10.1117/1.OE.56.3.033108
28. G. Ritt and B. Eberle, "Evaluation of protection measures against laser dazzling for imaging sensors," Proc. SPIE 9989, 99890N (2016). https://doi.org/10.1117/12.2241040
29. G. Ritt et al., "Protection performance evaluation regarding imaging sensors hardened against laser dazzling," Opt. Eng. 54(5), 053106 (2015). https://doi.org/10.1117/1.OE.54.5.053106
30. G. Ritt et al., "Protection performance evaluation regarding imaging sensors hardened against laser dazzling," Proc. SPIE 9249, 924908 (2014). https://doi.org/10.1117/12.2067147
31. S. Svensson et al., "Countering laser pointer threats to road safety," Proc. SPIE 6402, 640207 (2006). https://doi.org/10.1117/12.689057
32. G. Ritt, B. Schwarz and B. Eberle, "Preventing image information loss of imaging sensors in case of laser dazzle," Proc. SPIE 10797, 107970R (2018). https://doi.org/10.1117/12.2325307
33. G. Ritt, S. Dengler and B. Eberle, "Protection of optical systems against laser radiation," Proc. SPIE 7481, 74810U (2009). https://doi.org/10.1117/12.829963
34. G. Ritt, D. Walter and B. Eberle, "Research on laser protection: an overview of 20 years of activities at Fraunhofer IOSB," Proc. SPIE 8896, 88960G (2013). https://doi.org/10.1117/12.2029083
35. Z. Wang et al., "Image quality assessment: from error visibility to structural similarity," IEEE Trans. Image Process. 13(4), 600–612 (2004). https://doi.org/10.1109/TIP.2003.819861
36. S. Landeau, "Evaluation of super-resolution imager with binary fractal test target," Proc. SPIE 9249, 924909 (2014). https://doi.org/10.1117/12.2067499

Biography

Gunnar Ritt is a research associate at Fraunhofer Institute of Optronics, System Technologies and Image Exploitation (IOSB), Ettlingen, Germany. He received his diploma and PhD degrees in physics from the University of Tübingen, Germany, in 1999 and 2007, respectively. His main research focus is on laser protection.

Bastian Schwarz has been a research scientist at Fraunhofer IOSB, Ettlingen, Germany, since 2013. He graduated in physics from the University of Freiburg in 2012 and worked at the Kiepenheuer Institute for Solar Physics. Since 2013, his research areas include laser protection and laser damage performance.

Bernd Eberle is a senior scientist at Fraunhofer IOSB in Ettlingen, Germany, where he is head of the optical countermeasure and laser protection group. He received his diploma degree (1983) and his PhD degree (1987) in physics from the University of Konstanz. His research activities include laser technology, laser spectroscopy, nonlinear optics, femtosecond optics, optical countermeasures including protection against laser radiation, and imaging laser sensors.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Gunnar Ritt, Bastian Schwarz, and Bernd Eberle "Preventing image information loss of imaging sensors in case of laser dazzle," Optical Engineering 58(1), 013109 (30 January 2019). https://doi.org/10.1117/1.OE.58.1.013109
Received: 9 November 2018; Accepted: 8 January 2019; Published: 30 January 2019
KEYWORDS: Image fusion, sensors, image sensors, optical sensors, cameras, transmittance, image filtering
