We propose a novel approach to predict, for a specified color imaging system and for objects with known characteristics, their detection, recognition, and identification (DRI) ranges in a colored dynamic scene, based on quantifying human color-contrast perception.

The method relies on the well-established three-dimensional L*a*b* color space. The nonlinear relations of this space are intended to mimic the nonlinear response of the human eye. The metric of the L*a*b* color space is such that the Euclidean distance between any two colors in this space is approximately proportional to the color contrast as perceived by the human eye and brain. A consequence of this metric is that the color contrast of any two points is always greater than or equal to their equivalent gray-scale contrast. This agrees with our perception that a colored image exhibits contrast superior to that of its gray-scale counterpart. Color loss by scattering at very long ranges, however, should be considered as well.

The color contrast, derived from the L*a*b* distance between the colored object pixels and the nearby colored background pixels, is expressed in terms of gray-scale contrast. This contrast replaces the original standard gray-scale contrast component of the image. As expected, the resulting DRI ranges are, in most cases, larger than those predicted from the standard gray-scale image. Upon further elaboration and validation, this method may be combined with future versions of the well-accepted TRM codes for DRI predictions.

Consistent prediction of DRI ranges requires a careful evaluation of the reduction in object-background color contrast along the range. Clearly, additional processing that reconstructs the true colors of the objects and background, and hence the color contrast along the range, will further increase the DRI ranges.
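Since the squared L*a*b* distance decomposes as dE^2 = dL*^2 + da*^2 + db*^2, the color contrast can never fall below the lightness-only (gray-scale) contrast, as stated above. A minimal Python sketch of this metric, assuming sRGB input and the D65 white point (our assumptions, not specified in the abstract):

```python
import math

def srgb_to_lab(rgb):
    """Convert an sRGB triplet (components in 0..1) to CIE L*a*b* (D65)."""
    # Inverse sRGB gamma (companding)
    r, g, b = (c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
               for c in rgb)
    # Linear RGB -> XYZ (D65 white), scaled to 0..100
    x = (0.4124 * r + 0.3576 * g + 0.1805 * b) * 100.0
    y = (0.2126 * r + 0.7152 * g + 0.0722 * b) * 100.0
    z = (0.0193 * r + 0.1192 * g + 0.9505 * b) * 100.0
    def f(t):  # CIE nonlinearity, mimicking the eye's response
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16.0 / 116.0
    fx, fy, fz = f(x / 95.047), f(y / 100.0), f(z / 108.883)
    return 116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz)

def delta_e(lab1, lab2):
    """Euclidean distance in L*a*b*: the perceived color contrast."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

# Object vs. background: comparable lightness, very different hue
obj = srgb_to_lab((0.8, 0.2, 0.2))   # reddish object
bkg = srgb_to_lab((0.2, 0.8, 0.2))   # greenish background
de = delta_e(obj, bkg)               # full color contrast
dl = abs(obj[0] - bkg[0])            # gray-scale (lightness-only) contrast
print(f"dE = {de:.1f}, dL* = {dl:.1f}")  # dE >= dL* always holds
```

For hues of similar lightness, dE greatly exceeds dL*, which is exactly the regime in which the color-contrast metric extends the predicted DRI range.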
Electro-optical missile seekers pose exceptional requirements for infrared (IR) detectors. These requirements include very short mission readiness (time-to-image), one-time and relatively short mission duration, extreme ambient conditions, high sensitivity, fast frame rate, and in some cases small size and low cost. SCD has been engaged in the development and production of IR detectors for missile seeker applications for many years. 0D, 1D, and 2D InSb focal plane arrays (FPAs) are packaged in specially designed fast cool-down Dewars and integrated with Joule-Thomson (JT) coolers. These cooled MWIR detectors have been integrated in numerous seekers of various missile types, for short- and long-range applications, and are combat proven. New technologies for the MWIR, such as epi-InSb and XBn-InAsSb, enable faster cool-down times and higher sensitivity for next-generation seekers. Uncooled micro-bolometer technology for IR detectors has advanced significantly over the last decade, and high-resolution, high-sensitivity FPAs are now available for various applications. Their much smaller size and lower cost relative to cooled detectors make these uncooled LWIR detectors natural candidates for short- and mid-range missile seekers. In this work we present SCD's cooled and uncooled solutions for advanced electro-optical missile seekers.
A novel multispectral video system is currently being developed that continuously and autonomously optimizes both its spectral channels and the exposure time of each channel under dynamic scenes, ranging from a short-range, clear scene to a long-range, poor-visibility one. In a highly scattering medium, the transparency and contrast of channels with spectral ranges in the near infrared are superior to those of the visible channels, particularly the blue range. Longer-wavelength spectral ranges that yield higher contrast are therefore favored. The images of three spectral channels are fused and displayed for (pseudo) color visualization as an integrated, high-contrast video stream.
In addition to the dynamic optimization of the spectral channels, the optimal exposure time is adjusted simultaneously and autonomously for each channel in real time. A criterion of maximum average signal, derived dynamically from previous frames of the video stream, is used (Patent Application, International Publication Number WO2009/093110 A2, 30.07.2009). This configuration keeps the exposure time dynamically matched to the changing scene. It also maximizes the signal-to-noise ratio and compensates each channel for the daylight reflections and the sensor response within its spectral range.
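The per-channel exposure control described above can be sketched as a simple feedback loop. The exact criterion of the cited patent application is not reproduced here; the target fraction, saturation back-off, and gain below are illustrative assumptions:

```python
def next_exposure(prev_exposure, frame_mean, saturated_frac,
                  full_scale=255.0, target=0.5, sat_limit=0.01, gain=0.5):
    """One step of a per-channel auto-exposure loop (illustrative sketch).

    Drives the mean signal of a spectral channel, measured on previous
    frames of the video stream, toward a target fraction of full scale,
    and backs off when too many pixels saturate.
    """
    if saturated_frac > sat_limit:
        return prev_exposure * 0.8              # scene too bright: back off
    error = target - frame_mean / full_scale    # signed exposure error
    # Proportional update, clamped to gentle steps for video stability
    factor = max(0.5, min(2.0, 1.0 + gain * error))
    return prev_exposure * factor
```

Running one such loop independently per spectral channel keeps every channel near its optimal signal level as the scene radiance changes from frame to frame.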
A possible implementation is a color video camera based on four synchronized, highly responsive CCD imaging detectors attached to a 4-CCD dichroic prism and combined with a common, color-corrected lens. A Principal Component Analysis (PCA) technique is then applied for a real-time "dimensional collapse" in color space, in order to select and fuse, for clear color visualization, the three most significant principal channels out of at least four channels characterized by high contrast and rich detail in the image data.
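The "dimensional collapse" step can be illustrated with a minimal PCA sketch (using NumPy; the channel count and image size here are arbitrary, and the real system would operate on live video frames):

```python
import numpy as np

def pca_collapse(channels):
    """Collapse N spectral channels (shape N x H x W) onto the 3 leading
    principal components for pseudo-color display: a sketch of the
    'dimensional collapse' in color space described above.
    """
    n, h, w = channels.shape
    x = channels.reshape(n, -1).astype(np.float64)   # pixels as samples
    x -= x.mean(axis=1, keepdims=True)               # zero-mean per channel
    cov = np.cov(x)                                  # N x N channel covariance
    evals, evecs = np.linalg.eigh(cov)               # ascending eigenvalues
    top3 = evecs[:, ::-1][:, :3]                     # 3 most significant axes
    return (top3.T @ x).reshape(3, h, w)             # project onto the PCs

# Example: 4 synthetic spectral channels, two of them highly correlated
rng = np.random.default_rng(0)
base = rng.random((2, 32, 32))
chans = np.stack([base[0], base[0] * 0.9 + 0.05, base[1],
                  rng.random((32, 32))])
rgb = pca_collapse(chans)
print(rgb.shape)   # 3 channels ready for R, G, B display
```

The three projected channels are mutually decorrelated and ordered by explained variance, so mapping them to R, G, and B concentrates most of the inter-channel contrast in the displayed image.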
We present a model for calculating the Spatial Frequency Response (SFR) of Bayer-pattern color detectors. The model is based on the color detector's response to black-and-white (B/W) scenes. When a Bayer color detector is compared to a B/W detector, the SFR difference results from the interpolation process, which exists only in Bayer-pattern detectors. In this work we ascribe both the MTF and the spurious response to the interpolation process.
The model may be applied to any linear interpolation. Although the interpolation is linear, it is not shift invariant (SI); therefore, calculating the interpolation MTF is not a trivial task. Furthermore, the interpolation creates a spurious response. In order to calculate the interpolation SFR, we introduce a separability constraint (for the x and y directions) by using a scene that varies along one axis only and is fixed along the other. We further assume integration along the fixed axis. Using these two assumptions, we are able to separate the response into the two axes and calculate the SFR.
For distant scenes, color saturation decreases, colors become less visible, and mostly gray tones are sensed. In these cases the Johnson criterion can be roughly applied. Applying the Johnson criterion requires knowledge of the MTF of the sensing system, which includes the interpolation MTF. We show that the interpolation process degrades system performance compared to a B/W sensor. Another application of the model is the comparison of different interpolation algorithms.
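The non-shift-invariant nature of the interpolation can be seen in a one-dimensional toy version of the separable setup above: keep every other pixel (one Bayer phase) and reconstruct the missing pixels as the mean of their neighbors. This is a generic neighbor-averaging interpolation, not the specific algorithms analyzed in the paper:

```python
import math

def interp_response(freq_cyc_per_px, n=512):
    """Estimate the baseband response of a 1-D 'Bayer-like' interpolation.

    Even pixels keep their true sample; odd pixels are reconstructed as
    the mean of their two neighbors.  The amplitude at the input
    frequency gives the interpolation MTF, while the remaining energy
    appears as a spurious response at (0.5 - f) cycles/pixel.
    """
    x = [math.cos(2 * math.pi * freq_cyc_per_px * i) for i in range(n)]
    y = list(x)
    for i in range(1, n - 1, 2):              # odd pixels are 'missing'
        y[i] = 0.5 * (x[i - 1] + x[i + 1])
    # Amplitude at the input frequency via a single-bin DFT
    re = sum(v * math.cos(2 * math.pi * freq_cyc_per_px * i)
             for i, v in enumerate(y))
    im = sum(v * math.sin(2 * math.pi * freq_cyc_per_px * i)
             for i, v in enumerate(y))
    return 2.0 * math.sqrt(re * re + im * im) / n

# Response falls with frequency; analytically it is (1 + cos(2*pi*f)) / 2
print(interp_response(64 / 512))   # 0.125 cycles/pixel
```

Even this crude sketch reproduces the qualitative result of the abstract: the interpolation attenuates high spatial frequencies relative to a B/W sensor and redistributes the lost energy into a spurious component.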
Simulation of scenarios for modern seekers involves the generation of multiple targets, electro-optical countermeasures, and textured backgrounds, all with realistic physical characteristics. True intensity, spectral distribution, and real angular size and velocity are essential. The optical and radiometric design approach is based on imaging the seeker entrance pupil onto different positions on a scene-generation table. This paper presents some of the novel system characteristics. The optics, comprising mostly reflective surfaces and uncoated beam combiners, provides a wide infrared and visible spectral range. The system is designed for high resolution over a large, 10-degree field of view and is optimized for maximum intensity. The states of the gimbaled mirror axes are transformed into real-world line-of-sight (LOS) motion. The whole scene may be tilted with respect to the seeker axis at large angles, with derotation compensation. Targets, flares, and thermal backgrounds are implemented using diverse types of thermal radiators, and the closing-range effects on their intensity and size are controlled by opto-electro-mechanical assemblies.
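As a minimal illustration of expressing a commanded line of sight as a pointing vector (the actual mirror-axes-to-LOS transform depends on the projector's optical layout and is not detailed in the abstract):

```python
import math

def los_vector(az_rad, el_rad):
    """Unit LOS vector from azimuth/elevation angles (x forward, z up)."""
    return (math.cos(el_rad) * math.cos(az_rad),
            math.cos(el_rad) * math.sin(az_rad),
            math.sin(el_rad))

# Example: a small off-axis pointing command within the 10-degree field
v = los_vector(math.radians(6.0), math.radians(3.0))
```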
The TSG is used for evaluating infrared missile seekers with dynamic targets in a realistic EOCM environment. The system generates multi-mode primary and secondary targets, up to three flares and jammers, combined with a thermal background image over a 10-degree field of view. Each component is independently controlled in 2D trajectory, velocity, and acceleration. Four orders of magnitude in LOS angular velocity can be accommodated. The system allows variation of the sources' angular size, radiated intensity, and other spatial and temporal modulations. All sources are combined into a collimated output beam. The beam is further projected through an optical relay and a field-of-regard assembly, which displays the whole scenario over a wide angular span onto the seeker aperture. Further system improvements involve combining a dynamic infrared scene projector with high-temperature sources under real-time high dynamics, for better performance with imaging seekers on maneuvering platforms.