The Army has identified a need to rapidly identify, map, and classify natural and manmade features to aid situational
awareness as well as mission and tactical planning. To address these needs, Digital Fusion and Trex Enterprises have
designed a full Stokes, passive MMW imaging polarimeter that is capable of being deployed on an unmanned aerial
vehicle. Results of a detailed trade study are presented, where an architecture, waveband and target platform are
selected. The selected architecture is a pushbroom phased-array system, which collects a wide field-of-view
image with minimal components and weight. W band is chosen as a trade-off between spatial resolution,
weather penetration, and component availability. The trade study considers several unmanned aerial system (UAS)
platforms that are capable of low-level flight and that can support the MMW antenna. The utility of the passive Stokes
imager is demonstrated through W band phenomenology data collections at horizontal and vertical polarization using a
variety of natural and manmade materials. The concept design is detailed, along with hardware and procedures for both
radiometric and polarimetric calibration. Finally, a scaled version of the concept design is presented, which is being
fabricated for an upcoming demonstration on a small, manned aircraft.
Imaging polarimetry is an emerging sensor technology that promises to improve the performance of sensor
systems when used as an adjunct to conventional intensity-based imaging. Several prototype systems capable of being
deployed from aircraft are under development. One system has successfully completed an airborne military utility
assessment and is being transitioned to operational status. As this technology continues to gain interest, it will become
necessary both to accurately predict the performance of proposed systems before they are fabricated and to develop
modeling and simulation tools that allow their performance to be evaluated for various operational scenarios. In
this paper we develop several performance prediction tools that can be used to address these needs; these models are
based on the micro-polarizer array (MPA) implementation of imaging polarimeters as this architecture is at the
forefront in the development of deployable systems.
Focal plane array (FPA) well size, polarizer extinction ratio (ER), pixel crosstalk, and processing algorithms
all play roles in the performance that can be attained by a proposed sensor. We discuss the polarimetric response of an
MPA-based polarimetric detector and use this model to illustrate the effects of these parameters on the sensor's
polarimetric performance, which we cast as noise equivalent degree of linear polarization (NeDoLP). Key conclusions
from these analyses are that the detector well size sets the upper limit on performance and that pixel crosstalk will
likely be the biggest contributor to polarimetric loss in most systems.
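The dependence of NeDoLP on well depth can be illustrated with a small Monte Carlo sketch. Everything here is an assumption for illustration, not the paper's model: an ideal 2x2 micro-polarizer superpixel (0/45/90/135 degrees, infinite extinction ratio, no crosstalk), pixels operated at half well, shot noise only, and NeDoLP taken as the standard deviation of the DoLP estimate for an unpolarized scene, which is one common convention.

```python
import numpy as np

def stokes_from_superpixel(i0, i45, i90, i135):
    """Linear Stokes estimate from an ideal 0/45/90/135 deg superpixel."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)
    s1 = i0 - i90
    s2 = i45 - i135
    return s0, s1, s2

def simulate_nedolp(well_electrons, n_trials=20_000, seed=None):
    """Std. dev. of the DoLP estimate for an unpolarized scene, with
    pixels at half well and Poisson (shot) noise only -- an
    illustrative convention for NeDoLP."""
    rng = np.random.default_rng(seed)
    mean = well_electrons / 2              # half-well operating point
    counts = rng.poisson(mean, size=(4, n_trials)).astype(float)
    s0, s1, s2 = stokes_from_superpixel(*counts)
    dolp = np.sqrt(s1**2 + s2**2) / s0
    return dolp.std()

# Deeper wells average more photoelectrons, so NeDoLP improves:
print(simulate_nedolp(10_000, seed=1), simulate_nedolp(1_000_000, seed=1))
```

The scaling makes the abstract's first conclusion concrete: with shot-noise-limited operation, the attainable NeDoLP floor is set by how many photoelectrons the well can integrate.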
Accurate calibration of polarimetric sensors is critical to reducing and analyzing phenomenology data, producing
uniform polarimetric imagery for deployable sensors, and ensuring predictable performance of polarimetric algorithms.
It is desirable to develop a standard calibration method, including verification reporting, in order to increase credibility
with customers and foster communication and understanding within the polarimetric community. This paper seeks to
facilitate discussions within the community on arriving at such standards.
Both the calibration and verification methods presented here are performed easily with common polarimetric equipment,
and are applicable to visible and infrared systems with either partial Stokes or full Stokes sensitivity. The calibration
procedure has been used on infrared and visible polarimetric imagers over a six-year period, and resulting imagery has
been presented previously at conferences and workshops.
The proposed calibration method involves the familiar calculation of the polarimetric data reduction matrix by
measuring the polarimeter's response to a set of input Stokes vectors. With this method, however, linear combinations
of Stokes vectors are used to generate highly accurate input states. This allows the direct measurement of all system
effects, in contrast with fitting modeled calibration parameters to measured data. This direct measurement of the data
reduction matrix allows higher order effects that are difficult to model to be discovered and corrected for in calibration.
This paper begins with a detailed tutorial on the proposed calibration and verification reporting methods. Example
results are then presented for an LWIR rotating half-wave retarder polarimeter.
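A minimal sketch of the data-reduction-matrix approach described above, under simplifying assumptions: an idealized, noise-free four-measurement full-Stokes polarimeter with hypothetical analyzer vectors, and a small set of known calibration input states. The matrices here are illustrative, not the instrument's actual response.

```python
import numpy as np

# Hypothetical 4-measurement full-Stokes polarimeter: each row is an
# analyzer vector (linear 0/45/90 deg analyzers plus one circular).
A_true = 0.5 * np.array([
    [1,  1,  0,  0],
    [1,  0,  1,  0],
    [1, -1,  0,  0],
    [1,  0,  0,  1],
])

# Known calibration input states (columns): unpolarized, H, V, +45, RCP.
S_in = np.array([
    [1,  0, 0, 0],
    [1,  1, 0, 0],
    [1, -1, 0, 0],
    [1,  0, 1, 0],
    [1,  0, 0, 1],
]).T                          # shape (4, n_states)

M = A_true @ S_in             # simulated calibration measurements

# Fit the measurement matrix directly from the measured responses,
# then invert it to obtain the data reduction matrix (DRM).
A_fit = M @ np.linalg.pinv(S_in)
drm = np.linalg.pinv(A_fit)

s_test = np.array([1.0, 0.3, -0.2, 0.1])
print(drm @ (A_true @ s_test))    # recovers s_test
```

Because the measurement matrix is fit directly to measured responses rather than assembled from modeled component parameters, any systematic effect present in the calibration data is absorbed into the DRM, which is the point the abstract makes about higher-order effects.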
The proliferation of remotely operated vehicles (ROVs) has resulted in a need for the capability to see the operational environment in stereo. In a previous paper the theoretical underpinnings for new types of stereoscopic and autostereoscopic flat-panel displays with full-time, full-resolution images (i.e., no temporal multiplexing and no spatial multiplexing) were presented. Recently, a stereoscopic prototype has been constructed at the U.S. Army Aviation & Missile RDEC and testing is underway. The research presented here describes the application of two liquid crystal displays (LCDs) sandwiched together to form a compact, rugged stereoscopic display. Polarized glasses are used to view the image in stereo. The prototype provides a full-time, full-resolution stereoscopic 3D display in a package slightly thicker, but no larger, than the standard liquid crystal display used in laptop computers. The LCDs have been characterized using a Stokes vector polarimeter. The characterization results led to changes in the encoding algorithms, which produced significant improvements in display quality.
Polarimetric imagery collected from time-sequential and multiple-image-format sensors is susceptible to image misregistration. Since polarization is usually measured as small differences between radiometric measurements, it is highly sensitive to misregistration, especially in regions of high contrast. The general consensus in the polarization community is that image misregistration on the order of 1/10th of a pixel can introduce artifacts in polarization images. If registration is not achieved and maintained to this resolution, the data must be registered in software. Typically, rotation and translation (horizontal and vertical) are the main transformations that need to be corrected. It is desirable to have a registration algorithm that determines rotations and translations to 1/10th of a pixel, does not require user intervention, takes minimal computation time, and is based on analytical (non-iterative), automated calculations. This paper details an analytical, automated registration algorithm that corrects for rotation and translation by using a Fourier transform technique. Examples of images registered with this algorithm, and estimates of residual misregistrations, are presented. Typical processing times are also given.
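For the translation component, the Fourier technique alluded to above is commonly implemented as phase correlation. The sketch below is an illustrative implementation, not necessarily the paper's exact algorithm: it whitens the cross-power spectrum so only phase remains, locates the integer-pixel correlation peak, and refines it with a three-point parabolic fit to reach sub-pixel precision.

```python
import numpy as np

def phase_correlation_shift(a, b):
    """Estimate the (row, col) translation that maps image a onto b,
    via the phase of the cross-power spectrum. Integer-pixel peak plus
    3-point parabolic sub-pixel refinement (an illustrative choice)."""
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    cps = Fb * np.conj(Fa)
    cps /= np.abs(cps) + 1e-12            # whiten: keep phase only
    corr = np.fft.ifft2(cps).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = []
    for axis, p in enumerate(peak):
        n = corr.shape[axis]
        idx = list(peak)
        idx[axis] = (p - 1) % n
        c_m = corr[tuple(idx)]            # neighbor below the peak
        idx[axis] = (p + 1) % n
        c_p = corr[tuple(idx)]            # neighbor above the peak
        c_0 = corr[peak]
        denom = c_m - 2.0 * c_0 + c_p
        delta = 0.5 * (c_m - c_p) / denom if denom != 0 else 0.0
        s = p + delta
        if s > n / 2:                     # wrap to a signed shift
            s -= n
        shift.append(s)
    return tuple(shift)

# Circularly shift a random image and recover the shift:
rng = np.random.default_rng(0)
img = rng.random((64, 64))
print(phase_correlation_shift(img, np.roll(img, (3, -5), axis=(0, 1))))
```

Whitening the spectrum is what makes the correlation peak sharp enough for reliable sub-pixel interpolation; on the circularly shifted test image the peak is essentially a delta function.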
An attractive approach to realizing a real-time imaging polarimeter is to integrate an array of polarization-sensitive filters directly onto the focal plane array. This has the advantage of allowing all of the requisite polarization data to be acquired within each image frame. In this paper we discuss the design, fabrication, and performance of a diffractive optical element (DOE) that fulfills this requirement. The DOE consists of an array of broadband form-birefringent quarter-wave plates and wire grid polarizers which are designed to allow the measurement of all four Stokes vector components for each image pixel.
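Whether a given set of retarder-plus-polarizer pixel types can recover all four Stokes components is checkable with Mueller calculus: each pixel's analyzer vector is the first row of its element Mueller matrix, and full-Stokes sensitivity requires the four analyzer vectors to span Stokes space. The sketch below uses hypothetical fast-axis angles (0, 22.5, 45, 67.5 degrees), not the actual DOE design.

```python
import numpy as np

def rot(theta):
    """Mueller rotation matrix for frame rotation by theta (radians)."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return np.array([[1, 0, 0, 0],
                     [0, c, s, 0],
                     [0, -s, c, 0],
                     [0, 0, 0, 1]])

def retarder(delta, theta):
    """Linear retarder: retardance delta, fast axis at angle theta."""
    cd, sd = np.cos(delta), np.sin(delta)
    m = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0],
                  [0, 0, cd, sd],
                  [0, 0, -sd, cd]])
    return rot(-theta) @ m @ rot(theta)

def polarizer(theta):
    """Ideal linear polarizer at angle theta."""
    m = 0.5 * np.array([[1, 1, 0, 0],
                        [1, 1, 0, 0],
                        [0, 0, 0, 0],
                        [0, 0, 0, 0]])
    return rot(-theta) @ m @ rot(theta)

# Four hypothetical pixel types: quarter-wave plate at 0/22.5/45/67.5 deg
# fast-axis angles, each followed by a horizontal wire-grid polarizer.
angles = np.deg2rad([0.0, 22.5, 45.0, 67.5])
A = np.array([(polarizer(0) @ retarder(np.pi / 2, t))[0] for t in angles])

print(np.linalg.matrix_rank(A))   # rank 4: all Stokes components recoverable
```

Note that the naive choice of 0/45/90/135 degree fast axes would fail this check (the 0 and 90 degree analyzer vectors coincide), which is why the angle set matters in such a design.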
The Defence Evaluation and Research Agency (DERA) has a requirement for an IRPC system to detect surface-laid and buried anti-tank landmines in support of Phase 2 of the REmote Minefield Detection System (REMIDS) Technology Demonstration Program. Nichols Research Corporation is currently under contract to DERA to design and fabricate the IRPC system for integration in the REMIDS TDP. The IRPC is a Stokes 4-vector IR camera system designed to operate from a static tower, a moving elevated surface platform, or a moving airborne platform, and will be used to demonstrate the usefulness of passive IR polarimetry for mine and minefield detection. DERA will use the IRPC system to investigate the feasibility of using polarimetric techniques to detect buried and surface-laid mines from an airborne platform when operated in conjunction with an ultra-wideband SAR.
The problem of mine and minefield detection continues to provide a significant challenge to sensor systems. Although the various sensor technologies (infrared, ground penetrating radar, etc.) may excel in certain situations, there does not exist a single sensor technology that can adequately detect mines in all conditions (time of day, weather, buried or surface-laid, etc.). A truly robust mine detection system will likely require the fusion of data from multiple sensor technologies. The performance of these systems, however, will ultimately depend on the performance of the individual sensors. Infrared (IR) polarimetry is a new and innovative sensor technology that adds substantial capabilities to the detection of mines. IR polarimetry improves on basic IR imaging by providing improved spatial resolution of the target, an inherent ability to suppress clutter, and the capability for zero-ΔT imaging. Nichols Research Corporation (Nichols) is currently evaluating the effectiveness of IR polarization for mine detection. This study is partially funded by the U.S. Army Night Vision & Electronic Sensors Directorate (NVESD). The goal of the study is to demonstrate, through phenomenology studies and limited field trials, that IR polarization outperforms conventional IR imaging in the mine detection arena.
Nichols Research Corporation is currently developing innovative imaging polarimetric sensors for a number of applications such as mine and minefield detection, aircraft ice detection, and remote sensing. The wave bands in which the various sensors operate include the visible, mid-wave infrared (IR), and long-wave IR bands. This paper will summarize the current research that Nichols is conducting in the field of remote sensing using imaging polarimetric cameras. The polarization signatures of various targets, acquired from ranges up to 10 kilometers, will be presented for all three wave-bands. The benefits obtained using polarimetric imaging will be discussed along with potential applications for this innovative technology including possible astronomical observation applications.
We describe the design and performance of a color real-time autostereoscopic 3D display based on our partial pixel 3D display architecture. The primary optical components are an active-matrix liquid crystal display and a diffractive optical element overlay. The display operates at video frame rates and is driven with a conventional VGA signal. 3D animations with horizontal motion parallax are readily viewable as sets of stereo images. The measured contrast and perceived brightness of the display are excellent, but there are minor flaws in image quality due to secondary images.
The detection and discrimination of man-made objects in a terrain or sky background has long been a challenge to the military. Conventional infrared (IR) systems suffer from poor spatial resolution and have a difficult time imaging targets when there is little or no thermal variation (ΔT) in the scene. These applications, as well as applications such as aircraft ice detection, can benefit from an imaging system that can overcome these and other limitations. In this paper an enhanced IR imaging sensor which overcomes the above shortcomings is described. Its advantages are detailed and accompanied by numerous experimental examples. The focus in this paper is on the performance of the sensor, and the benefits derived therefrom, not on sensor processing theory.
We explore the physics of excitations of a small number of quanta in microresonators. In particular, we examine this physics as it relates to the dynamics of nonlinearly coupled microlaser oscillators used to generate time-resolved coherent optical wavefronts. We seek wavefronts that can be both stabilized and rapidly reconfigured by purely electro-optic means. Novel opportunities are offered by reductions in the number of quanta needed for laser or laser-like action; advances in microcavity nonlinear optics; densely packed arrays of microlasers; adjustable micro-optical delay lines; synchronization of pulse envelopes in physically distinct lasers; and locking of optical fields in physically distinct lasers. Quantum statistical issues could become important, but are not emphasized here. Strategies for realizing an optical analog of high-repetition-rate agile microwave phased array radar with true delay are examined.
The ICVision system provides the functional equivalent of a real-time holographic stereogram. Using diffractive optics, the system creates a set of discrete viewing regions called virtual viewing slits. Each pixel of the display fills each viewing slit with different image data. When the images presented in two virtual viewing slits separated by an interocular distance are filled with stereoscopic pair images, the observer sees a 3D image. The images are computed so that a different stereo pair is presented each time the viewer moves approximately one eye pupil diameter (approximately 3 mm), thus providing a series of stereo views. The current embodiment of the ICVision display is realized by integrating a diffractive optical element with a conventional AMLCD. The authors have previously reported on the design of static displays and real-time monochromatic full-motion displays. This paper discusses the design details of a full-color display. The current system does not require the use of color filters within the AMLCD. A portable version of the real-time color display will be demonstrated at the conference.
There is increasing interest in real-time autostereoscopic 3D displays. Such systems allow 3D objects or scenes to be viewed by one or more observers with correct motion parallax without the need for glasses or other viewing aids. Potential applications of such systems include mechanical design, training and simulation, medical imaging, virtual reality, and architectural design. One approach to the development of real-time autostereoscopic display systems has been to develop real-time holographic display systems. The approach taken by most of these systems is to compute and display a number of holographic lines at one time, and then use a scanning system to replicate the images throughout the display region. The approach taken in the ICVision system being developed at the University of Alabama in Huntsville is very different. In the ICVision display, a set of discrete viewing regions called virtual viewing slits is created by the display. Each pixel is required to fill every viewing slit with different image data. When the images presented in two virtual viewing slits separated by an interocular distance are filled with stereoscopic pair images, the observer sees a 3D image. The images are computed so that a different stereo pair is presented each time the viewer moves one eye pupil diameter (approximately 3 mm), thus providing a series of stereo views. Each pixel is subdivided into smaller regions, called partial pixels. Each partial pixel is filled with precisely the diffraction grating required to fill an individual virtual viewing slit. The sum of all the partial pixels in a pixel then fills all the virtual viewing slits. The final version of the ICVision system will form diffraction gratings in a liquid crystal layer on the surface of VLSI chips in real time. Processors embedded in the VLSI chips will compute the display in real time. In the current version of the system, a commercial AMLCD is sandwiched with a diffraction grating array.
This paper will discuss the design details of a portable 3D display based on the integration of a diffractive optical element with a commercial off-the-shelf AMLCD. The diffractive optic contains several hundred thousand partial-pixel gratings, and the AMLCD modulates the light diffracted by the gratings.
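The pitch of each partial-pixel grating follows from the first-order grating equation, d sin θ = mλ, where θ is the diffraction angle from the pixel toward its target viewing slit. The sketch below is purely illustrative geometry: the wavelength, slit offset, and viewing distance are assumed numbers, not the actual ICVision design values.

```python
import math

def grating_pitch(wavelength_m, slit_offset_m, viewer_distance_m, order=1):
    """Grating pitch d that steers light of the given wavelength into a
    viewing slit at a lateral offset from the pixel's optical axis,
    from the grating equation d * sin(theta) = order * wavelength."""
    theta = math.atan2(slit_offset_m, viewer_distance_m)
    return order * wavelength_m / math.sin(theta)

# Assumed example: green light, slit 60 mm off-axis, viewer at 500 mm.
print(grating_pitch(550e-9, 0.060, 0.500))   # pitch on the order of microns
```

The micron-scale result shows why the gratings must be defined lithographically (or, in the final VLSI version, by fine ITO electrode patterns) rather than by the coarse pixel structure of the AMLCD itself.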
The ICVision system is a diffractive display based on VLSI and liquid crystal technologies which provides the functional equivalent of a real-time holographic stereogram. We have previously reported several static ICVision displays, based on the partial pixel architecture, that display a fixed 3D scene. Herein we report the first real-time implementation of an ICVision display, also based on the partial pixel architecture. The device is constructed using a diffractive optical element and a separate liquid crystal display. The animated sequence is pre-computed and then played back in real time using standard VGA on an 80386 or higher PC. The display, drive electronics, and computer may be battery powered, making the display suitable for portable use.
We report on the development of techniques for imaging the interior of a volume of low-density polyethylene. Images are acquired using both near-infrared and visible illumination and inexpensive silicon-based CCDs. An application using these techniques is currently being developed for the inspection of high-reliability undersea seals. The system is capable of imaging flaws that are smaller than 1/1000 inch. In addition to image acquisition, the system supports database and image processing functions. Remote manipulation of camera and part positioning is also provided.
The ICVision system is a diffractive display based on VLSI and liquid crystal technologies designed to compute and display holographic stereograms in real time. The diffractive display is formed on the surface of standard integrated circuit chips which have been covered with a liquid crystal overlay. Fringing electrostatic fields generated by indium tin oxide electrodes on top of the integrated circuit are used to induce the actual diffractive display. A large display may be assembled from several hundred individual dies. Each individual die making up the ICVision display will contain the processor that computes the image to be displayed. This paper describes the design of image storage and drive electronics for the ICVision display. The proposed electronics allow the fabrication of an individual static RAM cell and D/A converter for each of the tens of thousands of diffractive elements that make up an ICVision display.
The ICVision system is a diffractive display based on VLSI technology. It is designed to display holographic stereograms in real time. The diffractive display is formed on the surface of standard integrated circuit chips which have been covered with a liquid crystal overlay. Fringing electrostatic fields generated by indium tin oxide electrodes on top of the integrated circuit are used to induce the actual diffractive display. Within the individual IC dies making up the display will be computational engines that compute the image to be displayed. Because grating information is encoded in the ITO gratings at the time of chip fabrication, the actual real-time computation is several orders of magnitude less than in previous approaches. A large display may be formed by a tessellation of several hundred IC dies, each approximately 1 cm², on a flat substrate. An optical broadcast system would be used to transfer imagery information into the integrated circuits, obviating the need for wire bond attachments. This paper presents details of the overall architecture of the display system, and details of the holographic grating computations.