The chromaticity of navigation lights is defined by areas on the International Commission on Illumination (CIE) 1931 chromaticity diagram. The corner coordinates of these areas are specified in the International Regulations for Preventing Collisions at Sea, 1972 (72 COLREGS). The navigation light colors of white, red, green, and yellow are bounded by these areas. The chromaticity values specified by the COLREGS for navigation lights were intended for the human visual system (HVS), which can determine the colors of these lights easily under various conditions. For digital color camera imaging systems, the colors of these lights depend on the camera's color spectral sensitivity, settings, and color correction. At night, the colors of these lights are used to quickly determine the relative course of vessels. If these lights are incorrectly identified, or if there is a delay in identifying them, ship safety could be compromised. Vessels that rely exclusively on camera imaging systems for sight at night must detect, identify, and discriminate navigation lights for navigation and collision avoidance. The introduction of light-emitting diode (LED) lights and lights with different spectral signatures has the potential to produce very different images with an RGB color filter array (CFA) camera than with the human eye. It has been found that some green navigation lights appear blue rather than green in images, which has an impact on vessels that use camera imaging systems exclusively for navigation. This paper characterizes color cameras' ability to properly reproduce navigation light colors and surveys a set of navigation lights to determine whether they conform to the COLREGS.
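The conformance test described above reduces to checking whether a measured CIE 1931 (x, y) chromaticity coordinate falls inside a COLREGS color region, which can be modeled as a polygon. The sketch below uses a standard ray-casting point-in-polygon test; the corner coordinates shown are illustrative values for the white region, and the authoritative corners should be taken from COLREGS Annex I.

```python
# Illustrative corner coordinates for the COLREGS white region on the
# CIE 1931 diagram (consult COLREGS Annex I for the authoritative values).
WHITE_REGION = [
    (0.525, 0.382), (0.525, 0.440), (0.452, 0.440),
    (0.310, 0.348), (0.310, 0.283), (0.443, 0.382),
]

def in_region(x, y, polygon):
    """Ray-casting point-in-polygon test on the chromaticity diagram."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray from (x, y) toward +x.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

print(in_region(0.45, 0.40, WHITE_REGION))  # a near-white chromaticity: True
print(in_region(0.15, 0.06, WHITE_REGION))  # a saturated blue: False
```

The same routine can be applied to the red, green, and yellow regions by substituting their corner coordinates.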
Optical metamaterials promise aberration-free and better-than-diffraction-limited performance for imaging systems through constructed materials made to regulate their interaction with electromagnetic waves. Optical metamaterials have the potential to miniaturize the optical bench and obtain diffraction-limited performance with a single device. Reducing the size, weight, and complexity of optical systems while maintaining performance is desired. In unmanned aircraft, buoy systems, 360-degree imaging systems, and optronic or traditional periscope systems, the lenses constitute a considerable percentage of the weight and volume. Another desired characteristic is optical cross-section reduction in both the visible and infrared bands. Optical cloaking using metamaterials has the potential to make objects indiscernible from their environment by masking their signatures. Also desired are materials that act as perfect light absorbers for stray-light baffles, detectors, or solar energy harvesting; nonlinear frequency conversion for photonic devices; and lens or head-window coatings that achieve specific properties. These topics are discussed in this paper.
The atmospheric environment can significantly affect radio frequency (RF) and optical propagation. In the RF spectrum, refraction and ducting can degrade or enhance communications and radar coverage. Platforms in or beneath refractive boundaries can exploit the benefits or suffer the effects of the atmospheric boundary layers. Evaporative ducts and surface-based ducts are of most concern for ocean surface platforms, and evaporative ducts are almost always present along the sea-air interface. The atmospheric environment also degrades electro-optical systems' resolution and visibility. The atmosphere has been shown not to be uniform, and under heterogeneous conditions homogeneous models can produce substantial propagation errors at long ranges. An accurate and portable atmospheric sensor that profiles the vertical index of refraction is needed for mission planning, post-mission analysis, and in-situ performance assessment. Such a meteorological instrument, used in conjunction with a radio frequency and electro-optical propagation prediction tactical decision aid, would give military platforms the ability to assess, in real time, communication system propagation ranges, radar detection and vulnerability ranges, satellite communications vulnerability, laser range finder performance, and imaging system performance. Raman lidar has been shown to be capable of measuring the atmospheric parameters needed to profile the atmospheric environment. The resulting atmospheric profile could then be used as input to a tactical decision aid to make propagation predictions.
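The refractive profiling described above rests on the standard relationship between meteorological measurements and radio refractivity. A minimal sketch, using the ITU-R P.453 expression for refractivity N and the conventional modified refractivity M (ducting layers appear where M decreases with height); the sample profile values are hypothetical:

```python
def refractivity(P_hPa, T_K, e_hPa):
    """Radio refractivity N (N-units) from total pressure P (hPa),
    temperature T (K), and water vapour pressure e (hPa), per ITU-R P.453."""
    return 77.6 / T_K * (P_hPa + 4810.0 * e_hPa / T_K)

def modified_refractivity(N, height_m):
    """Modified refractivity M = N + 0.157 * h, accounting for Earth curvature."""
    return N + 0.157 * height_m

# Hypothetical near-surface profile: (height m, P hPa, T K, e hPa).
# A trapping layer would show up as M decreasing with height.
profile = [(0.0, 1013.0, 288.0, 10.0), (100.0, 1001.0, 287.4, 9.0)]
for z, P, T, e in profile:
    N = refractivity(P, T, e)
    M = modified_refractivity(N, z)
    print(f"z={z:5.0f} m  N={N:6.1f}  M={M:6.1f}")
```

In a Raman lidar system, the pressure, temperature, and humidity inputs would come from the retrieved vertical profiles rather than from point sensors.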
Multispectral polarized light imaging (MSPLI) enables rapid inspection of a superficial tissue layer over large surfaces, but does not provide information on cellular microstructure. Confocal microscopy (CM) allows imaging within turbid media with resolution comparable to that of histology, but suffers from a small field of view. In practice, pathologists use microscopes at low and high power to view tumor margins and cell features, respectively. Therefore, we study the combination of CM and MSPLI for demarcation of nonmelanoma skin cancers. Freshly excised thick skin samples with nonmelanoma cancers are rapidly stained with either toluidine blue or methylene blue dye, rinsed in acetic acid, and imaged using MSPLI and CM. MSPLI is performed at 630, 660, and 750 nm. The same specimens are imaged by reflectance CM at 630, 660, and 830 nm. Results indicate that CM and MSPLI images correlate well with histopathology. Cytological features are identified by CM, and tumor margins are delineated by MSPLI. The combination of MSPLI and CM appears to be complementary. This combined in situ technique has the potential to guide cancer surgery more rapidly and at lower cost than conventional histopathology.
The Quadrature Tomographic Microscope measures the amplitude and phase of an image. This information allows the user to see contrast features not available in other microscopes, and is critical to any three-dimensional reconstruction. We report on development and use of test objects to measure the accuracy and repeatability of phase measurements. A simple binary phase grating, a series of glass beads, and preimplantation mouse embryos were used in these experiments. The gratings were fabricated on high-quality fused-silica substrates whose transmission phase error was determined to be less than one-tenth wave error across their 25 mm diameter before fabrication. The phase step of the binary phase grating was measured using both the optical quadrature technique and the usual fringe-counting techniques applied to the raw data. Phase unwrapping techniques were validated by measuring the diameter of glass beads of a known size. Results are presented showing that the phase measurements agree with each other, with the known data, and with the spatial resolution in preimplantation mouse embryos. More complicated objects will be fabricated in the future to validate 3-D imaging techniques.
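The phase-unwrapping validation described above relies on the standard correction of 2π ambiguities in a wrapped phase signal. A minimal one-dimensional sketch of that technique (equivalent in behavior to `numpy.unwrap`), with a hypothetical linear phase ramp as the test signal:

```python
import math

def unwrap(phases):
    """Unwrap a 1-D phase sequence: jumps larger than pi between
    neighbouring samples are corrected by adding multiples of 2*pi."""
    out = [phases[0]]
    offset = 0.0
    for prev, cur in zip(phases, phases[1:]):
        delta = cur - prev
        if delta > math.pi:        # wrapped downward: subtract 2*pi
            offset -= 2 * math.pi
        elif delta < -math.pi:     # wrapped upward: add 2*pi
            offset += 2 * math.pi
        out.append(cur + offset)
    return out

# A linearly increasing true phase, observed modulo 2*pi:
true = [0.5 * i for i in range(20)]
wrapped = [math.atan2(math.sin(p), math.cos(p)) for p in true]
recovered = unwrap(wrapped)
# `recovered` matches `true` up to a constant multiple of 2*pi
```

In the experiments reported here, the known bead diameters play the role of the reference signal: the unwrapped optical path length across a bead of known size and refractive index provides an independent check on the recovered phase.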