A major contributing factor in achieving high time resolution with various imaging devices has been the development of fast high-voltage pulse generators. These generators are used as remote ramp generators in streak cameras, for driving blanking circuitry, and for gating image tubes. Pulse generators based mainly upon avalanche technology are now capable of sub-100 ps rise times at voltages greater than 10 kV. Repetition rates are now sufficiently high that significant signal averaging can be performed in suitable applications.
To simultaneously improve the spatial and temporal performance of streak tubes, we have developed bilamellar x-ray streak tubes with the help of Philips Components (now Philips Photonics). We describe the structure of one of them, the P850X, and the prototype camera that uses it, the C850X. We used 80 fs pulse dye lasers developed at LOA (ENSTA) to characterize this instrument in the UV and x-ray ranges. The results show that its time resolution is less than 2 ps even with 1.5 keV photons and that its spatial resolution is much better than 16 lp/mm. This instrument was used to study the time evolution of x-ray emission from 80 fs and 1 ps laser-created plasmas at very high intensities.
To produce a streak tube that is easy to operate, we have developed a new streak tube with ultra-high deflection sensitivity based on a new design concept. The combination of a low photocathode voltage of -2 kV with respect to the anode and long (63 mm) deflection plates made it possible to achieve a deflection sensitivity of 670 mm/kV. Furthermore, a temporal resolution of 2 ps was achieved by applying a high electric field of 2 kV/mm between the photocathode and the acceleration mesh electrode.
The resolution of microchannel-plate image intensifiers (MCPIIs), when specified by manufacturers, is usually measured only while the MCPII is biased on with dc voltages (dc mode). We present data on the resolution of two proximity-focused MCPIIs under both dc bias and pulsed-voltage operation (gated mode). The resolution is determined by recording a digital image of the MCPII's output for an input in the form of a spatial impulse. By calculating the Fourier transform of the two-dimensional output image distribution (the output point spread function), the modulation transfer function can be found. The effects of input optical energy density, gate repetition rate, gate width, and input position on the resolution of a standard ITT MCPII and a high-strip-current ITT MCPII are studied.
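The MTF computation described above can be sketched in a few lines, assuming the digitized output image is available as a NumPy array. The Gaussian spot below is a hypothetical stand-in for a real MCPII point spread function, not measured data:

```python
import numpy as np

def mtf_from_psf(psf):
    """MTF as the normalized modulus of the 2-D Fourier transform of
    the measured point spread function."""
    psf = np.asarray(psf, dtype=float)
    otf = np.fft.fft2(psf / psf.sum())   # optical transfer function
    mtf = np.abs(otf)
    return mtf / mtf[0, 0]               # unity response at dc

# Hypothetical spatial-impulse response: a narrow Gaussian spot
coords = np.arange(-32, 32)
xx, yy = np.meshgrid(coords, coords)
spot = np.exp(-(xx**2 + yy**2) / (2.0 * 2.0**2))
mtf = mtf_from_psf(spot)
print(mtf[0, 0])   # 1.0 at zero spatial frequency, falling off above
```

The row `mtf[0, :]` gives the horizontal MTF; limiting resolution can then be read off at the chosen modulation threshold.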
Quantitative measurement of an X-irradiated object requires calibration of the film. When the scattered-radiation level is low enough, we suggest calibrating the film by means of a step-wedged series of holes carved into the collimator periphery. In this paper we compare this approach with other methods.
When a slit target is projected on an imaging element for determination of optical system MTF, chromatic aberration in the optoliner lens may cause the projected image pattern to broaden appreciably. This paper presents results of MTF measurements in which chromatic aberration is removed through appropriate bandpass filtering.
Basic designs of an ultra-high-speed multiframing camera and an ultra-high-speed video camera are presented. The proposed video camera has a frame rate of 3 X 10^7 pps, while the recordable number of frames and the linear resolution are 96 frames and 128 pixels, respectively. The 96 successive frames can be replayed at 5 pps over about 20 seconds. Although motion at 5 pps looks somewhat intermittent, the frame rate is high enough for recognition of continuous movement.
Optical time-domain reflectometry (OTDR) is a simple and rugged technique for measuring quantities such as strain that affect the propagation of light in an optical fiber. For engineering applications of OTDR, it is important to know the repeatable limits of its performance. The authors constructed an OTDR-based, submillimeter resolution, strain measurement system from off-the-shelf components. The system repeatably resolves changes in time of flight to within +/- 2 ps. Using a 1 m, single-mode fiber as a gauge and observing the time of flight between Fresnel reflections, we observed a repeatable sensitivity of 400 microstrains. Using the same fiber to connect the legs of a 3 dB directional coupler to form a loop, we observed a repeatable sensitivity of 200 microstrains. Realizable changes to the system that should improve the repeatable sensitivity to 20 microstrains or less are discussed.
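The link between the reported time-of-flight repeatability and strain sensitivity can be illustrated with a back-of-the-envelope sketch. The group index and the round-trip convention below are assumptions for illustration only, and photoelastic corrections to the effective index are ignored:

```python
C = 299_792_458.0      # speed of light in vacuum, m/s
N_GROUP = 1.468        # assumed group index of silica fiber

def strain_resolution(dt, gauge_len, n=N_GROUP):
    """Strain resolvable from a round-trip time-of-flight change dt (s)
    over a fiber gauge of length gauge_len (m).  Photoelastic
    corrections are ignored in this sketch."""
    dL = C * dt / (2.0 * n)    # length change implied by round-trip delay
    return dL / gauge_len      # dimensionless strain

# +/- 2 ps repeatability -> 4 ps peak-to-peak over the 1 m gauge
eps = strain_resolution(4e-12, 1.0)
print(f"{eps * 1e6:.0f} microstrain")   # ~408, close to the reported 400
```

Halving the resolvable delay, or doubling the effective path as in the loop configuration, halves the strain figure, consistent with the 200-microstrain loop result.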
The current major limiting factor in digital optical computing is the absence of a fast, efficient, cascadable optical switch. Material and processing approaches to optical switch construction are not limited to the well-known semiconductors and fabrication methods of electronics. A facility to evaluate candidate devices has been constructed. Multiline and tunable femtosecond and picosecond laser systems, as well as frequency-mixing systems, are used as light sources. The facility has at least picosecond source capability from 200 nm to 2 micrometers. The switch transfer function is evaluated in a pump-probe system with femtosecond and picosecond autocorrelators to measure temporal properties, an optical multichannel analyzer to measure spectral properties, a CCD or pyroelectric camera system to measure mode modification, and a multi-detector system to measure switching energy and insertion loss in both absorption and reflection. The switch or switching array under test is mounted in a six-axis micropositioner system with a θ-2θ goniometer, x, y, and z translators, and a tilt goniometer. The system's design is presented, along with initial measurements of nonlinear interface optical switches based on photorefractive thin films.
High-accuracy absolute timing of optical pulses is usually performed by PMTs, in conjunction with advanced discriminator and time-measurement electronics. This paper presents a new technique that is capable of less than 1.0 psec absolute timing measurements, using Si APDs, with significantly relaxed requirements for the discriminator and timing electronics. An EO modulator is used to modulate the polarization of the optical pulse at a microwave frequency (e.g., 8 GHz), and the two resulting polarization components are detected by the APDs. Relative signal levels at the two polarizations provide a vernier timing measurement with respect to the phase of the microwave modulating signal, and the time ambiguity is resolved by a relatively coarse timing measurement of the detected signals. This technique offers the potential for simpler, lower-cost implementations than PMT and/or streak camera systems, and it is applicable to a wider range of wavelengths, particularly near-IR systems. This paper discusses the general principles and design issues of the technique, and describes a particular design implementation applicable to 2-color lidar ranging.
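The vernier principle can be sketched as follows, assuming an idealized sinusoidal split of pulse energy between the two polarization channels (the 8 GHz figure comes from the description above; the actual EO-modulator transfer function would differ, and the arcsine branch used here is valid only within a quarter of a modulation period):

```python
import math

F_MOD = 8e9    # microwave modulation frequency, Hz

def fine_time(i_a, i_b, f=F_MOD):
    """Vernier timing from the two polarization-channel energies,
    assuming channel A receives the fraction (1 + sin(2*pi*f*t)) / 2
    of the pulse energy -- an idealization for illustration."""
    ratio = i_a / (i_a + i_b)                # normalized channel A share
    phase = math.asin(2.0 * ratio - 1.0)     # radians, within +/- pi/2
    return phase / (2.0 * math.pi * f)       # seconds, modulo 1/f

def absolute_time(i_a, i_b, coarse_t, f=F_MOD):
    """Resolve the 1/f ambiguity using a coarse timing measurement."""
    period = 1.0 / f
    n = round((coarse_t - fine_time(i_a, i_b, f)) / period)
    return n * period + fine_time(i_a, i_b, f)

# A pulse at t = 10 ps splits according to the modulation phase
t_true = 10e-12
share = (1 + math.sin(2 * math.pi * F_MOD * t_true)) / 2
print(absolute_time(share, 1 - share, coarse_t=12e-12))  # ~1e-11 s
```

The coarse measurement only needs to be good to a fraction of the 125 ps modulation period, which is what relaxes the discriminator and timing-electronics requirements.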
A collimated laser diode, combined with a small, short-focal-length objective lens, produces a focused laser beam on the top of a sample glued onto a piezoelectric transducer. This laser beam is scanned horizontally across the surface, and its intensity is modulated by a square wave from a TTL signal generator. This system induces acoustic waves in the sample. With specially designed control circuitry, combining this acoustic signal with the scanned laser beam makes imaging of the subsurface possible. The transient analysis developed is described, and we show how to select cut-away views of the subsurface of a specimen, with applications in failure analysis of integrated circuits. We present the apparatus and the transient photoacoustic signal theory, and compare scanning photoacoustic microscopy (SPAM) with scanning electron acoustic microscopy (SEAM).
Optical imaging of neural-network activity using voltage-sensitive dyes is expected to make it easier than the current electrode method to obtain information from multiple points simultaneously. A high-speed camera is needed to capture the change in a fluorescence image caused by the potential change accompanying neuronal activity. The electrical excitation of a neuron lasts for a few milliseconds, and the concomitant fluorescence change is reported to be a few percent or less of the total intensity. A 50 X 50 photodiode camera was specifically designed for this purpose. This camera can take pictures with 12-bit accuracy every 256 microseconds for about 1 s (4096 pictures). The epi-illuminated fluorescence image through an objective lens is focused on the end-plate of a 50 X 50 plastic optical fiber bundle. An individual photodiode and ac-coupled amplifier are connected to each optical fiber. The time-dependent portion of each signal is transferred to one of ten 12-bit AD converters through a three-stage multiplexer and digitally stored in a 20-MByte RAM. We demonstrated the ability of the camera by capturing membrane-potential images of sea urchin eggs induced by an external electric field.
This system has been designed for instant sports replay. The passive unit uses two video cameras, an image processor, and a graphics computer to track a baseball pitch and provide an instant graphic replay showing the ball's trajectory, speed, and movement. Demonstrated during the 1991 World Series broadcast, it has applications for team training and game broadcasting, as well as for other sports.
The goal of sport biomechanists is to provide information to coaches and athletes about sport skill technique that will assist them in obtaining the highest levels of athletic performance. Within this technique evaluation process, two methodological approaches can be taken to study human movement. One method describes the motion being performed; the second approach focuses on understanding the forces causing the motion. It is with the movement description method that video image recordings offer a means for athletes, coaches, and sport biomechanists to analyze sport performance. Staff members of the Technique Evaluation Program provide video recordings of sport performance to athletes and coaches during training sessions held at the Olympic Training Center in Colorado Springs, Colorado. These video records are taken to provide a means for the qualitative evaluation or the quantitative analysis of sport skills as performed by elite athletes. High-speed video equipment (NAC HVRB-200 and NAC HSV-400 Video Systems) is used to capture various sport movement sequences that will permit coaches, athletes, and sport biomechanists to evaluate and/or analyze sport performance. The PEAK Performance Motion Measurement System allows sport biomechanists to measure selected mechanical variables appropriate to the sport being analyzed. Use of two high-speed cameras allows for three-dimensional analysis of the sport skill or the ability to capture images of an athlete's motion from two different perspectives. The simultaneous collection and synchronization of force data provides for a more comprehensive analysis and understanding of a particular sport skill. This process of combining force data with motion sequences has been done extensively with cycling. The decision to use high-speed videography rather than normal speed video is based upon the same criteria that are used in other settings. 
The rapidness of the sport movement sequence and the need to see the location of body parts of the athlete, particularly at critical positions during the motion, serve as guidelines for choosing high-speed videography. Certain requirements for high-speed videography need to be considered in selecting appropriate equipment. These include portability, genlock capability, weight, sturdiness, and costs.
NASA Lewis Research Center in Cleveland, Ohio has just completed the celebration of its 50th anniversary. `During the past 50 years, Lewis helped win World War II, made jet aircraft safer and more efficient, helped Americans land on the Moon ... and engaged in the type of fundamental research that benefits all of us in our daily lives.' As part of the center's long history, the Photographic and Printing Branch has continued to develop and meet the center's research imaging requirements. As imaging systems continue to advance and researchers more clearly understand the power of imaging, investigators are relying more and more on imaging systems to meet program objectives. Today, the Photographic and Printing Branch supports a research community of over 5,000 including advocacy for NASA Headquarters and other government agencies. Complete classified and unclassified imaging services include high- speed image acquisition, technical film and video documentaries, still imaging, and conventional and unconventional photofinishing operations. These are the foundation of the branch's modern support function. This paper provides an overview of the varied applied imaging programs managed by the Photographic and Printing Branch. Emphasis is placed on recent imaging projects including icing research, space experiments, and an on-line image archive.
The Utah Test and Training Range (UTTR) is a DoD test range in western Utah that contains 2,600 square miles of DoD withdrawn land and 17,000 square miles of usable airspace. The Range is managed by the 6501st Range Squadron at Hill Air Force Base, Utah. The squadron's Range Operations Branch, located at Dugway Proving Grounds, Utah, provides photo-optic time-space position information (TSPI) on unmanned air vehicles, low flying missiles, on-board ordnance delivery systems, and other air targets using cine/videotheodolites. The Branch also provides videometric analysis recording, long-duration high-speed film recording, multiple camera coverage of explosive propagation tests, and video documentation services. The Range's capabilities are being used to support the Cruise Missile programs (ALCM/ACM), Tactical Weapons System Evaluation Program (TAC-WSEP), air and ground launched Medium Range Unmanned Air Vehicles, rocket motor and munitions shelf life testing, and explosive hazard classification studies. The purpose of this paper is to describe, in general terms, some of the imaging equipment and techniques used on the UTTR.
Military and commercial test facilities have been tasked with increasingly sophisticated data collection and data reduction. A state-of-the-art electronic control system for high-speed 35 mm and 70 mm film cameras designed to meet these tasks is described. Data collection in today's test-range environment is difficult at best. The need for a completely integrated image and data collection system is mandated by the increasingly complex test environment. Instrumentation film cameras have been used on test ranges to capture images for decades. Their high frame rates, coupled with exceptionally high resolution, make them an essential part of any test system. In addition to documenting test events, today's camera system is required to perform many additional tasks. Data reduction to establish TSPI (time-space-position information) may be performed after a mission and is subject to all of the variables present in documenting the mission. A typical scenario would consist of multiple cameras located on tracking mounts capturing the event along with azimuth and elevation position data. Corrected data can then be reduced using each camera's time and position deltas and calculating the TSPI of the object by triangulation. An electronic camera control system designed to meet these requirements has been developed by Photo-Sonics, Inc. Feedback received from test technicians at range facilities throughout the world led Photo-Sonics to design the features of this control system. These prominent new features include a comprehensive safety-management system, full local or remote operation, frame-rate accuracy of better than 0.005 percent, and phase-locking capability to IRIG-B. In fact, IRIG-B phase-locked operation of multiple cameras can reduce the time-distance delta of a test object traveling at Mach 1 to less than one inch during data reduction.
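The two-station triangulation step can be sketched as a closest-point-between-rays solution. The station layout and target position below are invented for illustration; real reductions would also apply the per-camera time and position corrections mentioned above:

```python
import numpy as np

def ray(az_deg, el_deg):
    """Unit pointing vector from azimuth (from +x toward +y) and
    elevation angles in degrees."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    return np.array([np.cos(el) * np.cos(az),
                     np.cos(el) * np.sin(az),
                     np.sin(el)])

def triangulate(p1, az1, el1, p2, az2, el2):
    """Midpoint of the shortest segment between the two station rays."""
    d1, d2 = ray(az1, el1), ray(az2, el2)
    # Least-squares ray parameters t1, t2 for p1 + t1*d1 ~= p2 + t2*d2
    a = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    return 0.5 * ((p1 + t1 * d1) + (p2 + t2 * d2))

# Hypothetical geometry: target at (1000, 500, 300) m, stations 2 km apart
target = np.array([1000.0, 500.0, 300.0])
p1 = np.array([0.0, 0.0, 0.0])
p2 = np.array([2000.0, 0.0, 0.0])

def look_angles(p):
    v = target - p
    return (np.degrees(np.arctan2(v[1], v[0])),
            np.degrees(np.arcsin(v[2] / np.linalg.norm(v))))

est = triangulate(p1, *look_angles(p1), p2, *look_angles(p2))
print(est)   # recovers ~[1000, 500, 300]
```

With noisy angle data the two rays no longer intersect, and the length of the residual segment gives a useful per-frame quality metric.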
The continuing increased reliance on high speed shuttered color video cameras for instrumentation applications has resulted in a data synchronization discrepancy between the video data files and most other instrumentation system files. The color video systems based on the National Television Standards Committee (NTSC) system have a 59.94 Hz time base while Inter-Range Instrumentation Group (IRIG) timing based systems are typically clocked on even fractions of 60 Hz. This paper describes the definition of a new data correlation technique compatible with both NTSC and IRIG timing sources and the development of a prototype airborne encoder to demonstrate its use.
A brief background is presented leading to a list of four reasons for synchronizing video equipment to IRIG timing. The top level specifications, performance, implementation methods, and equipment compatibility are discussed.
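The size of the NTSC/IRIG discrepancy is easy to quantify: NTSC fields run at exactly 60000/1001 Hz rather than 60 Hz, so the video time base slips about 1 ms per second against an even 60 Hz IRIG clock. A short sketch:

```python
from fractions import Fraction

NTSC_FIELD_RATE = Fraction(60000, 1001)   # exact NTSC field rate, Hz

def drift_per_second():
    """Seconds of timing slip accumulated per second between NTSC
    fields and an even 60 Hz IRIG-derived clock."""
    per_field = Fraction(1001, 60000) - Fraction(1, 60)  # 1/60000 s extra
    return float(per_field * NTSC_FIELD_RATE)

print(f"{drift_per_second() * 1e3:.3f} ms of slip per second")  # 0.999
```

Because the slip is a full field period roughly every 16.7 s, simple frame counting cannot correlate the two time bases, which is what motivates tagging each field, as the encoder described above does.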
To date, free-flight spark ranges have used conventional film technology for recording the position-attitude histories of projectiles as they traverse the instrumented ranges. Film solutions provided precise position-attitude data, but not without limitations. Film requires substantial range set-up, processing, and analysis time: days or weeks could pass before a given experiment had its data reduced for interpretation. Such time delay, and its attendant inefficiencies, could be greatly reduced by using a CCD camera of proper resolution operating in conjunction with a compatible data collection system. This capability would create an electronic shadowgraph system. The purpose of this paper is to describe the camera and data collection system first tested at the free-flight range at Eglin Air Force Base in February 1992.
Many imaging applications require quantitative determination of a scene's spectral radiance. This paper describes a new system capable of near real-time spectroradiometric imagery. Operating at a full-spectrum update rate of 30 Hz, this imager can collect a 20-point spectral image from 400 nm to 700 nm, with a 10 nm bandwidth, over an image of 256 X 192 pixels. At a slightly reduced update rate of 20 Hz, 30-point spectra can be collected. A slower-update version is available with extended coverage to 900 nm. Although this full scene information is available, to make such a tremendous amount of data more manageable, internal processing electronics compute the tristimulus integrals X, Y, and Z in real time; along with standard RGB, these colorimetric integrals are available either as tristimulus values or as chromaticity coordinates x, y, and Y. To allow the imager to simulate sensors with many different spectral responses, any arbitrary response function may be loaded into the imager, including delta functions to allow single-wavelength viewing. Because some applications may require better spectral resolution than 10 nm, a separate processing section allows resolution enhancement to about 1 nm, with 1,000-point spectra available from 256 pixels throughout the scene. These enhanced spectra are available at a 0.3 Hz update rate, limited by the readout rate of the imaging array. The luminous dynamic range of the instrument is about 1 cd/m2 to 10^5 cd/m2. The unique challenges of design and calibration are described. Pixel readout rates of 30 MHz, for full-frame readout rates of 600 Hz, present the first challenge, with a processing rate of 300+ million integer operations per second presenting the second. Spatial and spectral calibration of 50,000 pixels and 2,000 spectral positions mandate novel decoupling methods to keep the required calibration memory to a reasonable size.
Large luminous dynamic range also requires care to maintain precision operation with minimum memory size.
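The tristimulus and chromaticity computation described above can be sketched as a discrete integral over the sampled spectrum. The flat spectrum and flat matching functions below are toy values standing in for real radiance data and the CIE color-matching tables:

```python
import numpy as np

def tristimulus(wavelengths, radiance, xbar, ybar, zbar):
    """Tristimulus integrals X, Y, Z over a uniformly sampled spectrum,
    plus chromaticity coordinates (x, y).  xbar/ybar/zbar must be the
    color-matching functions sampled on the same wavelength grid."""
    dw = wavelengths[1] - wavelengths[0]       # uniform spacing assumed
    X = float(np.sum(radiance * xbar) * dw)
    Y = float(np.sum(radiance * ybar) * dw)
    Z = float(np.sum(radiance * zbar) * dw)
    s = X + Y + Z
    return (X, Y, Z), (X / s, Y / s)

# 20-point spectrum from 400 to 700 nm, matching the imager's fast mode;
# flat toy values stand in for measured radiance and real CIE tables
wl = np.linspace(400.0, 700.0, 20)
flat = np.ones_like(wl)
(X, Y, Z), (x, y) = tristimulus(wl, flat, flat, flat, flat)
print(x, y)   # both ~1/3: an equal-energy spectrum maps to white
```

Loading a different response function in place of the matching functions is exactly how the imager simulates arbitrary sensors, down to a delta function for single-wavelength viewing.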
A series of experiments was conducted over the past three years to prepare NASA for the use of high-definition television. In 1989 and in 1990, HDTV technology was evaluated for potential use in launch operations, real-time image analysis, and media dissemination at the Kennedy Space Center (KSC). Evaluation of camera and lens performance is reported here. In November 1991, an experiment was done at the Johnson Space Center (JSC) to evaluate the quality of HDTV that was digitized, compressed to a 45 Mbps data stream, and transmitted through the NASA communications network. The JSC experiment consisted of back-to-back bench tests of the Alcatel/Telettra high-definition coder/decoder (codec), followed by data transmission through the NASA Shuttle communications simulator, and most importantly, actual transmission through the NASA Tracking and Data Relay Satellite System (TDRSS), with a second satellite hop through a domestic satellite and a fiber-optic link at JSC. Static and dynamic test signals were used to test codec performance, as were various types of subjective-test scenes with detail and motion. Included in the subjective material was IMAX film shot in space and transferred directly to high-definition video at 30 frames/second. Static tests highlighted the effects of the 54 MHz sampling rate in the codec. Color reproduction tests showed very little color error, even when transcoding externally from GBR signals. Dynamic test signals characterized the DCT and motion-compensation algorithm. Frame-by-frame analysis showed a small reduction in horizontal resolution, small color errors in fine detail, and reduced horizontal and vertical resolution immediately following transitions, where the effect was almost entirely masked by the transitions. Subjective codec performance on moving images at nominal TDRSS bit-error-rates (BER) was extremely good.
The codec designers have done a very good job of leaving out information that is not perceived while including almost all information that is needed. Expert viewers, trained in image analysis, gave excellent ratings to the system at typical TDRSS signal levels.
A PC-based programmable solid-state imager test station has been designed and is in the final development phases. It is designed to provide a flexible, universal, high-speed platform for evaluating different imager designs and formats, including various multiport configurations. The system provides the drive and acquisition circuitry and components to allow electro-optic characterization of imagers as a function of pixel readout rate. The data are scan-converted to RS-170 format for analysis. The system's functional capabilities and performance are presented. Examples of program code to generate three-phase clocks for an 8-port frame-transfer EEV CCD are included. A sampling of preliminary results obtained from variable-rate clocking of this imager is discussed.
A new high-speed, high-resolution electrostatic deflection vidicon camera for use with high-definition diode electron gun focus projection and scan (FPS) vidicons has been designed and partially developed. It is asynchronously resettable to permit time-phased synchronization with randomly occurring transient optical events. Programmable rasters allow selection of nonstandard variable size and speed formats for specific imaging aspect ratios and readout requirements. Readout rates as high as 625 Hz (1.6 ms field period) with 256 scan lines per field with limiting resolution >= 20 lp/mm and dynamic range >= 200:1 from Plumbicon (PbO) and Saticon (Se-As-Te) targets are presented. Charge equilibrium problems resulting in superlinear transfer characteristics from transient low intensity pulsed illumination are discussed. Camera circuitry and potential scientific and medical applications are included.
With the introduction of the Kodak Ektapro Hi-Spec Motion Analyzer, a new set of capabilities and solutions is now available for demanding applications such as vehicle impact testing, ballistics, range testing, airborne systems testing, and mining operations. No longer constrained by the environmental and technical limitations of a tape-drive recorder, the ruggedized Hi-Spec Motion Analyzer utilizes a solid-state electronic memory module which is impervious to heavy shock, vibration, extreme temperatures, and other harsh conditions. The electronic memory also provides for many new and unique recording methods that are ideal for capturing uncontrolled events. This paper focuses on the analyzer's ruggedized design features and innovative recording capabilities as they apply to specific image capture scenarios.
We are demonstrating a high-speed video camera and recorder system that captures 8-bit image data at 500 frames per second and stores 500 frames of 640 x 480 pixel image data in solid-state digital memory. The base configuration contains 160 MBytes of dynamic RAM and can be expanded to 2.5 GBytes for 8000 frames of storage. The camcorder synthesizes an RS-170 output from the digital store. This standard video output can drive a monitor for viewing or drive a video cassette recorder for archiving the stored event. The camcorder is designed for enhancement to 2000 frames per second.
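The storage figures quoted above can be sanity-checked with simple arithmetic, assuming raw uncompressed 8-bit frames with no per-frame overhead:

```python
def frames_in(memory_bytes, width=640, height=480, bits=8):
    """Number of full frames that fit in memory, assuming raw
    uncompressed storage with no per-frame overhead."""
    frame_bytes = width * height * bits // 8    # 307,200 bytes per frame
    return memory_bytes // frame_bytes

print(frames_in(160 * 2**20))   # 546: covers the stated 500 frames
print(frames_in(int(2.5e9)))    # 8138: consistent with "8000 frames"
```

The margin between the raw capacity and the stated frame counts leaves room for timing metadata and memory-controller overhead.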
A new high-speed video system, the model HSV-500, has recently been introduced to the motion-analysis market. It is a complete stand-alone motion-analysis workstation that allows the operator to capture high-speed motion and view it at a slower rate. The HSV-500 can record up to 43 minutes of video at 250 or 500 pictures per second and play it back at variable rates, displaying the imagery on high-resolution NTSC or PAL color monitors.
A high frame rate visible CCD camera capable of operation up to 200 frames per second is described. The camera produces a 256 X 256 pixel image by using one quadrant of a 512 X 512 16-port, back illuminated CCD imager. Four contiguous outputs are digitally reformatted into a correct, 256 X 256 image. This paper details the architecture and timing used for the CCD drive circuits, analog processing, and the digital reformatter.
We have developed a large-area short-pulse framing camera capable of sixteen frames and shutter times of 40 ps per frame. This is accomplished with a high-fidelity electrical circuit and an L/D = 20 microchannel plate, driven by a short-pulse (80 ps), high-amplitude electrical driver. We show the results of the work done to support this type of shutter time and discuss the difficulties associated with large-area high-speed shuttering.
Evaluating the aerodynamic design of moving objects, such as ordnance, is often done using high-speed imagery on film or videotape. The imagery must then be analyzed to determine the position and orientation of the object in each frame; from these data, the trajectory, acceleration, and other parameters of motion can be computed. In the past, the object's location was determined by human operators locating a few points or fiducial targets in each frame. A system for analyzing film images was developed based on a high-resolution digital camera and a high-performance image processing computer which, using a whole-object model, precisely locates objects in six degrees of freedom (6 DOF). Model-based image processing has been developed that substantially automates a manual three-dimensional (3-D) visualization process using two-dimensional (2-D), short-focal-length imagery. The model-based image processing algorithm has seven levels: (1) edge extraction; (2) edge modeling; (3) association of extracted edges with model components and measurement of the displacement of edge pixels from them; (4) estimation of straight lines fit to each set of edge pixels; (5) qualification of extracted displacement equations based on self- and mutual consistency; (6) estimation of object parameters and error statistics from one image; and (7) estimation of object parameters, trajectory, and error statistics from multiple viewpoints. Examples illustrate these steps of the image processing algorithms.
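Step (4) of the pipeline, fitting a straight line to each set of edge pixels, can be sketched as a total-least-squares fit via SVD. The noisy synthetic edge below is purely illustrative:

```python
import numpy as np

def fit_edge_line(points):
    """Total-least-squares line through a set of edge pixels: returns a
    unit normal n and offset d such that n . p = d for points p on the
    line.  The normal is the direction of least scatter."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # Smallest principal component of the centered scatter is the normal
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return normal, normal @ centroid

# Synthetic edge pixels scattered about the line y = 2x + 1
rng = np.random.default_rng(0)
xs = np.linspace(0.0, 10.0, 50)
pts = np.column_stack([xs, 2 * xs + 1 + 0.01 * rng.standard_normal(50)])
n, d = fit_edge_line(pts)
print(n, d)   # normal is proportional to (2, -1), up to sign
```

Total least squares is preferred over ordinary regression here because edge pixels can lie along near-vertical lines, where minimizing vertical residuals breaks down.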
A system for estimating the yaw, pitch, roll, and the three-dimensional position of a rigid body from a single camera view is described. In addition to estimating the six degrees of freedom (six DOF), the system can automatically track the six DOF in a sequence of images. The commercially available image analysis workstation hosting the six DOF algorithm and software is described briefly. An overview of the algorithm's automatic and manual tracking cycle is given. The sensor and scenario parameters influencing the estimation accuracy are discussed. An example of video-theodolite imaging is used to demonstrate the estimation error covariance model.
Previously, the mechanical analysis of biological systems has been a costly and labor intensive operation. Now, the noninvasiveness of video, combined with the inexpensive power of the personal computer, has led to a revolution in quantifying movement. Methods used to acquire kinematic and kinetic data include two-dimensional (2-D) and three-dimensional (3-D) manual and automatic point tracking, analog acquisition, and a variety of processing procedures.
Recent range requirements to furnish optical data on test programs demand remote control and highly dynamic tracking capabilities. Modern missile systems have become more dynamic and longer in range; they dispense many types of ordnance, reach higher altitudes, and are smaller in size, making optical tracking a challenge for test ranges and manual acquisition difficult for data collection. To meet this challenge, White Sands Missile Range (WSMR) has developed the Multi-Mode Automatic Tracking System (MATS), relying on a combination of remote control, automatic tracking, networking, and long auto-focus lenses.
This paper describes the applications of the Time Space Position Information (TSPI) Data Processor (TDP), developed by the Wright Laboratory Armament Directorate, Instrumentation Technology Branch for optical platform tracking systems. The TDP will provide common hardware and software for real-time Kalman filtering to be used on DoD test ranges. The design will allow for easy integration of the TDP with existing range assets.
Rapid advancements in munitions and munitions delivery systems have created test scenarios that place extraordinary requirements on data collection with traditional sensor systems. The inability of sensor and data systems to keep pace with munitions development has spurred recent developmental activity in the area of electronic and optical sensor systems with real-time data collection and processing capabilities keyed to live munitions systems testing. The Vitro Services Corporation optical tracking system (OTS) provides multiple electro-optical sensors mounted on remotely controlled, highly mobile platforms linked to data processing shelters. This paper presents an overview of the OTS subsystems designed and fabricated by Vitro Services Corporation and a sampling of test results achieved in its operational configuration.
The modern test range requires a portable, highly accurate tracking mount to provide time space and position information (TSPI) for data analysis. The manufacture of such a mount requires: a design for accuracy and producibility, a meticulous attention to error reduction, and precision measuring techniques to evaluate true accuracy.
We have developed a prototype of a fast-scanning CCD readout system to record a 1024 X 256 pixel image and transport the image to a recording station within 1 ms of the experimental event. The system is designed to have a dynamic range greater than 1000, with adequate sensitivity to read single-electron excitations of a CRT phosphor when amplified by a microchannel-plate image intensifier. This readout camera is intended for recording images from oscilloscopes and from streak and framing cameras. The sensor is a custom CCD chip, designed by LORAL Aeroneutronics. This CCD chip is designed with 16 parallel output ports to supply the necessary image transfer speed. The CCD has an interline structure to allow fast clearing of the image and on-chip fast shuttering. Special antiblooming provisions are also included. The camera is designed to be modular and to allow CCD chips of other sizes to be used with minimal reengineering of the camera head.