1. Bench Alignment Monitoring System Overview

The Bench Alignment Monitoring System (BAMS) was designed to detect misalignments between the optics and detectors on the High-Energy Replicated Optics to Explore the Sun (HEROES) payload, and the system was required to fit into the existing optical bench without interfering with telescope operation. Misalignments were required to be detected to a precision of 30 arc sec or better over a payload elevation range of 0 deg to 68 deg. The BAMS system as mounted on the HEROES payload is shown in Fig. 1 and consists of two CCD cameras attached to the elevation flange of the optical bench, with one camera facing the optics flange and one facing the detector flange. The flanges are the structures to which the x-ray optics and detectors are attached. The BAMS cameras have a pixel size of 4.65 µm on a sensor with a ½-in. diagonal [1]. The optics end of the payload has one light-emitting diode (LED) ring attached to the flange, as well as LED rings attached to three of the mirror modules. The detector flange has a single LED ring attached to it. The LEDs were selected to have a peak wavelength of 950 nm (to distinguish them from stray visible light from outside sources) and a relatively small diameter of 1.8 mm. This small diameter yields a correspondingly small image on the CCDs, providing good alignment position sensitivity via the joint transform correlator (JTC). The CCD cameras have 16-mm fixed focal length lenses with UV-Vis cutoff filters attached to reject wavelengths below 720 nm. The camera angular resolution in the HEROES configuration is calculated as

theta = arctan(p / f),  (1)

where p is the pixel size and f is the lens focal length. A pixel size of 4.65 µm and a focal length of 16 mm inserted into Eq. (1) yield a resolution of 59.95 arc sec/pixel (about 1 arc min/pixel) for the 6-m HEROES telescope [2]. A method for locating the centroid of the correlation signal (described later) provides reliable subpixel resolution. The LED rings shown in Fig.
2 were fabricated in-house at Marshall Space Flight Center (MSFC) using a three-dimensional (3-D) printer. The LEDs were arranged in an asymmetric, nonrepeating pattern on the rings to allow the BAMS software to determine the rotational alignment of each ring. The LEDs were powered so that a ring continues to operate even if one (or more) of its LEDs fails. The current to the LED rings was regulated, since a fluctuating current causes the intensity of the LEDs (and the resulting image size on the CCD) to vary. This variation in intensity results in a changing spot size that can adversely affect the precision of the misalignment determination, because the correlation is not scale invariant. An example of this is shown in Table 1, where only the voltage was increased for one of the LED rings. As expected, a voltage increase causes the error in the recorded position to increase, for the reasons just mentioned.

Table 1. Change in misalignment information with increasing voltage.
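The plate scale given by Eq. (1) in Sec. 1 is easy to verify numerically. The sketch below (function and variable names are our own, not from the flight software) also converts one pixel into a displacement at the 6-m bench length:

```python
import math

def angular_resolution_arcsec(pixel_size_m: float, focal_length_m: float) -> float:
    """Eq. (1): angular size of one pixel, theta = arctan(p / f), in arc sec."""
    return math.degrees(math.atan(pixel_size_m / focal_length_m)) * 3600.0

theta = angular_resolution_arcsec(4.65e-6, 16e-3)   # ~59.95 arc sec/pixel
# Displacement spanned by one pixel over the 6-m HEROES optical bench:
mm_per_pixel = 6.0 * math.tan(math.radians(theta / 3600.0)) * 1000.0  # ~1.74 mm
```

The subpixel centroiding described later pushes the effective sensitivity well below this single-pixel figure.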
The onboard HEROES computer records an image from the BAMS cameras every 3 s. This cadence was not a requirement but was chosen to prevent the hard drive onboard the payload from running out of space. These images are processed by the BAMS software after the HEROES flight to determine if any movement of the LED rings occurred. Real-time operation (with real-time alignment correction) is possible but has not yet been implemented due to challenges presented by the additional onboard processing requirements. Figure 3 is an image of the optics flange taken during the HEROES flight with all four LED rings operational. Each of the LED rings was clocked to a different orientation to allow for differentiation. This ensures that BAMS is able to track any specific ring without interference from the other rings in the camera's field of view.

2. Joint Transform Correlator Background

The full complex (amplitude and phase) digital optical correlator takes advantage of a number of mathematical operations, most importantly the Fourier transform. The total information contained in the original function (scene) is still contained in the transformed version; it is simply redistributed, and it is this redistribution that allows a more efficient interrogation by another function through a simple multiplication. In this case, the multiplication is complex, containing real and imaginary terms representing the individual amplitude and phase components. In the purely optical JTC, both the known and test scenes are displayed in the same plane and encoded in coherent light, usually with the use of a spatial light modulator (SLM). A single lens Fourier transforms the sum of the two scenes. This coherent sum of the transforms is detected (magnitude squared) by a camera, and the image is then displayed on a second SLM that is also illuminated with coherent light.
After being Fourier transformed by another lens, two identical diffraction orders result, and each corresponds to the correlation between the two scenes [3]. The BAMS software performs the Fourier transforms digitally, making the present JTC implementation more rugged and controllable. The JTC preserves the translation-invariant property of the Fourier transform, but the architecture has an inherent high sensitivity to other factors, such as scale and rotation [4]. The fact that the JTC is not rotation invariant is actually useful for detecting the rotational alignment of the test image with respect to the reference. The JTC also has a strong dependence on the illumination of (or light emitted by) the test object and the reference object. Relative to the on-axis autocorrelation terms, the joint transform correlation signal (C) varies with the intensities of the test scene (I_t) and reference scene (I_r) approximately as

C ∝ 2 sqrt(I_t I_r) / (I_t + I_r),  (2)

which illustrates the importance of maintaining the ratio I_t/I_r as close to unity as possible [5]. In the digital implementation presented here, the intensity (pixel values) of the reference (template) can be adjusted to closely match that of the live scene (target) captured by the camera. Another factor that can affect the performance of the JTC is an inherent nonlinearity in the system. For example, if the test image wanders far off-axis, an error will be present (analogous to curvature of field), a symptom of the CCD sensor surface being flat rather than curved. With only small deviations being observed in the HEROES system, the error from this nonlinearity is minimal [6].

3. Bench Alignment Monitoring System Software Overview

The BAMS program was developed to digitally mimic the optical JTC technique described in Sec. 2. A screen shot of the BAMS graphical user interface (GUI) is shown in Fig. 4. The black area is the area of interest (AOI) and contains the optical target image (TI), which the user can also choose as a template (the reference image).
The JTC uses the template and the TI to determine relative misalignment. The user can manually select a different template in the software and display the relative displacements of the TI in both arc minutes and millimeters. A history file is used so that the movement of the TI can be recorded. The JTC software includes, besides two fast Fourier transforms, other pixel-level processes that are all well suited to the parallel processing capabilities offered by an NVIDIA graphics processing unit (GPU) [7]. This parallel processing capability drastically decreases the processing time for each image. A flowchart of the correlation operation as implemented in the GPU is shown in Fig. 5 and is described below. A static copy of the reference image of interest (the template) is introduced into the GPU, rotated by a dynamically specified, user-controlled amount, and then merged into the current camera scene containing the TI by means of a logical "OR" operation. The resulting pair of images is Fourier transformed using off-the-shelf software: NVIDIA's GPU FFT library, which follows the interface of the fastest Fourier transform in the West (FFTW) [8]. The results of the transform are rearranged so that the low spatial frequencies are in the center of the transform rather than at the corners, where the FFT leaves them. The results of a typical first transform can be seen in the topmost image in Fig. 5. In this image, interference fringes (or rather their digital analog) can be seen. The result of the first (joint) transform is then Fourier transformed again, and the result of this second transform is likewise rearranged so that the low spatial frequencies are in the center.
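The two-transform sequence just described can be sketched digitally (a minimal NumPy illustration of the JTC principle, not the CUDA flight code; all names here are our own):

```python
import numpy as np

def jtc_correlation(scene: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Digital JTC: transform the merged input plane, detect the joint
    power spectrum (magnitude squared), then transform again."""
    joint = scene + template                          # merged input plane (the flight code ORs binary images)
    jps = np.abs(np.fft.fft2(joint)) ** 2             # first transform, detected
    corr = np.abs(np.fft.fftshift(np.fft.fft2(jps)))  # second transform, DC moved to center
    return corr

# Toy example: one bright spot each in the "camera scene" and the "template".
n = 64
scene = np.zeros((n, n)); scene[20, 20] = 1.0
template = np.zeros((n, n)); template[28, 30] = 1.0
corr = jtc_correlation(scene, template)
corr[n // 2, n // 2] = 0.0                            # suppress the on-axis (autocorrelation) term
peak = np.unravel_index(np.argmax(corr), corr.shape)
# The off-axis cross-correlation peak sits at the center offset by the
# scene-template separation: here |offset| = (8, 10) pixels.
offset = (peak[0] - n // 2, peak[1] - n // 2)
```

As the two spots approach coincidence, the off-axis peak moves toward the center of the second transform, which is the behavior BAMS exploits for translational alignment.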
The reformatted result of the second transform consists of three groups of bright pixels (typically the brightest pixels in the transform) surrounded by pixels of lower amplitude, such as those shown in the lower image in Fig. 5. One of the two identical (off-axis) cross correlations, after suitable filtering, is used to determine the alignment of the two scenes. The unused on-axis (DC) bright pixels correspond to the autocorrelations. A 3-D map of a typical cross correlation is shown in Fig. 6. The BAMS software uses the correlation procedure described above for both angular and linear alignment. When the template and the TI approach angular alignment, the grayscale value of the correlation peak increases. This approach to angular alignment works because only scale and rotation can decrease the signal, and here the scale is fixed. The position of the off-axis correlation peak gives the relative separation between the template (reference) and the target: as the distance between the two goes to zero, the correlation peak moves to the center of the second transform. The flow diagram for this process is shown in Fig. 7. The BAMS program is designed to measure differential deviations referenced to an initial alignment. When the BAMS program is initiated, an initial pass searches for the TI within the camera frame, sampling overlapping areas of 512 pixels square; this initial pass searches the entire CCD array for the target. For each sampling region, the correlation operation is executed for a series of rotation angle steps, and the rotation step having the brightest pixel is saved. The initial tracking function thus determines both the 512-pixel-square portion of the camera image containing the TI and the user-determined coarse rotation angle. Once the location and the coarse rotation angle of the TI are known, BAMS performs a fine-angle alignment, rotating the template in 0.1-deg steps.
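The coarse search over overlapping 512-pixel-square regions can be sketched as follows. This is a simplified illustration using a plain FFT cross-correlation in place of the full JTC and omitting the rotation loop; the function name, the step size, and the toy data are our own, not from the flight software:

```python
import numpy as np

def find_target_tile(frame: np.ndarray, template: np.ndarray,
                     tile: int = 512, step: int = 256):
    """Scan overlapping tile x tile windows of the camera frame and return
    the (row, col) corner of the window whose correlation peak is brightest."""
    th, tw = template.shape
    best, best_peak = None, -1.0
    for r in range(0, frame.shape[0] - tile + 1, step):
        for c in range(0, frame.shape[1] - tile + 1, step):
            win = frame[r:r + tile, c:c + tile]
            pad = np.zeros_like(win)
            pad[:th, :tw] = template
            # circular cross-correlation of the window with the template via the FFT
            corr = np.abs(np.fft.ifft2(np.fft.fft2(win) * np.conj(np.fft.fft2(pad))))
            if corr.max() > best_peak:
                best_peak, best = corr.max(), (r, c)
    return best

# Toy frame with a small bright target; the search returns a window containing it.
frame = np.zeros((1024, 1024))
frame[700:703, 100:103] = 1.0
best = find_target_tile(frame, np.ones((3, 3)))
```

In the flight software, each window would additionally be correlated at a series of template rotation angles, with the brightest result kept.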
The change in pixel amplitude as the template is rotated in these small steps yields a TI angle with a precision on the order of the 0.1-deg step size. Translational alignment is performed by moving the AOI to obtain a precise separation between the template and the TI in both the x- and y-directions. The algorithm does not attempt to superimpose the template onto the TI. The system is unable to determine whether the TI is to the right or the left of the template, so two executions of the program are required to determine the polarity of the separation in the x-direction. The offset between the template and the TI in the y-direction is also determined at this time. Finally, the JTC operation is executed once more, and the centroid of the correlation function is determined. The centroid positions x_c and y_c are calculated using the following equations:

x_c = Σ x G(x, y) / Σ G(x, y),  y_c = Σ y G(x, y) / Σ G(x, y),  (3)

where G(x, y) is the grayscale value of the pixel at the corresponding x and y positions on the CCD [9]. The correlation signal in the second transform extends over several pixels, so the computed centroid of the signal yields a "center of mass" pixel position. The results of the second transform are squared, so the central peaks contribute much more to the centroid, and small fluctuations do not significantly shift the reported position. This position is reported not in integer pixels but in fractions of a pixel, which is the subpixel resolution referred to earlier. By comparing the rotation angle and correlation centroid position with the rotation angle and correlation centroid saved as a baseline, a precise deviation in both angle and position of the TI is obtained.

4. Experimental Data

The BAMS software was validated experimentally in a controlled laboratory environment prior to the HEROES launch. A CCD camera (the same model used on the HEROES payload) was mounted on an optical mount at a fixed distance from an LED ring.
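The centroiding of Eq. (3), including the squaring of the second-transform values mentioned above, can be sketched as follows (illustrative names, not the flight code):

```python
import numpy as np

def correlation_centroid(g: np.ndarray) -> tuple:
    """Intensity-weighted centroid of a correlation patch, Eq. (3).
    The grayscale values are squared first so the central peak dominates,
    yielding a subpixel "center of mass" position (x_c, y_c)."""
    w = g.astype(float) ** 2
    ys, xs = np.indices(w.shape)
    total = w.sum()
    return float((xs * w).sum() / total), float((ys * w).sum() / total)

# A peak spread equally over two adjacent pixels lands the centroid between them:
patch = np.zeros((5, 5))
patch[2, 3] = patch[2, 4] = 7.0
xc, yc = correlation_centroid(patch)   # xc = 3.5, yc = 2.0
```

This fractional-pixel position, compared against the baseline centroid, is what allows BAMS to beat the 59.95 arc sec/pixel plate scale.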
The ring was moved using a Newport ESP301 motion controller with a motorized linear actuator, in steps of 0.773 mm, or 1 arc min. Table 2 shows the BAMS software output for each ring position and the error in the measured position. The standard deviation in position from the BAMS software was 0.0177 arc min. This uncertainty includes systematic errors in the actuator and in the distance measurement from the camera to the LED ring, as well as errors associated with LED stability and the centroiding calculations.

Table 2. Laboratory test of the BAMS architecture and software: translational alignment.
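The camera-to-ring distance is not stated in this copy, but it is implied by the calibration step size (one 0.773-mm step subtending 1 arc min); the short check below recovers the implied value:

```python
import math

step_m = 0.773e-3                          # linear actuator step size
step_rad = math.radians(1.0 / 60.0)        # the 1 arc min that step subtends
distance_m = step_m / math.tan(step_rad)   # implied camera-to-ring distance, ~2.66 m
```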
Following multiple laboratory benchtop tests, BAMS was installed on the HEROES payload and tested during a hang test in Ft. Sumner, New Mexico. During the hang test, the HEROES payload was suspended by a crane in a high bay so that a controlled test could be performed to ensure that all components of the payload were operating correctly prior to launch. BAMS successfully tracked the changes in alignment due to gravitational sag in the 6-m-long optical bench as a function of elevation angle. A photograph of HEROES during the hang test is shown in Fig. 8 with the components of the HEROES payload labeled. BAMS results from the hang test are plotted in Fig. 9. The displacements are from a nominal starting position for the optics flange as the payload was cycled through the full range of elevation angles (0 deg to 68 deg) three times. Modeling the effects of gravitational sag on the correlation for the complex HEROES mechanical system will be done prior to the next flight. Figure 9 shows the x- and y-axis displacements plotted as a function of the payload elevation. With any large optical system, there are residual stresses in the mechanical components that can affect performance until they are relieved. This can be seen in one of the axes as the payload was cycled through the full elevation range: there is an upward trend in the data as these stresses are relieved. Although the payload was only moved in elevation, there is also movement in the orthogonal axis, which could be due to a misalignment.

5. Conclusion

An alignment monitoring system using a digital JTC architecture has been developed and initially tested for the purpose of obtaining flight payload alignment information. BAMS is able to detect misalignment of multiple components in a flight payload by continuously observing rings of LEDs attached to each item to be monitored. The system was required to obtain alignment information with a precision of 30 arc sec or better.
BAMS performed substantially better than this by centroiding the second-transform correlation signal to provide subpixel resolution. The translational invariance of the JTC signal allowed the relative translational movement of each LED ring to be determined. The correlation is not rotationally invariant, however, and this was used to advantage: the gradual decay in signal with rotation allowed the rotational alignment of the LED rings to be determined as well. The JTC is also inherently not scale invariant, but in the present application the scale is fixed. Experimental data show that the BAMS system as configured is currently able to produce translational alignment information with a resolution of 1.06 arc sec. Analysis of actual flight data has begun, and those results will be reported in a forthcoming paper.

References

1. The Imaging Source, "DMK 41BU02," http://www.theimagingsource.com/en_US/products/cameras/usb-ccd-mono/dmk41bu02/ (accessed August 2016).
2. J. L. Jenkins, The Sun and How to Observe It, p. 130, Springer Science, New York (2009).
3. J. W. Goodman, Introduction to Fourier Optics, pp. 239–247, Roberts and Company Publishers, Greenwood Village (2007).
4. D. Mendlovic, E. Marom and N. Konforti, "Complex reference-invariant joint-transform correlator," Opt. Lett. 15, 1224–1226 (1990). http://dx.doi.org/10.1364/OL.15.001224
5. D. A. Gregory, J. A. Loudin and F. T. S. Yu, "Illumination dependence of the joint transform correlation," Appl. Opt. 28, 3288–3290 (1989). http://dx.doi.org/10.1364/AO.28.003288
6. D. A. Gregory, J. C. Kirsch and J. L. Johnson, "Optical correlator tracking nonlinearity," Appl. Opt. 26, 192–194 (1987). http://dx.doi.org/10.1364/AO.26.000192
7. B. Daga, A. Bhute and A. Ghatol, "Implementation of parallel image processing using NVIDIA GPU framework," Adv. Comput. Commun. Control 125, 457–464 (2011). http://dx.doi.org/10.1007/978-3-642-18440-6
8. M. Frigo and S. G. Johnson, "The design and implementation of FFTW3," Proc. IEEE 93(2), 216–231 (2005). http://dx.doi.org/10.1109/JPROC.2004.840301
9. C. Zhai et al., "Micro-pixel accuracy centroid displacement estimation and detector calibration," Proc. R. Soc. A 467(2136), 3550–3569 (2011). http://dx.doi.org/10.1098/rspa.2011.0255
Biography

Tomasz Lis earned his BS degree in physics from the University of Alabama in Huntsville (UAH) in 2012 and is currently working on his PhD with a specialization in optics. He has been enrolled in the NASA MSFC Pathways program since 2012. His dissertation will involve the HEROES Alignment Monitoring System development.

Jessica Gaskin is a physicist in the x-ray astronomy group at NASA Marshall Space Flight Center, Huntsville, Alabama. She has supported the High-Energy Replicated Optics (HERO) hard x-ray balloon-borne telescope and was the co-principal investigator of HEROES. She is the study scientist leading a team of researchers on a mission concept design for X-ray Surveyor. She has a PhD in physics from UAH.

John Jasper has 57 years of experience in all phases of software development, integration, and testing. He has designed and implemented radar-tracking software, data-link processing software, and distributed air defense system architectures. Most recently, he designed and coded embedded (PC104) applications providing compression and remote display of forward-looking infrared imagery in support of the Chaparral air defense system. He supports the software development for the Alignment Monitoring System for HEROES.

Don A. Gregory is a professor of physics at UAH with more than 100 refereed open-literature technical publications in internationally circulated journals, in fields ranging from basic physics and optics to materials and advanced propulsion. He holds 13 US patents and has been a recipient of the Department of the Army Research and Development Award. He supports the Alignment Monitoring System development for HEROES.