The development of satellites has permitted many optical experiments which hitherto were impossible. Our understanding of the world we live in has undergone a rapid and exciting development. It is my purpose in this report to discuss certain aspects of the optical radiation which would be seen by a satellite, with emphasis on what can be learned by looking at the earth itself. Illustrations will include a number of experiments carried out by members of our laboratory at the Aerospace Corporation.
None of the so-called atmospheric windows in the infrared is completely transparent, even in an atmosphere free of haze, fog, dust, or other particulate matter. Absorption by the atmospheric gases occurs because of the many very weak absorption lines within the windows and the extreme wings of very strong lines whose centers lie outside the windows. Most of the absorption lines between the visible and approximately 18 μm result from simultaneous changes in the vibrational and rotational energy levels of the absorbing molecules. At longer wavelengths most of the lines are pure rotational. A few pure rotational H2O lines appear at wavelengths as short as approximately 8 μm. (Ref. 1)
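The residual absorption described above can be sketched with the Beer-Lambert law; the absorption coefficients below are purely illustrative assumptions, not measured values, and serve only to show why even a "clean" window transmits less than 100%.

```python
import math

def transmittance(k_per_km, path_km):
    """Beer-Lambert transmittance T = exp(-k * L) for a mean absorption
    coefficient k (per km) over a slant path of length L (km)."""
    return math.exp(-k_per_km * path_km)

# Illustrative coefficients (assumed): even inside a window,
# weak lines leave k > 0, so T < 1.
T_window = transmittance(0.02, 10.0)  # residual weak-line absorption in a window
T_wing   = transmittance(1.5, 10.0)   # extreme wing of a strong line
```

Over the same 10-km path, the window value stays above 80% transmission while the strong-line wing is effectively opaque, which is the practical meaning of "not completely transparent."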
With the advent of long-duration manned space flight and the prolonged submergence of nuclear-powered submarines, research on the effects of atmospheric contaminants on human visual performance has increased. Atmospheric ions could cause changes in the efficiency of visual performance. The present paper is a systematic review of articles on the effects of atmospheric ions on visual performance. In general, exposure to certain concentrations of negative ions enhances certain aspects of visual performance, such as brightness discrimination and motion detection, while certain concentrations of positive ions degrade most aspects of visual performance. The need to control air ions in maintaining optimal environmental conditions is suggested in this paper.
The successful deployment and operation of large-aperture, diffraction-limited orbital telescopes is a major goal of the space program. Such telescopes will combine significant light-gathering power with angular resolutions and spectral coverage completely unachievable by earth-bound telescopes. To justify their costs, such telescopes must combine multimode operational capabilities of great versatility with very high component reliability, to give extended operational lifetimes. For telescopes in the 1-meter-aperture range, operational lifetimes of 4 to 5 years are essential. For telescopes of substantially larger aperture, operational lifetimes measured in decades are desired.
To study the stars through a large telescope above the earth's atmosphere has long been the dream of the astronomer. This dream is now in the process of coming true. The expectation of brilliant discoveries was supported by the results of early balloon and rocket flights, such as the unprecedented solar detail in the photographs from Stratoscope I and the discovery and analysis of X-ray sources in the sky by the NRL Sounding Rocket Program. Recently, unmanned satellites have provided astronomers with their first steady look at celestial objects from above the atmosphere. These include: the Goddard/Ball Bros. OSO series for spectroscopic analysis of solar UV; the Soviet Cosmos 215 with its eight 2.76-inch telescopes for studying young stars in both the UV and visible spectra; and now the Goddard/Grumman OAO-A2, the most sophisticated to date, with 11 telescopes ranging from 8 to 16 inches. But space astronomy will finally come into its own with large-aperture manned systems, which can provide the long life and high reliability required to justify the investment.
Photographic systems used in the Apollo program for photometric data reduction require more precise methods of photometric calibration than are used to determine the nominal operating parameters. A technique for more precise photometric calibration of cameras is described. This involves the determination of the absolute magnitude and spatial variation of illuminance in the image plane of a camera as a function of the nominal operating parameters, such as f-stop. The method involves photographing an artificial scene of known luminance which has approximately the same color temperature as the object to be photographed. This calibration must be performed at all of the various camera settings. The processed photographs are scanned by a microdensitometer to analyze the transmission efficiency and the calibration error of shutters and lenses. By use of this calibration information, a determination of scene brightness can be made from image density when using the proper sensitometric calibration (density vs. log exposure).
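The final density-to-brightness step can be sketched by inverting the straight-line portion of the sensitometric (D vs. log E) curve. The calibration constants below are hypothetical placeholders, not Apollo values:

```python
def log_exposure_from_density(density, gamma, base_fog, log_e0):
    """Invert the linear portion of the D vs. log E curve:
    D = base_fog + gamma * (log10(E) - log_e0)  =>  solve for log10(E)."""
    return log_e0 + (density - base_fog) / gamma

# Hypothetical calibration constants, for illustration only:
log_E = log_exposure_from_density(density=1.2, gamma=2.0, base_fog=0.2, log_e0=-1.0)
exposure = 10.0 ** log_E  # relative exposure; scene brightness follows after
                          # dividing out shutter time and lens transmission
```

The measured shutter and lens transmission factors from the calibration then convert relative exposure into absolute scene brightness.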
Thermal deflection tests were run on two ULE fused silica mirror blanks, one solid and one of eggcrate construction, loaned to Itek Corporation by Corning Glass Works. The tests were accomplished by radiant heating of one surface while the deflection of the other surface was observed by means of holographic interferometry. Primary test objectives were (1) the measurement of an average thermal expansion coefficient for each blank, and (2) the detection of any unsymmetrical deflection. The thermal expansion coefficient values obtained were 1.1 x 10^-7 per °C for the solid blank (30 inches in optical diameter by 4.15 inches thick) and 0.23 x 10^-7 per °C for the eggcrate (see Fig. 1). No significant deviations from a uniform spherical deflection were seen. We believe that the error in these measurements does not exceed 0.15 x 10^-7 per °C.
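The average expansion coefficient follows from the basic definition alpha = dL / (L * dT). The inputs below are assumed values chosen only to reproduce the order of magnitude reported for the solid blank; they are not the actual test data:

```python
def expansion_coefficient(delta_thickness, thickness, delta_temp_c):
    """Average linear thermal expansion coefficient: alpha = dL / (L * dT).
    delta_thickness and thickness must share the same length unit."""
    return delta_thickness / (thickness * delta_temp_c)

# Assumed inputs (inches, °C) for illustration: a 4.6-microinch growth of the
# 4.15-inch-thick blank over a 10 °C rise gives alpha near 1.1e-7 per °C.
alpha = expansion_coefficient(delta_thickness=4.6e-6, thickness=4.15,
                              delta_temp_c=10.0)
```

The holographic interferometry in the test resolves the sub-microinch surface deflections that make a coefficient this small measurable at all.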
The panel discussion covered a variety of topics, emphasizing the most frequent failure modes and the available methods for decreasing their probability. Failure of the satellite itself, which has taken the heaviest toll of space-optics experiments, is, of course, beyond the control of the experimenter. In the past, failures and degradations within the optical sensor itself have most frequently been due to the associated electronics. However, components and techniques have been greatly improved, so this will probably no longer be true in the near future.
The application of television and silver halide emulsion in the field of planetary photography from space is discussed. A mission to the planet Mars is used as an example to define those operations in which each type of imaging system provides significant advantages. Various problems associated with mission planning, target selection, coverage, and resolution are related to the data storage capability and the data acquisition rate of the recording medium. Photography of a previously poorly defined surface requires a camera system that provides contiguous coverage in order to relate new photographs to earlier maps. Cartography using these pictures can best be done when relatively few views are taken that offer simultaneously high resolution and wide angle coverage. The high resolution and immediate information storage of a film system appear particularly useful in meeting the requirements for orbital reconnaissance of the planets.
Diverse connotations of angular resolution for different types of astronomy observations present conflicting requirements to orbital astronomy system designers as well as to optical instrument designers. Interpretations are offered of the physical meaning of angular resolution, especially as it relates to linear resolution in the focal plane, for various types of astronomy observations, including detection of faint sources, imaging of extended sources, separation of close stars, spectroscopy, polarimetry, photometry, etc. Quantitative values of angular resolution requirements for each type of observation, drawn from a representative set of specific observations developed in a recent study, are discussed. From these numerical examples, the collective influence of various angular resolution criteria on the design of systems and telescopes for orbital astronomy is also discussed. Finally, the role of system tradeoff analyses in deciding between diffraction-limited and non-diffraction-limited telescopes for orbital astronomy is explained.
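The link between angular resolution and linear resolution in the focal plane can be sketched with the Rayleigh diffraction limit; the 1-m aperture and f/15 focal length below are assumptions for illustration, not values from the study:

```python
import math

def rayleigh_angle(wavelength_m, aperture_m):
    """Rayleigh diffraction limit in radians: theta = 1.22 * lambda / D."""
    return 1.22 * wavelength_m / aperture_m

def focal_plane_spot(theta_rad, focal_length_m):
    """Linear resolution element in the focal plane: x = f * theta."""
    return focal_length_m * theta_rad

theta  = rayleigh_angle(550e-9, 1.0)       # assumed: 1-m aperture, 550-nm light
arcsec = math.degrees(theta) * 3600.0      # angular resolution in arcseconds
spot   = focal_plane_spot(theta, 15.0)     # assumed f/15: 15-m focal length
```

For these assumed values the telescope resolves about 0.14 arcsec, corresponding to a focal-plane element of roughly 10 μm, which is the scale a detector or emulsion must match.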
A significant factor in the design of space and aircraft borne optical imaging systems is the image-object dynamic relationship. Image motions arise from two basic causes: apparent ground motion and camera rotation. Compensation for the effect of these motions is accomplished by controlled camera and optical rotations and image plane translations. System configuration choices and design optimizations are strongly influenced by image motion compensation requirements and implementation.
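The magnitude of the apparent-ground-motion component can be sketched with the first-order relation v_img = f * V / h; the orbital values below are assumed for illustration, not taken from the paper:

```python
def image_velocity(focal_length_m, ground_speed_mps, altitude_m):
    """Image-plane velocity due to apparent ground motion: v_img = f * V / h."""
    return focal_length_m * ground_speed_mps / altitude_m

def smear(image_velocity_mps, exposure_s):
    """Uncompensated image smear accumulated over one exposure."""
    return image_velocity_mps * exposure_s

# Assumed low-orbit values: 1-m focal length, 7.6 km/s ground speed, 185-km altitude.
v_img = image_velocity(1.0, 7600.0, 185_000.0)
blur  = smear(v_img, 0.01)  # smear over a 10-ms exposure
```

Even this simple estimate gives an image velocity of about 41 mm/s and roughly 0.4 mm of smear in 10 ms, far larger than a typical resolution element, which is why image motion compensation drives the system configuration.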
Although several spacecraft, both U.S. and Russian, have already explored some planets during flyby, e.g., Mariner and Zond, the concept of investigating planets in a systematic fashion from orbiting satellites at relatively close range is unique in the history of imaging technology. Nevertheless, NASA is already preparing the groundwork for the regular probing of planetary surfaces and atmospheres, during the 1970's and beyond, through the use of remote sensing techniques. Photographic imagers will unquestionably rank high among the sensors selected for these missions. This is so for several reasons: 1) photographic data (images) are the simplest of all to interpret and extract information from; 2) data packing density is extremely high; 3) photographic systems, electro-optical as well as photo-optical, are versatile, reliable, and well developed; and 4) photography has already proven its usefulness in Gemini, Mariner, Tiros, Lunar Orbiter, etc. This is not to imply that photography is without limitations or drawbacks; the adverse effect of certain planetary atmospheres and poor illumination on image quality is well appreciated. Still, the advantages outweigh the disadvantages. Accordingly, this paper examines a number of parameters associated with the acquisition of photographic images from planet-orbiting spacecraft, the purpose being to indicate what can be expected in the way of performance from such photographic systems and also to highlight where some of the problem areas lie.
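One of the basic performance parameters for orbital photography is the ground resolved distance, which ties orbit altitude, focal length, and film resolution together. The mission values below are assumptions for illustration, not from the paper:

```python
def ground_resolved_distance(altitude_m, focal_length_m, film_resolution_lp_per_mm):
    """Ground size of one resolved line pair: GRD = (h / f) * (1 mm / R),
    where R is film resolution in line pairs per millimeter."""
    element_on_film_m = 1e-3 / film_resolution_lp_per_mm  # one line pair, meters
    scale = altitude_m / focal_length_m                   # photographic scale
    return scale * element_on_film_m

# Assumed values: 200-km orbit, 0.5-m focal length, 100 lp/mm film.
grd = ground_resolved_distance(altitude_m=200_000.0, focal_length_m=0.5,
                               film_resolution_lp_per_mm=100.0)
```

Under these assumptions one line pair on the film corresponds to about 4 m on the ground, showing how altitude and focal length trade directly against achievable surface detail.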
The concept of the all-reflecting Schmidt was first set forth by Hendricks and Christy (1939). Recently this concept has been brought into prominence by the work of L. Epstein (1967a,b, 1969) of the Chrysler Corporation (currently with Baus Optics). In 1965 Chrysler's Space Division was responsible for the construction of an all-reflecting Schmidt camera, 6" aperture, f/4.0. The project was supervised by Epstein who later invited Karl Henize of Northwestern University to co-author a joint proposal from Northwestern University and Chrysler Corporation Space Division to NASA with the objective of conducting an ultraviolet sky-survey from a manned orbiting spacecraft. When Henize was accepted into the Scientist-Astronaut Program in 1967, one of us (Wray) was appointed to succeed him in the project at Northwestern. The present paper presents the results of our efforts in this program from 1967 to the present time.
DR. MANGOLD: Dr. Mangold considered the problems of two broad categories of space optical systems -- earth resources and astronomical observations. The large amount of data collected by these satellites favors screening of the data on board the satellites, and he suggested the use of astronaut scientists to maintain the equipment and evaluate the data in orbit. Because of the high cost of data collection there is a need for operational efficiency. There should be at least two types of astronauts: the technician or engineer, who can maintain and repair the equipment, and the scientist, who can evaluate the data. With the use of communications satellites of greatly increased bandwidth in ten years or so, all the data could be telemetered to the ground, so that the scientist evaluating the data could be located on the ground rather than in space. Dr. Mangold recommended flexibility in instruments and programs so that the scientist-astronaut could reorient the mission if necessary.