This PDF file contains the front matter associated with SPIE Proceedings Volume 7071, including the Title Page, Copyright information, Table of Contents, Introduction (if any), and the Conference Committee listing.
The Hubble Space Telescope is by far the most successful scientific spacecraft ever launched. According to the Science News metric, Hubble has produced 25% of NASA's scientific return since its launch in 1990, or roughly 1.5% of worldwide discovery in all fields of science, from archeology to biomedicine to zoology. Yet despite this outstanding success, Hubble has also been expensive, totaling $15B in FY09 dollars. There are many lessons to be learned from Hubble's history concerning how to execute scientific missions more cost effectively, and fortunately some of those lessons have already been demonstrated on programs such as the Chandra X-ray Observatory, while others are being carried forward in the spacecraft of today.
The Cosmic Hot Interstellar Plasma Spectrometer (CHIPS) observatory launched on 12 January 2003 and was the first and only successful GSFC UNEX (NASA Goddard Space Flight Center University Explorer class) mission. The UNEX program was conceived by the National Aeronautics and Space Administration (NASA) as a new class of Explorer mission charged with demonstrating that significant science and/or technology experiments can be performed by small satellites with constrained budgets and a limited schedule. The purpose of the observatory was to examine details of the Local Bubble's thermal pressure, spatial distribution, and ionization history. The observatory was also used to observe solar spectra, both scattered from the lunar surface and via a fortuitous second-order scattering path. CHIPS confirmed that spectral features within the 90–260 Å band were much dimmer than contemporary theories predicted, and it operated four years beyond its design lifetime before being placed in an extended safe-hold mode in April 2008 for budgetary reasons. The spectrometer consisted of six spectrograph channels that delivered better-than-λ/100 resolution spectra to a single detector. The cost constraints of UNEX led to a design based on a traditional aluminum structure and an instrument with a large field of view (5° × 26°). All optical and optomechanical systems on the spectrometer performed flawlessly on orbit. We discuss the challenges, difficulties, and lessons learned during the design, fabrication, and execution stages of the mission.
Although many designs are evaluated in the design stage by examination of their MTFs, three-bar resolution is often used to determine resolution in practice. In certain applications, the measured three-bar resolution differs greatly from the resolution obtained by crossing the MTF curve with a threshold curve. In this presentation, we examine the conditions under which this can occur.
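The MTF-threshold comparison the abstract refers to can be illustrated with a small numerical sketch. The code below is not from the paper: it assumes a diffraction-limited circular-aperture MTF and a flat 10% detection threshold (both assumptions for illustration), and finds the limiting spatial frequency where the MTF first drops below the threshold. A real analysis would use the measured MTF of the system and a detector- or observer-specific threshold curve.

```python
import numpy as np

def diffraction_mtf(nu):
    """Diffraction-limited MTF of a circular aperture.

    nu is spatial frequency normalized to the cutoff frequency (0..1).
    """
    nu = np.clip(nu, 0.0, 1.0)
    return (2.0 / np.pi) * (np.arccos(nu) - nu * np.sqrt(1.0 - nu**2))

def limiting_frequency(mtf_curve, threshold_curve, nu):
    """Lowest normalized frequency where the MTF falls below the threshold."""
    below = mtf_curve < threshold_curve
    return nu[np.argmax(below)]  # argmax returns the first True index

nu = np.linspace(0.0, 1.0, 1001)
mtf = diffraction_mtf(nu)
threshold = np.full_like(nu, 0.1)  # assumed flat 10% threshold
nu_limit = limiting_frequency(mtf, threshold, nu)
```

For this idealized case the crossing lands near 0.8 of the cutoff frequency; a measured three-bar resolution can sit well away from such a crossing, which is the discrepancy the paper examines.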
Several examples are given of optics apparently specified only by figure and finish. Although these optics met their specifications, they did not produce good images. The presumed reason for the poor performance was the lack of a specification for mid-spatial-frequency roughness. We show that a reasonable specification can be applied using the concept of a structure function, a mathematically simple function easily calculated from interferometric phase data at each pixel. An example wavefront is used to show how the specification can be developed from typical figure and finish specifications and extended to include information about roughness in the mid-spatial-frequency region.
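The structure function mentioned above is indeed simple to compute from a pixel-level phase map. The sketch below is not the paper's code: it estimates the one-dimensional structure function D(r) = ⟨[φ(x + r) − φ(x)]²⟩ from a 2-D wavefront array, averaging only over horizontal pixel separations for brevity (a full treatment would average over all pair orientations).

```python
import numpy as np

def structure_function(phase, max_sep=None):
    """Estimate the 1-D structure function D(r) of a wavefront map.

    D(r) = <[phi(x + r) - phi(x)]^2>, averaged here over horizontal
    pixel separations r = 1..max_sep. `phase` is a 2-D array of phase
    values (e.g. from interferometric data); NaNs mark invalid pixels.
    """
    ny, nx = phase.shape
    if max_sep is None:
        max_sep = nx // 2
    d = np.empty(max_sep)
    for r in range(1, max_sep + 1):
        diff = phase[:, r:] - phase[:, :-r]  # all horizontal pairs at separation r
        d[r - 1] = np.nanmean(diff**2)
    return d

# Synthetic check: for a pure tilt phi = a*x, D(r) = (a*r)^2 exactly.
a = 0.01
wf = np.tile(a * np.arange(64), (64, 1))
d = structure_function(wf, max_sep=10)
```

Comparing such a computed D(r) against an allowed envelope derived from the figure and finish specifications is one way a mid-spatial-frequency requirement could be stated.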
A much-abbreviated, colloquial collection of vignettes and true take-aways based on embellished fact. Names and situations have been intentionally changed to make a point, to avoid liability, and to protect the not-so-innocent.
"Rondo: an instrumental composition typically with a refrain recurring four times in the tonic and with three couplets in
contrasting keys." --G. & C. Merriam Co. New York 1973
The composition will be played on three instruments, a cryogenic space surveillance sensor, a document scanner and an
optical image correlator. The performer will take the liberty of including an introduction and a coda. After the first
couplet the listeners may sing along with the performer.
Due to their scale, operating environment, and required levels of operating precision, the design of the next generation of
space-based observatories will necessarily place an ever-greater reliance on numerical simulation. Since it will be
impossible to fully ground-test such systems prior to flight, system-level confidence must come, in large part, from
correlated subsystem tests, system-level simulation, and an overall design understanding based on quantification of
margins and uncertainties, sensitivity analyses, parameter variation studies, and design optimization. Further challenges
will arise from the actively controlled nature of such systems, which requires fundamentally integrated thermal,
structural, optical, and controls models. In this paper we will discuss Cielo, JPL's multidisciplinary, high-capability
compute platform for systems analysis, and describe some of the challenges in demonstrating these capabilities for the
first time on a complex model, the Space Interferometry Mission's Thermal-Structural-Optical (SIM-TOM3) testbed.
The successes and lessons learned from these activities have the potential to greatly influence subsequent test programs,
leading to greater design understanding, improved mission confidence, and significant cost and schedule reductions.
The development of space-borne electro-optical (EO) sensors for both NASA and the DoD currently overruns initial cost and schedule estimates by 2x to 3x, in an environment where Congressional review, and even cancellation, of these programs is triggered by more modest overruns of 20% to 30%. A substantial improvement in development cycle time for these systems is needed while retaining adequate sensor performance with high reliability. Concurrent engineering practices promise significant reductions in project cost and delivery time.
The term "fuzzy metrology" is almost as misleading as the term "fuzzy logic." Many people simply avert their eyes from
fuzzy logic, because they know (correctly) that good logic is not fuzzy but rigorous. But, neither the measured result nor
the logic itself is fuzzy in any conventional sense. Many sets are fuzzy, so they must be handled not by crisp logic but by
fuzzy logic. Fuzzy logic itself is quite rigorous. Likewise, fuzzy metrology uses the mechanisms of fuzzy logic to arrive
at precise measurements of physical things. In itself it is rigorous. You avert your eyes at your peril. After a general
discussion of the basics of fuzzy logic, I show a number of examples that allow the user even better understanding of
when fuzzy metrology might be superior to conventional crisp metrology.
In September 1959, Theodore Maiman attended the first International Quantum Electronics Conference to
present a paper describing an exceptionally compact microwave-emitting ruby maser he had developed at the
Hughes Research Laboratories. On May 16, 1960, he succeeded in demonstrating the first working laser, also
using ruby, a historic breakthrough that stunned others trying to develop a working laser. Maiman's success,
described in my book Beam: The Race to Make the Laser (Oxford, 2005), teaches some important lessons about
taking on challenging optical tasks.