Missions to directly detect exoplanets with a coronagraph, such as the potential Habitable Exoplanet (HabEx) mission, require an optical telescope with extreme wavefront stability. The key systems-engineering question is: ‘How stable?’ The poetic answer is 10 picometers per 10 minutes. But what is the actual spatial distribution of this error allocation? This paper defines a science-driven systems-engineering process to derive the telescope’s wavefront-stability error-budget specification from the coronagraph’s performance; reviews previous studies of coronagraph sensitivity to wavefront-error instability; and demonstrates the method by comparing the performance of telescope/coronagraph combinations.
Direct detection and characterization of extrasolar planets has become possible with powerful new coronagraphs on ground-based telescopes. Space telescopes with active optics and coronagraphs will expand the frontier to imaging Earth-sized planets in the habitable zones of nearby Sun-like stars. Currently, NASA is studying potential space missions to detect and characterize such planets, which are dimmer than their host stars by a factor of 10^10. One approach is to use a star-shade occulter. Another is to use an internal coronagraph. The advantages of a coronagraph are its greater targeting versatility and higher technology readiness, but one disadvantage is its need for an ultrastable wavefront when operated open-loop. Achieving this requires a systems-engineering approach, which specifies and designs the telescope and coronagraph as an integrated system. We describe a systems engineering process for deriving a wavefront stability error budget for any potential telescope/coronagraph combination. The first step is to calculate a given coronagraph’s basic performance metrics, such as contrast. The second step is to calculate the sensitivity of that coronagraph’s performance to its telescope’s wavefront stability. The utility of the method is demonstrated by intercomparing the ability of several monolithic and segmented telescope and coronagraph combinations to detect an exo-Earth at 10 pc.
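The second step above, computing the sensitivity of coronagraph performance to wavefront stability, can be sketched with a toy model. Near a well-corrected operating point, contrast degradation is often approximated as a quadratic form in the per-mode wavefront drift amplitudes; the sensitivity coefficients below are illustrative placeholders, not values from any real coronagraph study.

```python
import numpy as np

# Toy second-order sensitivity model (an illustration, not any mission's
# actual error budget): near a well-corrected operating point, the change
# in raw contrast is modeled as a quadratic form in the per-mode wavefront
# drift amplitudes a_k (in pm RMS):
#     delta_C ~= sum_k s_k * a_k**2
# The sensitivity coefficients s_k are hypothetical placeholders.

def contrast_degradation(amplitudes_pm, sensitivities):
    """Predicted raw-contrast change from per-mode wavefront drifts (pm RMS)."""
    a = np.asarray(amplitudes_pm, dtype=float)
    s = np.asarray(sensitivities, dtype=float)
    return float(np.sum(s * a**2))

# Example: three low-order modes with made-up sensitivities (contrast per pm^2)
sens = [2e-13, 5e-13, 1e-12]
drift = [10.0, 5.0, 2.0]   # pm RMS drift per mode over one observation
delta_c = contrast_degradation(drift, sens)
```

Given a science requirement on the allowable contrast change, an error budget inverts this relation to allocate allowable drift amplitudes across modes.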
The first exoplanets detected were "hot Jupiters": large Jupiter-like planets in close orbits around their host stars. The stars in these so-called "hot Jupiter systems" can have significant X-ray emission, and the X-ray flux likely changes the evolution of the overall star-planet system in at least two ways: (1) the intense high-energy flux alters the structure of the upper atmosphere of the planet, in some cases leading to significant mass loss; (2) the angular momentum and magnetic field of the planet induce even more activity on the star, enhancing its X-rays, which are then subsequently absorbed by the planet. If the alignment of the system is appropriate, the planet will transit the host star. The resulting drop in flux from the star allows us to measure the distribution of the low-density planetary atmosphere. We describe a science mission concept for a SmallSat Exosphere Explorer of hot Jupiters (SEEJ; pronounced "siege"). SEEJ will monitor the X-ray emission of nearby X-ray bright stars with transiting hot Jupiters in order to measure the lowest-density portion of exoplanet atmospheres and the coronae of the exoplanet hosts. SEEJ will use revolutionary Miniature X-ray Optics (MiXO) and CMOS X-ray detectors to obtain sufficient collecting area and high sensitivity in a low-mass, small-volume, low-cost package. SEEJ will observe scores of transits occurring on select systems to make detailed measurements of the transit depth and shape, which can be compared to out-of-transit behavior of the target system. The depth and duration of the flux change will allow us to characterize the exospheres of multiple hot Jupiters in a single year. In addition, the long baselines (covering multiple stellar rotation periods) from the transit data will allow us to characterize the temperature, flux and flare rates of the exoplanet hosts at an unprecedented level. This, in turn, will provide valuable constraints for models of atmospheric loss.
In this contribution we outline the science of SEEJ and focus on the enabling technologies: Miniature X-ray Optics and CMOS X-ray detectors.
The Habitable Exoplanet Observatory Mission (HabEx) is one of four missions under study for the 2020 Astrophysics Decadal Survey. Its goal is to directly image and spectroscopically characterize planetary systems in the habitable zone around nearby Sun-like stars. Additionally, HabEx will perform a broad range of general astrophysics science enabled by its 100 to 2500 nm spectral range and 3 x 3 arc-minute FOV. To achieve its exoplanet science goals, HabEx is baselining both an internal coronagraph and a star-shade. But an internal coronagraph requires an ultra-stable wavefront. Achieving this stability imposes never-before-required performance specifications upon the telescope and requires a new approach to systems engineering: the telescope and coronagraph must be specified and designed as an integrated system. This paper describes a two-step systems engineering process that can be applied to any potential telescope/coronagraph combination. The first step is to determine the coronagraph’s performance metrics of core throughput, raw contrast and stability of raw contrast. The second step is to calculate the sensitivity of the coronagraph’s performance metrics to its telescope’s optical performance (e.g. wavefront stability). To illustrate the process, four representative architectures are evaluated: two vector-vortex and a hybrid Lyot coronagraph in combination with an off-axis monolithic telescope; and an apodized pupil Lyot coronagraph with an on-axis hexagonal-segment telescope (similar to the Webb Telescope aperture or the potential Large UV/Optical/IR Surveyor (LUVOIR) decadal mission).
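The first metric above, core throughput, has a simple closed form in the idealized case of an unobscured circular aperture with no coronagraph losses: the encircled energy of an Airy pattern. The sketch below uses the classical formula EE(r) = 1 - J0(pi r)^2 - J1(pi r)^2 (r in lambda/D); the 0.7 lambda/D core radius is a common convention, not a HabEx-specific value.

```python
import numpy as np
from scipy.special import j0, j1

# Encircled energy of an ideal Airy PSF (unobscured circular aperture),
# as an upper bound / sanity check on core throughput. Real core throughput
# is lower because the coronagraph masks absorb and diffract planet light.

def airy_encircled_energy(r_lod):
    """Fraction of Airy-PSF energy within radius r, with r in lambda/D units."""
    x = np.pi * np.asarray(r_lod, dtype=float)
    return 1.0 - j0(x)**2 - j1(x)**2

core_fraction = airy_encircled_energy(0.7)  # energy inside a 0.7 lambda/D core
```

For a real design, this analytic bound is replaced by numerical propagation of an off-axis source through the coronagraph masks.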
Direct imaging of potentially habitable planets is challenging because of the relative proximity of the planet to the star and the low flux ratio (typically well under 1e-9 in the visible) of the planet relative to the star. Future exoplanet direct imaging telescopes like the Habitable Exoplanet Imaging Mission (HabEx) or the Large UV/Optical/Infrared Surveyor (LUVOIR) will hence require large collecting apertures with very low wavefront errors. The feasibility of these missions is in large part dependent on the sensitivity of the achieved contrast at small working angles to imperfections and motions of the telescope optics. In past studies, we explored the effect of applying specific modes to segmented and monolithic telescopes on the contrast leakage of a coronagraph. Here we present a revised analysis which, though not substantially different from the previous results, includes a more careful theoretical examination of the issues involved. We conclude by highlighting the importance of the temporal characteristics of the errors.
“Are we alone in the Universe?” is probably the most compelling science question of our generation. To answer it requires a large aperture telescope with extreme wavefront stability. To image and characterize Earth-like planets requires the ability to block 10^10 of the host star’s light with 10^-11 stability. For an internal coronagraph, this requires correcting wavefront errors and keeping that correction stable to a few picometers rms for the duration of the science observation. This requirement places severe specifications upon the performance of the observatory, telescope and primary mirror. A key task of the AMTD project (initiated in FY12) is to define telescope-level specifications traceable to science requirements and flow those specifications to the primary mirror. From a systems perspective, probably the most important question is: What is the telescope wavefront stability specification? Previously, we suggested this specification should be 10 picometers per 10 minutes; considered issues of how this specification relates to architecture, i.e. monolithic or segmented primary mirror; and asked whether it was better to have few or many segments. This paper reviews the 10 picometers per 10 minutes specification; provides analysis related to the application of this specification to segmented apertures; and suggests that a 3- or 4-ring segmented aperture is more sensitive to segment rigid-body motion than an aperture with fewer or more segments.
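Flowing such a budget to the primary mirror can be illustrated with a minimal piston-only sketch. It assumes, hypothetically, equal-area segments (so the aperture-averaged wavefront RMS is just the RMS of the per-segment pistons) and checks an unweighted RMS against the 10 pm figure; a real budget would weight each rigid-body mode by the coronagraph's sensitivity to it.

```python
import numpy as np

# Minimal piston-only budget check for a segmented primary, assuming
# equal-area segments. Real error budgets weight segment rigid-body modes
# (piston, tip, tilt, ...) by coronagraph sensitivity; this sketch only
# checks the unweighted wavefront RMS against a stability allocation.

def piston_wfe_rms(pistons_pm):
    """Wavefront RMS (pm) from per-segment piston drifts, equal-area segments."""
    p = np.asarray(pistons_pm, dtype=float)
    p = p - p.mean()   # remove global piston, which moves all segments equally
    return float(np.sqrt(np.mean(p**2)))

def within_budget(pistons_pm, budget_pm=10.0):
    """True if the piston-induced wavefront drift fits the stability allocation."""
    return piston_wfe_rms(pistons_pm) <= budget_pm
```

For example, four segments drifting by +5, -5, +5, -5 pm over an observation give a 5 pm RMS wavefront change, inside a 10 pm allocation.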
Characterizing surface roughness is important for predicting optical performance. Better measurement of surface roughness reduces polishing time, saves money and allows the science requirements to be better defined. This study characterized statistics of average surface roughness as a function of polishing time. Average surface roughness was measured at 81 locations using a Zygo® white light interferometer at regular intervals during the polishing process. Each data set was fit to a normal and Largest Extreme Value (LEV) distribution; then tested for goodness of fit. We show that the skew in the average data changes as a function of polishing time.
Characterizing surface roughness is important for predicting optical performance. Typically, this is accomplished by taking multiple statistically independent measurements and averaging. But this approach assumes that the statistical distribution of the roughness has a Gaussian (normal) probability distribution. Our analysis shows that this assumption is wrong. Real data acquired from two different sets of telescope optics indicate that the roughness of highly polished surfaces is skewed and is best described by a largest extreme value (LEV) probability distribution. Assuming a normal distribution and simply averaging overestimates the most probable surface roughness and could result in the expenditure of unnecessary polishing effort.
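The comparison described above can be sketched as follows: fit both a normal and a largest extreme value (Gumbel) distribution to a set of roughness averages and compare goodness of fit with a Kolmogorov-Smirnov test. The 81 "measurements" below are synthetic, drawn from a right-skewed Gumbel as a stand-in for real per-site roughness data.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for 81 per-site average-roughness measurements,
# drawn from a right-skewed largest-extreme-value (Gumbel) distribution.
rng = np.random.default_rng(0)
roughness = stats.gumbel_r.rvs(loc=1.0, scale=0.2, size=81, random_state=rng)

# Fit both candidate distributions to the same data.
norm_params = stats.norm.fit(roughness)
lev_params = stats.gumbel_r.fit(roughness)   # (location, scale)

# Kolmogorov-Smirnov goodness-of-fit for each fitted model; a higher
# p-value indicates better agreement between data and fitted distribution.
ks_norm = stats.kstest(roughness, "norm", args=norm_params)
ks_lev = stats.kstest(roughness, "gumbel_r", args=lev_params)
```

In practice the synthetic sample would be replaced by the measured per-site roughness values, and the fits repeated at each polishing interval to track how the skew evolves.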
Acoustic experiments demonstrate a novel approach to ranging and detection that exploits the properties of a solvable chaotic oscillator. This nonlinear oscillator comprises an ordinary differential equation and a discrete switching condition. The chaotic waveform generated by this hybrid system is used as the transmitted waveform. The oscillator admits an exact analytic solution that can be written as the linear convolution of binary symbols and a single basis function. This linear representation enables coherent reception using a simple analog matched filter, without the need for digital sampling or signal processing. An audio frequency implementation of the transmitter and receiver is described. Successful acoustic ranging measurements are presented to demonstrate the viability of the approach.
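The linear representation described above can be illustrated numerically. This sketch is not the authors' actual oscillator: a simple damped-cosine pulse stands in for the true chaotic basis function, a transmit waveform is built as the convolution of binary symbols with that pulse, and a matched filter (correlation against the basis) recovers the symbols.

```python
import numpy as np

# Illustrative stand-in for the chaotic-oscillator basis function: a
# hypothetical damped cosine spanning three symbol periods.
fs = 100                                   # samples per symbol period (assumed)
t = np.arange(3 * fs) / fs
basis = np.exp(-t) * np.cos(2 * np.pi * t)

symbols = np.array([1, -1, 1, 1, -1])      # binary information sequence

# Transmit: superpose one basis pulse per symbol, spaced one period apart,
# so the waveform is the linear convolution of the symbols with the basis.
tx = np.zeros((len(symbols) - 1) * fs + len(basis))
for k, s in enumerate(symbols):
    tx[k * fs : k * fs + len(basis)] += s * basis

# Receive: matched filter = correlation with the basis, sampled once per
# symbol period; the sign of each sample recovers the transmitted symbol.
mf = np.correlate(tx, basis, mode="full")
samples = mf[len(basis) - 1 :: fs][: len(symbols)]
decoded = np.sign(samples)
```

Even though adjacent pulses overlap, the matched-filter samples here remain sign-correct because the basis autocorrelation at one- and two-period lags is much smaller than its zero-lag energy.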
We demonstrate a new method for electronic beam steering in ultra-wide bandwidth array antennas based on synchronized chaos. Chaotic oscillators generate random-like waveforms that may be well-suited for highly unconventional ultra-wideband radar and spread-spectrum communication applications. The broadband and nonrepeating nature of chaos provides an ideal combination of high range resolution with no range ambiguity. Unlike true random sources, coupled chaotic oscillators can synchronize for coherent power combining. To steer the array, a small detuning is applied to each oscillator to slightly shift its natural frequency. Oscillators that are tuned to run faster will lead those tuned slower, providing a small time shift between the waveforms produced by each oscillator. The approach avoids the need for costly phase shifters or tunable true time delay elements. Our demonstration system consists of a linear array of four directionally coupled radio frequency chaotic oscillators, each of which produces a broadband waveform centered at 137 MHz. Each individual oscillator feeds one of four discone-type antennas spaced a third of a wavelength apart. We present far-field power level measurements characterizing beam formation and steering recorded on an outdoor test range. Our results suggest chaotic arrays could enable a new generation of low-cost, high-performance, ultra-wide bandwidth applications.
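The steering mechanism described above, where a uniform time shift tau between adjacent elements points the beam to the angle satisfying d sin(theta) = c tau, can be sketched with a narrowband array-factor calculation at the 137 MHz center frequency. The four-element, lambda/3 geometry matches the abstract; everything else (narrowband approximation, ideal isotropic elements) is a simplifying assumption.

```python
import numpy as np

c = 3e8                      # speed of light (m/s)
f0 = 137e6                   # array center frequency (Hz)
lam = c / f0
d = lam / 3                  # element spacing: a third of a wavelength
n = np.arange(4)             # four elements in a linear array

def array_factor_db(theta_deg, tau):
    """Normalized array factor (dB) at look angle theta for a uniform
    inter-element time delay tau, narrowband approximation at f0."""
    theta = np.radians(theta_deg)
    # Residual per-element phase after applying the steering delay tau
    phase = 2 * np.pi * f0 * (d * np.sin(theta) / c - tau) * n
    af = np.abs(np.sum(np.exp(1j * phase), axis=-1)) / len(n)
    return 20 * np.log10(af)

# Steering the peak to 20 degrees requires tau = d*sin(20 deg)/c
tau = d * np.sin(np.radians(20)) / c
```

For the truly broadband chaotic waveforms in the abstract, the same per-element time shift applies at all frequencies simultaneously, which is why detuning-induced delays avoid the beam squint of narrowband phase shifters.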