In 1972, at the first SPIE seminar covering the application of optical instrumentation in medicine, Balter and Stanton presented a paper forecasting the status of x-ray image intensifiers in the year 1980. Now, eight years later, it is 1980, and it seems worthwhile to evaluate these forecasts in light of what has actually happened.

The x-ray-sensitive image intensifier tube (with cesium iodide as the input phosphor) is used nearly universally. Input screen sizes range from 15 cm to 36 cm in diameter. Real-time monitoring of both fluoroscopic and fluorographic examinations is generally performed via closed-circuit television. Archival recording of images is carried out using cameras with film formats of approximately 100 mm for single-exposure or serial fluorography and 35 mm for cine fluorography. With the detective quantum efficiency of image intensifier tubes remaining near 50% throughout the decade, the noise content of most fluorographic and fluoroscopic images is still determined by the input exposure. Consequently, patient doses today, in 1980, have not changed substantially in the last ten years. There is, however, interest in uncoupling the x-ray dose from the image brightness by providing a variable optical diaphragm between the output of the image intensifier tube and the recording devices.

During the past eight years, there has been a major philosophical change in the approach to imaging systems. It is now realized that medical image quality depends much more on the reduction of large-area contrast losses than on the limiting resolution of the imaging system. It has also become clear that much diagnostic information is carried by spatial frequencies in the neighborhood of one line pair per millimeter (referred to the patient). The design of modern image intensifiers has therefore been directed toward improving large-area contrast by minimizing x-ray and optical scatter in both the image intensifier tube and its associated components.
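The link asserted above between detective quantum efficiency and exposure-limited noise can be made explicit with the standard definition of DQE and Poisson counting statistics; the symbols below (N for input quanta per resolution element) are introduced here for illustration and do not appear in the original text:

```latex
% Standard definition of detective quantum efficiency:
%   DQE = SNR_out^2 / SNR_in^2
% For N x-ray quanta per resolution element at the input,
% Poisson statistics give SNR_in = sqrt(N), hence
\[
  \mathrm{SNR}_{\mathrm{out}} = \sqrt{\mathrm{DQE} \cdot N}.
\]
% With DQE held near 0.5, the only way to raise SNR_out is to
% raise N, i.e. the input exposure -- which is why image noise,
% and therefore patient dose, remained tied to exposure levels.
```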