In medical x-ray imaging, computed tomography is widely considered the most important innovation since the introduction of the image intensifier. For evidence of the potential importance of this approach to x-ray diagnosis, one need look no further than the half-million-dollar capital investments already committed by hundreds of medical institutions in the United States for the purchase of computed tomographic scanners. Realization of the full potential of computed tomography depends not only upon achieving the full capability of computed tomographic units once they are installed in clinical settings, but also upon ensuring that the units operate at that capability day after day, in an environment in which patient care takes precedence over equipment care. In this environment, it is essential that a physicist or engineer conduct a continuing quality assurance monitoring program to ensure acceptable and reproducible performance of the computed tomographic unit. In the following special series of articles on computed tomography, considerable advice is offered on the implementation and operation of such a quality assurance program.
While the reconstruction algorithm utilized in computerized tomography (CT) is important, the overall performance of the system is limited by the quality of the measured transmission data on which the reconstruction process is based. If the projection values derived from the measured data do not adequately represent the line integrals of the linear attenuation coefficients within the slice being scanned, even a perfect reconstruction algorithm will give rise to a distorted image. Phenomena that tend to degrade the quality of the measured data, and hence the final image, include the effective finite dimensions of the scanning aperture, distortions introduced by the detector system such as afterglow, and nonlinearities related to the spectral distribution of the x-ray photons used in scanning. Computer methods of preprocessing the x-ray transmission data to minimize these distortions are discussed and illustrated.
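The relationship between the measured transmission data and the projection values mentioned above can be sketched with the Beer-Lambert law, under which the logarithm of the transmission ratio recovers the line integral of the linear attenuation coefficient. The following is an illustrative calculation, not code from the paper; all names and values are assumed.

```python
import numpy as np

def projection_values(measured, incident):
    """Convert detector readings to projection values p = ln(I0 / I),
    which approximate line integrals of the linear attenuation coefficient."""
    measured = np.asarray(measured, dtype=float)
    return np.log(incident / measured)

# Example: a ray crossing 2 cm of material with mu = 0.2 per cm attenuates
# the incident beam by exp(-0.4), so the projection value should be 0.4.
I0 = 1000.0
I = I0 * np.exp(-0.4)
p = projection_values([I], I0)[0]
```

In practice the polychromatic x-ray spectrum makes this logarithmic relation only approximate (beam hardening), which is one of the nonlinearities the preprocessing methods above address.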
Transverse axial positron annihilation coincidence detection tomography with a circular ring transaxial positron camera (CRTAPC) will be described. Some of the merits and characteristics of the circular ring geometry in comparison with other geometries will be discussed, and the performance of the CRTAPC will be presented.
Two in vivo methods for the determination of bone mineral content are compared. It is concluded that computed tomography represents the only accurate in vivo method capable of determining bone mineral content per volume of bone. In addition, it has the distinct advantage of being able to determine the bone mineral content of virtually any bone of the skeleton.
A series of qualitative and quantitative investigations of the capability of fluoroscopic systems to produce images of sufficiently high quality to serve as the input for computerized transaxial tomograms is reviewed. Examples of tomographic sections obtained from fluoroscopic image inputs are presented. In addition, a quantitative comparison was made between computerized tomograms of a specially constructed phantom reconstructed with the EMI head scanner and those reconstructed from images provided by a large-screen, low-light-level TV camera fluoroscopic system. The phantom, made from Lucite, contained rods of various materials and sizes. The computer printout of each reconstruction was analyzed, and a high degree of correlation (r = 0.98) was noted between the results of the two systems. The differential attenuation detectability of the fluoroscopic system was found to be comparable to or better than that of the EMI unit. As expected from a consideration of the quantum statistics of each system, the noise in the reconstructions was also comparable. It is concluded that such a fluoroscopic system compares favorably with the presently available commercial systems.
Computerized tomographic scanners have gained quick and widespread acceptance in diagnostic radiological practice. The cost of such units is currently about a half million dollars. Technologically, they are one of the most complicated pieces of equipment to be found in a radiology department. Because of the cost and complexity, it seems logical to set up performance specifications, acceptance tests, and a quality assurance program for a CT scanner. Pertinent performance specifications are herein described and discussed. In order to assure that the CT unit does meet specifications, appropriate acceptance tests are likewise discussed. Finally, a basic quality assurance program is outlined with an indication of the tests to be performed and their time frequency.
The computed-tomography scanner is a new tool for the medical profession in which a narrow, moving x-ray beam, controlled and measured by a computer, is used to image transverse planes in patients. Because scanners have several unique features, commissioning tests performed during installation can be only partially accomplished by using equipment and procedures designed for conventional x-ray devices. Additional equipment and procedures are required to commission scanners. This paper identifies 21 parameters that should be considered for testing in a scanner commissioning program. Generalized descriptions as to how such tests may be accomplished are outlined and difficulties in testing are cited.
With the advances in computer technology during the 1960s several authors, including Cormack, Kuhl, and Edwards, began to discuss the possibility of improving the quality of ordinary x-ray tomography using exact mathematical reconstruction techniques. The basic method is to measure x-ray attenuation along sets of parallel lines oriented at multiple angles, all contained within a single transverse cross-sectional slice of the body. These measurements can be represented mathematically as linear combinations of the unknown mass attenuation coefficients in the slice. Standard mathematical techniques developed for the solution of simultaneous linear equations can be adapted to "reconstruct" the unknown cross-sectional distribution from such data. The result can be displayed as a two-dimensional array using computer-generated display techniques, and closely resembles the image that could have been obtained if the cross-sectional slice were removed from the body and imaged by conventional projection radiography.
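The reconstruction-by-linear-equations idea described above can be illustrated on a toy problem: a 2×2 "slice" probed by two row sums, two column sums, and one diagonal sum, solved by least squares. This is a minimal sketch with illustrative names and values, not a method from the papers cited.

```python
import numpy as np

# Unknown attenuation values x1..x4 of a 2x2 slice, row-major order.
true_slice = np.array([1.0, 2.0, 3.0, 4.0])

# Each row of A encodes which pixels a given ray passes through,
# so A @ x gives the ray sums (the measured projections).
A = np.array([
    [1, 1, 0, 0],   # top row sum
    [0, 0, 1, 1],   # bottom row sum
    [1, 0, 1, 0],   # left column sum
    [0, 1, 0, 1],   # right column sum
    [1, 0, 0, 1],   # main diagonal sum (breaks the row/column degeneracy)
], dtype=float)

projections = A @ true_slice

# "Reconstruct" the slice by solving the overdetermined linear system.
recon, *_ = np.linalg.lstsq(A, projections, rcond=None)
```

The diagonal ray is needed because the four row and column sums alone are linearly dependent (both sets share the same total), leaving the system underdetermined.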
Optical and digital processing of tomographic data is presented and discussed in the context of a Fourier reconstruction algorithm applied to phantom radionuclide sources of nuclear medicine. The point spread function of the reconstruction process is discussed in relation to the angular sampling of frequency space and investigated using coherent optical techniques. Approximations due to gamma-ray attenuation in the source and the non-stationarity of the point spread function of the detector system are also presented.
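The basis of Fourier reconstruction is the projection-slice theorem: the 1-D Fourier transform of a projection equals the central slice of the object's 2-D Fourier transform at the same angle. A minimal numerical check of this relation (for the 0-degree projection of an assumed toy phantom; not code from the paper) looks like:

```python
import numpy as np

# Toy "phantom": a small rectangle in an 8x8 field.
img = np.zeros((8, 8))
img[3:5, 2:6] = 1.0

# 0-degree projection: sum along the y axis, leaving a function of x.
proj = img.sum(axis=0)

# Projection-slice theorem: the 1-D FT of the projection should equal
# the ky = 0 row of the 2-D FT of the image.
slice_1d = np.fft.fft(proj)
central = np.fft.fft2(img)[0, :]
```

A Fourier reconstruction algorithm runs this relation in reverse: the transformed projections at many angles fill frequency space slice by slice, and an inverse 2-D transform recovers the image. The angular sampling of frequency space mentioned above determines how densely those slices cover the plane, and hence the point spread function of the reconstruction.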
The development and use of a phantom for evaluation, comparison, and quality assurance of CT scanners will be discussed. Examples of measurements on seven CT scanners using early prototypes of the phantom will be presented, along with measurements on several scanners using the final phantom configuration. The phantom contains four modular sections, which are removable to allow for future fabrication and replacement of individual sections for specialized applications. Section I is used to measure contrast sensitivity and scan slice geometry of the system. Section II is used to measure the sensitometric response of the system. Section III is used to determine the spatial resolution of the system at various contrast levels. Section IV is used to determine the noise, spatial uniformity, alignment, and MTF of the system. In addition, it contains a part with fittings in which items such as in vitro samples, dosimeters, or a motion phantom may be placed.
The necessary information to make a model of the night sky background for systems calculations in faint satellite detection analysis is summarized and presented in easily usable form. The intensities, wavelength dependences, and angular variations of airglow, zodiacal light, moonlight, and diffuse galactic light are tabulated. The number density of faint stars as a function of position and magnitude is summarized. Data on nebulae, clusters, and bright galaxies are referenced. A simple model of the galactic number density and angular size as a function of visible magnitude is presented. Atmospheric transmission corrections are referenced for general observers and given for a specific representative situation.
This paper briefly reviews the current status of speckle interferometry including the recent extension proposed by Knox and Thompson and the limitations imposed by non-isoplanicity. The speckle interferogram is characterized in terms of its scale lengths and photon statistics. The various subsystems of the instrument are reviewed in detail in terms of their required performance. The overall S/N is defined and discussed in terms of both unresolved and extended objects.
The contribution of particulate pollution to visibility reduction is demonstrated by use of a pulsed GaAs laser. The density of the "cloud" and the atmospheric attenuation coefficient were also determined. It was found that multiple scattering governs, in this case, the interaction between the laser light and the particles in the cloud.
The use of a Fabry-Perot interferometer for analyzing periodic spectra is described, with special emphasis on the detection of rotational Raman spectra of gases. The interferograms of rotational Raman spectra have been generated experimentally by use of a specially designed air-bearing Fabry-Perot interferometer. Linear, symmetric-top, and asymmetric-top molecular classes have been studied by this interferometric technique. The overall efficiency of this technique is compared quantitatively with that of a grating spectrometer. Rejection schemes, using auxiliary etalons, for reducing the effects of interfering spectra are analyzed. A computer program for simulating the Fabry-Perot interferograms has been used to study the variation of the interferogram with such parameters as gas temperature and centrifugal distortion constant. This interferometric technique has been used to determine the vibrational temperature, rotational temperature, and the rotation-vibration interaction constant of nitrogen gas which was heated by an electrical discharge. The possible application of this technique to the problem of remote detection of atmospheric pollutants is discussed.
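The core of any such Fabry-Perot simulation is the Airy transmission function of an ideal lossless etalon, T = 1 / (1 + F sin²(δ/2)), with round-trip phase δ = 4πd/λ at normal incidence and coefficient of finesse F set by the mirror reflectance. The sketch below illustrates that relation only; it is not the program described in the abstract, and all parameter values are assumed.

```python
import numpy as np

def airy_transmission(wavelength, spacing, reflectance):
    """Transmitted fraction of a lossless Fabry-Perot etalon at
    normal incidence, via the Airy function."""
    F = 4.0 * reflectance / (1.0 - reflectance) ** 2   # coefficient of finesse
    delta = 4.0 * np.pi * spacing / wavelength          # round-trip phase
    return 1.0 / (1.0 + F * np.sin(delta / 2.0) ** 2)

# Transmission peaks (T = 1) occur when 2*d is an integer number of
# wavelengths.  Here the spacing is tuned to interference order m = 1000.
wl = 500e-9
d = 1000 * wl / 2.0
peak = airy_transmission(wl, d, reflectance=0.9)
```

Scanning the spacing (or plate angle) sweeps δ and traces out the interferogram; a periodic input spectrum such as a rotational Raman comb then produces the characteristic coincidence peaks the technique exploits.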
The 1369th Photographic Squadron (MAC) at Vandenberg Air Force Base in California has performed a series of cold weather tests on cine cameras and a tracking mount. The tests were conducted to determine the capability of this equipment to operate successfully at the extreme cold temperatures to be encountered in support of the Operational Base Launch (O.B.L.) program. The testing was accomplished at the Pacific Missile Test Center (formerly Naval Missile Center) Environmental Laboratory, Pt. Mugu, California.