From EMI to AI: a brief history of commercial CT reconstruction algorithms
Abstract

Computed tomography was one of the first imaging modalities to require a computerized solution of an inverse problem to produce a useful image from the data acquired by the sensor hardware. The computerized solutions, which are known as image reconstruction algorithms, have thus been a critical component of every CT scanner ever sold. We review the history of commercially deployed CT reconstruction algorithms and consider the forces that led, at various points, both to innovation and to convergence around certain broadly useful algorithms. The forces include the emergence of new hardware capabilities, competitive pressures, the availability of computational power, and regulatory considerations. We consider four major historical periods and turning points. The original EMI scanner was developed with an iterative reconstruction algorithm, but an explosion of innovation coupled with rediscovery of an older literature led to the development of alternative algorithms throughout the early 1970s. Most CT vendors quickly converged on the use of the filtered back-projection (FBP) algorithm, albeit layered with a variety of proprietary corrections in both projection data and image domains to improve image quality. Innovations such as helical scanning and multi-row detectors were both enabled by and drove the development of additional applications of FBP in the 1990s and 2000s. Finally, the last two decades have seen a return of iterative reconstruction and the introduction of artificial intelligence approaches that benefit from increased computational power to reduce radiation dose and improve image quality.

1. Introduction

The algorithms deployed on commercial CT scanners have evolved significantly over the last fifty years, with periods of relative algorithmic stability punctuated by occasional rapid transitions in response to changes in scanner hardware, computational hardware, and clinical needs. In this historical review, we trace the evolution of commercially deployed CT reconstruction algorithms over the last fifty years and connect these advancements to the forces that continue to drive innovation.

The forces that led to the evolution of CT scanners and their embedded reconstruction algorithms were numerous and interrelated. Put simply, there were pressures to scan patients better, faster, and cheaper, all while controlling or reducing radiation dose. The source of this pressure was multifold: clinicians seeking to increase the clinical applications of CT; hospitals and vendors wanting to increase the revenue generated by CT; vendors competing for market share and seeking to increase the market size; regulatory agencies encouraging reduction of dose from ionizing radiation due to increasing clinical usage of CT; and competitors and public and private health insurers seeking reductions in the cost of the equipment and its operation.

These forces led to scanners that could collect sufficient data to satisfy the mathematics of the reconstruction algorithms using a single x-ray source and an array of detectors configured as a fan-beam, as well as higher-power x-ray sources, slip-ring gantries, faster and higher-resolution detectors, multi-row detectors, and more powerful reconstruction computers. This evolution of CT scanner hardware and the resulting new clinical applications necessitated parallel improvements of image reconstruction algorithms, with innovation in image reconstruction sometimes leading the improvements in image quality, as in the development of artifact-reduction algorithms, and sometimes following the lead of the hardware, as when multi-row detectors were developed.

The paper is organized as follows. Section 2 describes the beginning of medical x-ray CT by reviewing the work done at EMI by Hounsfield and also examines the precursors to his work. Section 3 describes how the clinical usefulness of CT was increased through the 1970s and 80s using algorithmic improvements to image quality; in that era, the algorithms were based on the assumption that the patient was static during data acquisition. Section 4 deals with relaxing the assumption of a static patient; this work led to helical scanning and improved cardiac scanning through motion compensation in the 1980s and 1990s. Finally, Sec. 5 deals with image quality improvement and dose reduction using iterative algorithms and emerging approaches in artificial intelligence and machine learning, developments that have taken place in the last twenty years.

The details behind our high-level overview can be found in several seminal and excellent publications by Newton and Potts,1 Kak and Slaney,2 Webb,3 Kalender,4 and Hsieh.5 We provide citations for key advancements in the field. However, we note that there may be controversies, which we will not address, on who deserves credit for initiating these advancements due in part to proprietary vendor information and the vagaries of the patent process. We also note that we may not know exactly how the vendors reconstructed their images at various points; instead, we assume that publications and patents written by their employees and their external collaborators reflect their reconstruction algorithms.

2. Beginning: Hounsfield and Precursors

In the late 1960s, as Hounsfield was developing the EMI scanner, there was an outburst of research activity in tomographic image reconstruction, in fields ranging from radio astronomy to electron microscopy, but there is no evidence that Hounsfield was aware of any of these developments or, indeed, that most of these researchers in disparate disciplines were aware of each other. In radio astronomy, Bracewell derived the central slice theorem using Fourier-space arguments to show that two-dimensional (2D) functions can be reconstructed from a set of one-dimensional line integrals and proposed a direct-Fourier algorithm for reconstruction.6 This was applied to reconstructing images from radio telescope data that could sample astronomical objects like the sun with long, narrow sensing functions. Bracewell was a leading expert on Fourier transforms, and published a classic text on them in 1965,7 but he also understood that they were computationally expensive at the time, as the fast Fourier transform algorithm of Cooley and Tukey was not published until 1965.8 As an alternative, Bracewell and Riddle9 developed a convolution-back-projection style image reconstruction algorithm that avoided the use of Fourier transforms. Electron microscopists also traversed much of this same intellectual ground, with De Rosier and Klug using Fourier-based methods to reconstruct 3D images of viruses while Ramachandran and Lakshminarayanan derived the same convolution-back-projection algorithm as Bracewell and Riddle.10,11 As noted by Webb in his seminal history of tomography,3 only Gilbert,12 who also derived convolution-back-projection in the electron microscopy context, seemed to be aware of its independent discovery in radio astronomy and by Ramachandran and Lakshminarayanan.
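
In modern notation (ours, not the original authors'), the mathematics at issue can be stated compactly. If p(θ, s) denotes the line integrals of a 2D object f(x, y) at view angle θ and radial coordinate s, and P(θ, ω) is the 1D Fourier transform of p with respect to s, the central slice theorem states that

  P(\theta, \omega) = F(\omega \cos\theta,\; \omega \sin\theta),

where F is the 2D Fourier transform of f. Direct-Fourier reconstruction interpolates these radial samples onto a Cartesian grid and inverts, whereas convolution-back-projection rewrites the inversion as

  f(x, y) = \int_0^{\pi} \left[ \int_{-\infty}^{\infty} P(\theta, \omega)\, |\omega|\, e^{\,i 2\pi \omega s}\, d\omega \right]_{s = x\cos\theta + y\sin\theta} d\theta,

in which the inner ramp-filtering step can be carried out as a convolution in s, avoiding Fourier transforms altogether.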

Hounsfield’s seminal patent, however, cites only two predecessors: William Oldendorf and Alan Cormack.13 Oldendorf was a UCLA neurologist who took pioneering steps toward transaxial tomography using combinations of linear and rotational motion to isolate focal changes in attenuation in the interior of an object, but the method does not involve computational reconstruction.14 In the EMI patent, Hounsfield remarks that Oldendorf’s scanning scheme is slow and dose inefficient relative to the one he is proposing. Cormack was a physicist at the University of Cape Town who became interested in radiation therapy and the need for improved multidimensional maps of tissue attenuation properties. In the mid-1950s, he conceived an approach to tomography similar to Hounsfield’s but developed a somewhat cumbersome approach to reconstruction involving expansions of functions in Chebyshev polynomials.15 The EMI patent suggests that Cormack’s reconstruction approach is computationally inefficient and subject to noise amplification. It is perhaps worth noting, however, that in the early 1970s, Cormack became aware of some of the related work in image reconstruction in radio astronomy, electron microscopy, and even the older mathematical work of Radon and deployed some of it in the context of proton tomography.16

For his part, in developing the EMI scanner in the late 1960s, Hounsfield pursued a very different approach to reconstruction based on framing the problem discretely as one of solving a large system of linear equations. On an EMI prototype, Hounsfield implemented a relaxed iterative reconstruction algorithm that is outlined in some detail in the seminal EMI patent. He describes reconstructing images with 100×100 pixels from 400 views, each involving 100 line integral samples, and thus needing to solve a system of 40,000 equations with 10,000 unknowns. Hounsfield employed an iterative update scheme to solve the equations, along with a number of clever implementation details, such as cycling through the projections not sequentially but with large 40-deg steps, a choice that improves convergence speed and may be seen as anticipating the ordered subsets approach of Hudson and Larkin.17
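
The patent does not give code, but the flavor of such a relaxed, Kaczmarz-style update can be conveyed by a minimal sketch in Python/NumPy. The array shapes, relaxation factor, and interleaved view ordering below are our illustrative choices, not Hounsfield's actual implementation:

```python
import numpy as np

def art_reconstruct(A, b, n_sweeps=5, relaxation=0.5, view_step=40):
    """Relaxed ART/Kaczmarz sketch.

    A : array (n_views, rays_per_view, n_pixels) of ray-pixel intersection lengths
    b : array (n_views, rays_per_view) of measured line integrals
    """
    n_views, rays_per_view, n_pixels = A.shape
    x = np.zeros(n_pixels)
    # Visit views in an interleaved order that hops `view_step` views at a
    # time (a stand-in for the patent's large 40-deg steps, and a close
    # cousin of ordered subsets).
    order = [v for start in range(view_step)
             for v in range(start, n_views, view_step)]
    for _ in range(n_sweeps):
        for v in order:
            for a_ray, meas in zip(A[v], b[v]):
                norm = a_ray @ a_ray
                if norm > 0:
                    # Relaxed update pushing x toward consistency with this ray
                    x += relaxation * (meas - a_ray @ x) / norm * a_ray
    return x
```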

While the iterative algorithm implemented for the EMI prototype scanner at the Atkinson Morley Hospital worked well and produced the world-changing image (Fig. 1) of a brain tumor acquired on October 1, 1971, computational considerations prevented its deployment with the commercial scanner. The computational expense of the iterative algorithm required that scan data be transported from the hospital back to EMI on tape so that it could be processed overnight on a mainframe computer, the ICL 1905. The reconstructed images were then returned to the hospital on tapes to drive a cathode-ray tube (CRT) display, or as polaroid pictures taken of a CRT display at EMI.18 This kind of centralized image reconstruction was obviously not practical for the commercial units EMI hoped to sell, but the advent of the minicomputer provided hope of scanner-side reconstruction. However, the minicomputers of the day were not able to run the iterative algorithm in clinically useful time frames, so reconstruction shifted to a filtered back-projection (FBP) style method, developed and patented by EMI’s Chris Lemay.19 This allowed for reconstruction of a 160 × 160 image in 30 s on the minicomputer.18

Fig. 1. First EMI image of first patient scan demonstrating a cystic astrocytoma.

In the development and evolution of reconstruction algorithms for the first commercial CT scanner, we see at play several of the forces described in the Introduction. The advent of novel computational hardware, in the form of the minicomputer, made scanner-side image reconstruction possible and thus the first scanner a viable commercial product, while the computational limits of that hardware quickly led to a change in the algorithm deployed. All of this was driven by the anticipated clinical demand for the first cross-sectional images of human neuroanatomy.

3. Clinical Applications and the Market Conquers Reconstruction

The algorithms in the early CT literature mostly assumed that perfect line integrals of the x-ray linear attenuation coefficient could be obtained from a cross-section of a patient. Furthermore, these line integrals were assumed to be obtained from a perfectly static patient. To increase the clinical efficacy of CT, the hardware was improved to scan more rapidly and the reconstruction algorithms were improved because perfect line integrals could not be measured.

The hardware changes included the development of rotate-rotate scanners (also known as third-generation scanners) with x-rays delivered by higher-power x-ray sources and detected by a single row of detectors configured as a fan-beam. To combat motion, manufacturers added breathing lights, which instructed patients when to hold their breath, as well as cardiac monitoring sensors. These hardware changes enabled a host of transformative clinical applications. Rather than acquiring a series of slices suffering from motion and misregistration artifacts due to patient breathing, it was now possible to acquire a number of consistently registered slices. Cardiac imaging was enabled by cardiac gating that could be synchronized with acquisition. Finally, exogenous contrast agents could now be used in more sophisticated ways due to increased speed and timing precision. The forces that drove these changes included increased clinical demand, especially for body scanning, and competition among the vendors. Most of this work took place between the introduction of the EMI scanner in the early 1970s and the middle 1980s.

To improve image quality, the reconstruction algorithms had to address the reality that real measurements were not perfect line integrals. The algorithms therefore had to correct the imperfections in the line integrals, and these corrections often required calibration measurements. A block diagram of the resulting reconstruction algorithm is shown in Fig. 2. The corrections addressed imperfections in physics, instrumentation, the patient, and mathematics. The corrections for physics addressed x-ray scatter and beam hardening, which is caused by the interaction of polychromatic x-ray beams with materials having different energy dependence. The corrections for instrumentation addressed non-idealities in the source, such as power fluctuations and off-focal radiation; non-idealities in the detector, such as electronic noise, crosstalk, nonlinear response, radiation damage, afterglow, and temperature drift; and non-idealities in the gantry, which can suffer imperfections in its mechanical motion. The corrections for the patient addressed respiration and cardiac motion as well as the presence of metal due to prosthetics and dental fillings. Finally, the corrections for mathematics addressed aliasing and the effects of a finite number of projections and detectors. The details of these corrections are presented by Hsieh.5

Fig. 2. Block diagram of a reconstruction algorithm before the advent of helical and multi-slice CT. The top row shows the key elements of the scanner and the patient that need to be addressed in the reconstruction algorithm. The middle row shows the reconstruction algorithm. The step labeled inverse Radon transform is the mathematical step of reconstructing an image from its projections. The additional steps in the middle row are used to correct for imperfections in the x-ray measurements. The bottom row indicates that the scanner has to be calibrated to correct the x-ray measurements as shown in the middle row.
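
In code terms, the middle row of Fig. 2 amounts to a chain of data-domain corrections applied before the inverse Radon transform, followed by image-domain clean-up. The following Python sketch is purely illustrative: each correction is reduced to a trivial placeholder, whereas the real corrections were (and remain) proprietary, calibration-driven, and vendor-specific:

```python
import numpy as np

def reconstruct_slice(raw_counts, air_counts, inverse_radon):
    """Hypothetical correction chain mirroring the block diagram of Fig. 2."""
    # Placeholder scatter correction: subtract a scaled, smoothed copy of
    # each raw projection (scatter is additive in the measured intensities).
    smooth = np.apply_along_axis(
        lambda row: np.convolve(row, np.ones(31) / 31, mode="same"),
        -1, raw_counts.astype(float))
    counts = np.clip(raw_counts - 0.02 * smooth, 1, None)
    # Convert detector counts to line integrals via the Beer-Lambert law,
    # using an air (no-patient) calibration scan.
    p = -np.log(counts / air_counts)
    # Placeholder water beam-hardening correction: low-order polynomial
    # linearization of the measured line integrals.
    p = p + 0.05 * p ** 2
    # Inverse Radon transform (e.g., FBP); image-domain corrections such as
    # ring-artifact suppression would follow.
    return inverse_radon(p)
```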

These algorithmic advancements were described in a series of papers and patents emerging from major vendors, academic medical centers, and U.S. federal organizations like the National Institutes of Health. An effort to correct the blooming artifacts at the bone-brain interface was made using a second-pass reconstruction algorithm to correct for the beam hardening caused by bone, which differs from the beam hardening caused by soft tissue.20 Aliasing artifacts caused by under-sampling the fan-beam detector were mitigated by interlacing the fan-beam projections by displacing the detector array by a quarter of the detector spacing,21 although in practice the transition to smaller detector cells and the introduction of flying focal spots have come to play an equally important role.22 Scatter artifacts were reduced by subtracting a scaled and low-pass filtered version of projections from the original projections.23 Ring artifacts were suppressed in projection space or image space,24,25 as seen in Fig. 3. Metal artifacts were reduced by estimating the correct values of the samples of projections that passed through the metal.26 Patient motion during body scanning was reduced by smoothing the discontinuity caused by motion between the first and last projections in a scan.27 Initial attempts at cardiac scanning acquired data over multiple cardiac cycles and then sorted the projections according to cardiac phase.28 Cardiac scanning was also enabled by using less than a full rotation of the scanner, a method denoted half-scan.29

Fig. 3. An example of early progress in ring artifact suppression. The display window is −120 to 149 HU and the arrow points to the center of rotation.24

4. Overcoming the Assumption of a Static Patient

Through the middle 1980s, scanners were improved to decrease scan time to reduce patient-induced motion artifacts. However, this required patients to hold their breath multiple times, e.g., during a chest scan, to scan a complete organ or section of the body. The multiple breath holds led to registration errors and limited the axial extent that could be scanned and the ability to display artifact-free sagittal and coronal slices. Scanners on the market at this time used cables to power the x-ray source and to transfer data; the cables limited the gantry to alternating rotations, and the x-ray source had to be turned off each time the gantry reversed direction.

It was obvious at that time that more scans per unit time could be obtained if the cables were replaced with a slip ring to supply power and transfer data so that the gantry could be rotated continuously during data acquisition. The continuous rotation combined with translating the patient on the patient table continuously during data acquisition would allow more efficient use of time but violated the assumption of a static patient. This type of scanning became known as helical (or spiral) scanning.30 The reconstruction algorithms were based on the observation that projections measured in successive gantry rotations bracket each reconstruction plane and can be interpolated to synthesize a consistent planar data set, as shown by Crawford and King31 and by Kalender et al.32 The interpolation algorithms reduced most of the artifacts caused by translating the patient but led to a degraded slice-sensitivity profile (SSP). Because whole organs could now be scanned without registration errors, the slightly degraded image quality was tolerated in exchange for greatly increased clinical utility. These initial helical scanners used a single row of detectors.
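
Schematically, in our notation, for a reconstruction plane at table position z_R, each view angle θ of the synthetic planar data set is obtained by linearly interpolating between the two measured projections at that angle that bracket the plane:

  p_{\mathrm{synth}}(\theta, s) = (1 - w)\, p(\theta, s; z_1) + w\, p(\theta, s; z_2), \qquad w = \frac{z_R - z_1}{z_2 - z_1},

where z_1 ≤ z_R ≤ z_2 are the table positions at which view angle θ was measured in successive gantry rotations (the 360-deg scheme); the 180-deg variants interpolate instead between a measured ray and its complementary (opposing) ray, shortening the interpolation interval and improving the SSP.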

The development of helical scanning happened in parallel at most of the major CT vendors and eventually led to all scanners having helical capability. It seems to be an illustration of the adage that necessity is the mother of invention, as all vendors felt the clinical and financial pressure to increase scanning speed, and converged on the same technical solution in similar timeframes. For example, GE and Siemens presented their initial work on helical scanning at the same RSNA meeting in 1990. At GE, Crawford and King built on other work dealing with helical scanning by Toshiba and at the University of Illinois.33,34 The use of slip rings for CT scanning was demonstrated by the Mayo Clinic in their multi-source cardiac scanner, which was known as the Dynamic Spatial Reconstructor (DSR), by Varian, and by Artronix in their hybrid fourth-generation body scanner, and was discussed in an earlier patent.35–37 More background on simultaneous invention as it relates to helical scanning is given by Schwartz.38

After the advent of helical CT with a single row of detectors, it was again evident that more slices per unit time could be collected by replacing the single detector row in the fan-beam with multiple rows of detectors. This configuration, known as multi-slice helical CT, was inspired by EMI's use of multiple detectors to scan multiple slices of the skull at the same time. The initial multi-slice scanners used a small number of rows (e.g., four), and the divergence of the beam in the axial direction, also known as cone-beam divergence, was ignored.39

The interpolation methods used for single-slice CT were extended to multiple slices and the number of slices doubled annually over this era, from 4 to 256. The reconstruction algorithms corrected the cone-beam divergence by back-projecting along the paths over which the rays were collected. For the 16-slice scanners, approximate methods based on rebinning and reconstructing tilted slices were used.40–42 Among other benefits, these single-slice rebinning approaches allowed continued use of the dedicated 2D back-projection hardware that had been developed and optimized for single-slice CT scanners.

As the number of slices reached 64, these approximations began to fail and new approaches were needed. From 2001 to 2008, there were numerous breakthroughs in analytic CT helical cone-beam reconstruction, kicked off by Katsevich’s43,44 announcement of an exact solution to the helical cone-beam problem at the 2001 International Meeting on Fully Three-Dimensional Image Reconstruction in Radiology and Nuclear Medicine. The original formulation of the Katsevich algorithm placed somewhat rigid requirements on the scan geometry and choice of helical pitch and did not fully use all of the collected data. Many of these issues were addressed in a series of other advances pioneered by the groups of Pan at the University of Chicago,45 Noo at the University of Utah,46 and others.47,48 However, it seems that these exact approaches did not find their way directly into commercial use for a variety of reasons, including implementation complexity and noise uniformity issues.49,50 In practice, most commercial systems used approximate methods based on extending the Feldkamp–Davis–Kress51 reconstruction to helical cone-beam scanning trajectories, an approach initially formulated by Wang et al.52,53 All of these algorithms shared some similar features, including filtering in the detector plane along lines that are tilted with respect to the canonical coordinates. At root, the mathematical exactness of analytical algorithms held only in the case of continuous, consistent, noiseless data. When faced with sampled, inconsistent, noisy data, the analytical algorithms had to compete with approximate algorithms that proved to be robust in the face of these non-idealities. Figure 4 shows the clinical benefit of the move to increasing numbers of detector rows: CT became a truly volumetric modality, able to be reformatted into sagittal and coronal cross-sections with minimal motion artifacts.
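
For reference, the FDK-style methods take, in schematic form (our notation, for a flat virtual detector containing the rotation axis), a weighted filtered back-projection:

  f(x, y, z) \approx \frac{1}{2} \int_0^{2\pi} \frac{R^2}{U(x, y, \beta)^2}\; \tilde{g}\big(\beta, u^*(x, y, \beta), v^*(x, y, z, \beta)\big)\, d\beta,

where R is the source-to-isocenter distance, \tilde{g} is the measured cone-beam projection after cosine weighting and row-by-row ramp filtering, (u^*, v^*) is the point at which the ray from the source through the voxel intersects the detector, and U is the distance from the source to the voxel measured along the central-ray direction. The helical extensions of Wang et al.52,53 and their commercial descendants apply essentially the same weighted back-projection along the helical source path, with the filtering directions tilted to better match the helix.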

Fig. 4. The clinical benefit of increasing the speed of CT scanners through increasing the number of rows. With this change, CT made the transition to a truly volumetric modality, able to render the body with isotropic resolution. (a) The 16-slice coronal image shows motion inconsistencies in the heart. These are greatly reduced in (b) the coronal image acquired with a 64-slice scanner.

Patient motion continued to be addressed through the middle 1990s by reducing scan times and, in the case of cardiac scanning, triggering data acquisition to occur during the quiescent cardiac phases.54 FBP was also extended, based on developments from MRI reconstruction, to compensate for patient motion when the motion is known.55,56

5. Back to the Future: The Return of Iterative Algorithms and the Ascent of Artificial Intelligence

The hegemony of FBP and its variants was finally challenged in 2009 with the FDA’s approval of the first iterative reconstruction algorithm, Siemens’ iterative reconstruction in image space (IRIS). This represented something of a return to the roots of commercial CT, since Hounsfield’s EMI scanner was originally developed using iterative reconstruction, as described above. However, IRIS was a small step back to the future: the iteration takes place in image space to reduce the noise in an image initially reconstructed using an FBP-like algorithm, and it can be viewed in some ways simply as a sophisticated, noise-reducing post-processing step applied to FBP.

IRIS did, however, represent an important shift in approach that would soon accelerate as all major vendors introduced multiple generations of iterative algorithms over the next few years. Following Willemink and Noël,57 it is helpful to distinguish among algorithms that (1) iterate only in image space (image-restoration algorithms), (2) iterate only in data space (sinogram-restoration algorithms), (3) iterate in both spaces (sinogram- and image-restoration), and, finally, (4) iterate fully between the two spaces with multiple forward and back-projections (fully iterative algorithms). Most vendors developed algorithms in the third category (GE’s ASIR, Philips’ iDose4, and Siemens’ SAFIRE), which allowed for a good balance between image quality and computational burden. However, GE eventually pioneered a fully iterative reconstruction algorithm called VEO.

VEO sought to model the geometry of the source and detector in an effort to improve spatial resolution and reduce partial volume artifacts. It was developed through a long and fruitful industrial-academic partnership between GE and researchers at Purdue and Notre Dame. Figure 5 reproduces a figure from a key joint paper.58 While a significant amount of research on iterative reconstruction for CT was taking place separately in universities and at the major CT vendors, it is significant that a collaboration between the two brought the first such algorithm to commercial fruition. The reasons for this are well captured by the influential article, “Why do commercial CT scanners still employ traditional, filtered back-projection for image reconstruction?” by Pan et al.,59 published appropriately enough just as the FBP era began to wane. The article argues that industrial engineers needed to see evidence that new algorithms would work well on real CT data before investing time in developing such algorithms for commercial use. However, most real commercial CT data are inaccessible to academic researchers because they require numerous proprietary corrections to account for system imperfections. Moving forward required a true partnership, in which academic researchers with novel algorithms were allowed to open the hood of a CT scanner and work with industry employees to model its geometry and apply proprietary corrections.
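
In generic terms (our notation, simplified from formulations such as that of Thibault et al.58), such fully iterative, model-based iterative reconstruction (MBIR) algorithms estimate the image by minimizing a penalized, weighted least-squares objective:

  \hat{x} = \underset{x \ge 0}{\arg\min}\; (y - Ax)^{T} W (y - Ax) + \beta R(x),

where y is the vector of measured line integrals, A is a system matrix modeling the scanner geometry (including the finite focal spot and detector apertures), W is a diagonal weighting matrix reflecting the statistical reliability of each measurement, R(x) is an edge-preserving roughness penalty, and \beta trades data fidelity against noise. The minimization requires repeated forward and back-projections through A, which is the source of both the image-quality gains and the computational burden discussed below.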

Fig. 5. An example of the dose-reduction and resolution-enhancement capabilities of iterative, model-based image reconstruction (MBIR) algorithms. Reconstructed by (a) FBP and (b) MBIR algorithm.58

VEO struggled to find market traction, however, in part because of very long reconstruction times, and eventually GE offered a hybrid algorithm called ASIR-V, while other vendors developed their own fully-iterative and hybrid model-based approaches, struggling to balance image quality improvements with computational burden. In a related vein, some vendors are moving towards so-called artificial-intelligence (AI)-based approaches in which large neural networks are trained to reconstruct low-noise CT images from relatively high-noise raw data using large sets of training pairs.60,61 This might involve, for example, pairing high-dose gold standard images with matched low-dose images so that the system can learn to produce high-quality images from low-dose data. While the training process is very computationally expensive, the process of reconstructing an image using a trained network is non-iterative and can be made efficient depending on the architecture of the network. This relies on the advent of modern GPUs, another example of computational hardware enabling a shift in reconstruction algorithm.
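
As an illustration of the supervised-pairs idea (not any vendor's actual network or training procedure), a minimal PyTorch sketch of training an image-domain denoiser on matched low-dose/high-dose reconstructions might look like the following; the architecture, loss, and hyperparameters are arbitrary placeholders:

```python
import torch
import torch.nn as nn

# Toy image-domain denoising network: a few convolutional layers that map a
# noisy (low-dose) CT image to a residual estimate of its noise.
class Denoiser(nn.Module):
    def __init__(self, channels=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, 1, 3, padding=1),
        )

    def forward(self, x):
        return x - self.net(x)  # residual learning: subtract predicted noise

def train(model, loader, epochs=10, lr=1e-3):
    """loader yields (low_dose, high_dose) image pairs as (N, 1, H, W) tensors."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for low_dose, high_dose in loader:
            opt.zero_grad()
            loss = loss_fn(model(low_dose), high_dose)  # match the high-dose target
            loss.backward()
            opt.step()
    return model
```

Once trained, applying such a network is a single forward pass, consistent with the non-iterative, efficient inference described above.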

What took so long for iterative algorithms to make their way into clinical use? While computational power had obviously grown exponentially since the advent of the EMI Mark I, so too had CT data sizes, with images of 10,000 pixels described in the Hounsfield patent having grown to be image volumes of more than 100 million voxels. Likewise, as scanning grew faster, the number of patients who could be scanned on a given machine grew, increasing the number of scans that needed to be reconstructed. The complexity of iterative reconstruction scales as the square of the image array size since the raw data size typically scales similarly and the size of the system matrix is given by their product. This translates into a 100-million fold increase in CT reconstruction complexity, which turns out to be nearly identical to the growth in computing power over the same period of 1970–2020. Moreover, the growth in CT data size was uneven, with one especially rapid acceleration coming during the “slice wars” of 1998–2004, when the number of rows in multi-slice CT scanners doubled every year, growing 64-fold as scanners evolved from 4 to 256 rows in six years. Computational power only grew about 8 fold in that same six-year period, in line with Moore’s law that computational power doubles every two years.
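
To spell out the arithmetic behind this estimate (using the round figures quoted above): if the per-iteration cost C scales with the product of the number of measurements and the number of unknowns, and both grew by roughly the same factor of 10^4, then

  C \propto N_{\mathrm{rays}} \times N_{\mathrm{voxels}}, \qquad \frac{C_{2020}}{C_{1970}} \approx \left(10^{4}\right)^{2} = 10^{8},

that is, the hundred-million-fold increase in reconstruction complexity cited above.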

Another potential hurdle to adoption of iterative algorithms was regulatory: a change in reconstruction algorithm requires review and approval by regulatory agencies such as the Food and Drug Administration (FDA) in the United States, a somewhat time-consuming and expensive step that is only worth pursuing if the algorithms are likely to give some commercial advantage to the vendor and, better yet, if the vendor can gain approval to make marketing claims about the algorithm’s advantages. This required novel approaches on the part of both vendors and regulators, since the new algorithms were non-linear and existing image quality metrics regarding noise, resolution, and dose reduction were of questionable value in the face of non-linearity. The issues and proposed solutions are well explained in an article by scientists from the FDA’s Center for Devices and Radiological Health, which describes a variety of objective task-based metrics that can be assessed and compared for the purposes of making claims about either image quality improvements or radiation dose reduction under conditions of fixed image quality.62 A wide variety of studies, using physical measures of noise and resolution, radiologists’ subjective assessment of image quality, and more objective measures based in signal detection theory, have shown that iterative reconstruction approaches allow image quality and diagnostic performance to be preserved while reducing radiation dose.63,64

6. Conclusion

A high-level review of the history of CT reconstruction algorithms has been presented. These reconstruction algorithms were developed in part to increase the clinical usefulness of CT. The image quality of CT slices is significantly better than the image quality in EMI’s first scanner and many more slices can be collected per unit time allowing whole-body scanning and snapshot imaging of rapidly moving organs. While it may seem that CT has reached maturity as an imaging modality, all the forces that led to the developments described here are still in play and we expect that they will continue to drive additional improvements in scanner hardware and algorithms to further increase the clinical utility of CT.

Disclosures

Patrick La Riviere has received past research funding from Toshiba Medical Research Institute USA and current research funding from Accuray Inc. Carl Crawford has been an employee of Elscint, GE Medical Systems, and Analogic, and has been a consultant to GE Healthcare, Toshiba Medical Systems, Philips Medical Systems, Mobius Imaging, and Tribogenics.

Acknowledgments

The authors acknowledge helpful discussions from Harry Martz, Ge Wang, and Norbert Pelc, as well as very constructive comments from three anonymous reviewers.

References

1. 

T. H. Newton and D. G. Potts, Eds., Radiology of the Skull and Brain: Technical Aspects of Computed Tomography, Vol. 5, C.V. Mosby Company (1981). Google Scholar

2. 

A. C. Kak and M. Slaney, Principles of Computerized Tomographic Imaging, IEEE Press (1988). Google Scholar

3. 

S. Webb, From the Watching of Shadows: The Origins of Radiological Tomography, IOP Publishing (1990). Google Scholar

4. 

W. A. Kalender, Computed Tomography: Fundamentals, System Technology, Image Quality, Applications, 3rd ed., Publicis Corporate Publishing (2005). Google Scholar

5. 

J. Hsieh, Computed Tomography: Principles, Design, Artifacts, and Recent Advances, 3rd ed., SPIE Press (2015). Google Scholar

6. 

R. N. Bracewell, “Strip integration in radio astronomy,” Aust. J. Phys., 9 198 –217 (1956). https://doi.org/10.1071/PH560198 AUJPAS 0004-9506 Google Scholar

7. 

R. N. Bracewell, The Fourier Transform and Its Applications, McGraw-Hill (1965). Google Scholar

8. 

J. W. Cooley and J. W. Tukey, “An algorithm for the machine calculation of complex Fourier series,” Math. Comput., 19 297 –301 (1965). https://doi.org/10.1090/S0025-5718-1965-0178586-1 MCMPAF 0025-5718 Google Scholar

9. 

R. N. Bracewell and A. C. Riddle, “Inversion of fan-beam scans in radio astronomy,” Astrophys. J., 150 427 –434 (1967). https://doi.org/10.1086/149346 ASJOAB 0004-637X Google Scholar

10. 

R. A. Crowther, D. J. De Rosier and A. Klug, “The reconstruction of a three-dimensional structure from projections and its application to electron microscopy,” Proc. Roy. Soc. Lond. A, 317 319 –340 (1970). https://doi.org/10.1098/rspa.1970.0119 PRLAAZ 1364-5021 Google Scholar

11. 

G. N. Ramachandran and A. V. Lakshminarayanan, “Three-dimensional reconstruction from radiographs and electron micrographs: application of convolutions instead of Fourier transforms,” PNAS, 68 2236 –2240 (1971). https://doi.org/10.1073/pnas.68.9.2236 Google Scholar

12. 

P. Gilbert, “The reconstruction of a three-dimensional structure from projections and its application to electron microscopy II. Direct methods,” Proc. R. Soc. Lond. B Biol. Sci., 182 89 –102 (1972). https://doi.org/10.1098/rspb.1972.0068 Google Scholar

13. 

G. N. Hounsfield, “A method of and apparatus for examination of a body by radiation such as X or gamma radiation,” U.K. Patent No. 1,283,915 (1968/1972).

14. 

W. H. Oldendorf, “Isolated flying spot detection of radiodensity discontinuities-displaying the internal structural pattern of a complex object,” IRE Trans. Bio-Med. Electron., 8 68 –72 (1961). https://doi.org/10.1109/TBMEL.1961.4322854 IRBEAM 0096-1884 Google Scholar

15. 

A. M. Cormack, “Representation of a function by its line integrals, with some radiological applications,” J. Appl. Phys., 34 (9), 2722 –2727 (1963). https://doi.org/10.1063/1.1729798 JAPIAU 0021-8979 Google Scholar

16. 

A. M. Cormack, “Reconstruction of densities from their projections, with applications in radiological physics,” Phys. Med. Biol., 18 195 –207 (1973). https://doi.org/10.1088/0031-9155/18/2/003 PHMBA7 0031-9155 Google Scholar

17. 

H. M. Hudson and R. S. Larkin, “Accelerated image reconstruction using ordered subsets of projection data,” IEEE Trans. Med. Imaging, 13 601 –609 (1994). https://doi.org/10.1109/42.363108 ITMID4 0278-0062 Google Scholar

18. 

E. C. Beckmann, “CT scanning the early days,” Br. J. Radiol., 79 5 –8 (2006). https://doi.org/10.1259/bjr/29444122 BJRAAP 0007-1285 Google Scholar

19. 

C. A. G. Lemay, “Method and apparatus for constructing a representation of a planar slice of body exposed to penetrating radiation,” U.S. Patent No. 3,924,129 (1975).

20. 

P. M. Joseph and R. D. Spital, “A method for correcting bone induced artifacts in computed tomography scanners,” J. Comput. Assist. Tomogr., 2 100 –108 (1978). https://doi.org/10.1097/00004728-197801000-00017 JCATD5 0363-8715 Google Scholar

21. 

R. A. Brooks et al., “Aliasing: a source of streaks in computed tomograms,” J. Comput. Assist. Tomogr., 3 511 –518 (1979). https://doi.org/10.1097/00004728-197908000-00014 JCATD5 0363-8715 Google Scholar

22. 

A. R. Sohval, “X-ray tube having an adjustable focal spot,” U.S. Patent No. 4,689,809 (1987).

23. 

G. H. Glover, “Compton scatter effects in CT reconstructions,” Med. Phys., 9 860 –867 (1982). https://doi.org/10.1118/1.595197 MPHYA6 0094-2405 Google Scholar

24. 

G. Kowalski, “Suppression of ring artifacts in CT fan-beam scanners,” IEEE Trans. Nucl. Sci., 25 1111 –1116 (1978). https://doi.org/10.1109/TNS.1978.4329487 IETNAE 0018-9499 Google Scholar

25. 

D. A. Freundlich, “Ring artifact correction for computerized tomography,” U.S. Patent No. 4,670,840 (1987).

26. 

G. H. Glover and N. J. Pelc, “An algorithm for the reduction of metal clip artifacts in CT reconstructions,” Med. Phys., 8 799 –807 (1981). https://doi.org/10.1118/1.595032 MPHYA6 0094-2405 Google Scholar

27. 

N. J. Pelc and G. H. Glover, “Method for reducing image artifacts due to projection measurement inconsistencies,” U.S. Patent No. 4,580,219 (1986).

28. 

S. C. Moore et al., “Prospectively gated cardiac computed tomography,” Med. Phys., 10 846 –855 (1983). https://doi.org/10.1118/1.595420 MPHYA6 0094-2405 Google Scholar

29. 

D. L. Parker, “Optimal short scan convolution reconstruction for fan-beam CT,” Med. Phys., 9 254 –257 (1982). https://doi.org/10.1118/1.595078 MPHYA6 0094-2405 Google Scholar

30. 

J. Hsieh and T. Flohr, “Computed tomography recent history and future perspectives,” J. Med. Imaging, 8 052109 (2021). https://doi.org/10.1117/1.JMI.8.5.052109 JMEIET 0920-5497 Google Scholar

31. 

C. R. Crawford and K. F. King, “Computed tomography scanning with simultaneous patient translation,” Med. Phys., 17 967 –982 (1990). https://doi.org/10.1118/1.596464 MPHYA6 0094-2405 Google Scholar

32. 

W. A. Kalender et al., “Spiral volumetric CT with single-breath-hold technique, continuous transport, and continuous scanner rotation,” Radiology, 176 181 –183 (1990). https://doi.org/10.1148/radiology.176.1.2353088 RADLAX 0033-8419 Google Scholar

33. 

I. Mori, “Computerized tomographic apparatus utilizing a radiation source,” U.S. Patent No. 4,630,202 (1986).

34. 

Y. Bresler and C. J. Skrabacz, “Optimal interpolation in helical scan 3D computed tomography,” in Proc. ICASSP, 1472 –1475 (1989). https://doi.org/10.1109/ICASSP.1989.266718 Google Scholar

35. 

J. H. Kinsey et al., “The DSR: a high temporal resolution volumetric roentgenographic CT scanner,” Herz, 5 177 –188 (1980). HERZDW Google Scholar

36. 

K. L. Dinwiddie, J. A. Racz and E. J. Seppi, “Rotary feed for tomographic scanning apparatus,” U.S. Patent No. 4,201,430 (1980).

37. 

G. A. Davis et al., “Axial tomographic apparatus,” U.S. Patent No. 4,093,859 (1978).

38. 

E. I. Schwartz, Juice: The Creative Fuel That Drives World-Class Invention, Harvard Business School Press (2004). Google Scholar

39. 

H. Hu, “Multi-slice helical CT: scan and reconstruction,” Med. Phys., 26 5 –18 (1999). https://doi.org/10.1118/1.598470 Google Scholar

40. 

G. L. Larson, C. C. Ruth and C. R. Crawford, “Nutating slice CT image reconstruction apparatus and method,” US Patent No. 5,802,134 (1998).

41. 

F. Noo, M. Defrise and R. Clackdoyle, “Single-slice rebinning method for helical cone beam CT,” Phys. Med. Biol., 44 561 (1999). https://doi.org/10.1088/0031-9155/44/2/019 PHMBA7 0031-9155 Google Scholar

42. 

M. Kachelriess, S. Schaller and W. Kalender, “Advanced single-slice rebinning in cone beam spiral CT,” Med. Phys., 27 754 –772 (2000). https://doi.org/10.1118/1.598938 MPHYA6 0094-2405 Google Scholar

43. 

A. Katsevich, “Exact FBP-type inversion algorithm for spiral CT,” in Proc. Sixth Int. Meeting Fully Three-Dimens. Image Reconstr. in Radiol. and Nucl. Med., 1 –4 (2001). Google Scholar

44. 

A. Katsevich, “Theoretically exact filtered backprojection-type inversion algorithm for spiral CT,” SIAM J. App. Math., 62 2012 –2026 (2002). https://doi.org/10.1137/S0036139901387186 SMJMAP 0036-1399 Google Scholar

45. 

Y. Zou and X. Pan, “Exact image reconstruction on PI-lines from minimum data in helical cone beam CT,” Phys. Med. Biol., 49 941 –959 (2004). https://doi.org/10.1088/0031-9155/49/6/006 PHMBA7 0031-9155 Google Scholar

46. 

F. Noo, R. Clackdoyle and J. Pack, “A two-step Hilbert transform method for 2D image reconstruction,” Phys. Med. Biol., 49 3903 (2004). https://doi.org/10.1088/0031-9155/49/17/006 PHMBA7 0031-9155 Google Scholar

47. 

Y. Ye et al., “A general exact reconstruction for cone-beam CT via backprojection-filtration,” IEEE Trans. Med. Imaging, 24 1190 –1198 (2005). https://doi.org/10.1109/TMI.2005.853626 ITMID4 0278-0062 Google Scholar

48. 

T. Zhuang et al., “Fan-beam and cone beam image reconstruction via filtering the backprojection image of differentiated projection data,” Phys. Med. Biol., 49 5489 –5503 (2004). https://doi.org/10.1088/0031-9155/49/24/007 PHMBA7 0031-9155 Google Scholar

49. 

P. Biswal and S. Banerjee, “Implementation of Katsevich algorithm for helical cone-beam computed tomography using CORDIC,” in Proc. Int. Conf. Syst. Med. and Biol., 313 –317 (2010). Google Scholar

50. 

D. Xia et al., “Noise properties of chord-image reconstruction,” IEEE Trans. Med. Imaging, 26 1328 –1344 (2007). https://doi.org/10.1109/TMI.2007.898567 ITMID4 0278-0062 Google Scholar

51. 

L. A. Feldkamp, L. C. Davis and J. W. Kress, “Practical cone-beam algorithm,” J. Opt. Soc. Am. A, 1 612 –619 (1984). https://doi.org/10.1364/JOSAA.1.000612 JOAOD6 0740-3232 Google Scholar

52. 

G. Wang et al., “Scanning cone beam reconstruction algorithms for x-ray microtomography,” Proc. SPIE, 1556 99 –112 (1991). https://doi.org/10.1117/12.134891 PSISDG 0277-786X Google Scholar

53. 

G. Wang et al., “A general cone beam reconstruction algorithm,” IEEE Trans. Med. Imaging, 12 486 –496 (1993). https://doi.org/10.1109/42.241876 ITMID4 0278-0062 Google Scholar

54. 

C. J. Ritchie et al., “Predictive respiratory gating: a new method to reduce motion artifacts in CT,” Radiology, 190 847 –852 (1994). https://doi.org/10.1148/radiology.190.3.8115638 RADLAX 0033-8419 Google Scholar

55. 

E. M. Haacke and J. L. Patrick, “Reducing motion artifacts in two-dimensional Fourier transform imaging,” Magn. Reson. Imaging, 4 359 –376 (1986). https://doi.org/10.1016/0730-725X(86)91046-5 MRIMDQ 0730-725X Google Scholar

56. 

C. J. Ritchie et al., “Correction of computed tomography motion artifacts using pixel-specific back-projection,” IEEE Trans. Med. Imaging, 15 333 –342 (1996). https://doi.org/10.1109/42.500142 ITMID4 0278-0062 Google Scholar

57. 

M. J. Willemink and P. B. Noël, “The evolution of image reconstruction for CT—from filtered back projection to artificial intelligence,” Eur. Radiol., 29 2185 –2195 (2019). https://doi.org/10.1007/s00330-018-5810-7 Google Scholar

58. 

J.-B. Thibault et al., “A three-dimensional statistical approach to improved image quality for multislice helical CT,” Med. Phys., 34 4526 –4544 (2007). https://doi.org/10.1118/1.2789499 MPHYA6 0094-2405 Google Scholar

59. 

X. Pan, E. Y. Sidky and M. Vannier, “Why do commercial CT scanners still employ traditional, filtered back-projection for image reconstruction?,” Inverse Prob., 25 123009 (2009). https://doi.org/10.1088/0266-5611/25/12/123009 INPEEY 0266-5611 Google Scholar

60. 

G. Wang, “A perspective on deep imaging,” IEEE Access, 4 8914 –8924 (2016). https://doi.org/10.1109/ACCESS.2016.2624938 Google Scholar

61. 

J. Hsieh et al., “A new era of image reconstruction: TrueFidelity technical white paper on deep learning image reconstruction,” (2019). https://www.gehealthcare.com/-/jssmedia/040dd213fa89463287155151fdb01922.pdf Google Scholar

62. 

J. Y. Vaishnav et al., “Objective assessment of image quality and dose reduction in CT iterative reconstruction,” Med. Phys., 41 071904 (2014). https://doi.org/10.1118/1.4881148 MPHYA6 0094-2405 Google Scholar

63. 

P. J. Pickhardt et al., “Abdominal CT with Model-Based Iterative Reconstruction (MBIR): initial results of a prospective trial comparing ultralow-dose with standard-dose imaging,” Am. J. Roentgenol., 199 1266 –1274 (2012). https://doi.org/10.2214/AJR.12.9382 AJROAM 0092-5381 Google Scholar

64. 

J.G. Fletcher et al., “Observer performance with varying radiation dose and reconstruction methods for detection of hepatic metastases,” Radiology, 289 455 –464 (2018). https://doi.org/10.1148/radiol.2018180125 RADLAX 0033-8419 Google Scholar

Biography

Patrick J. La Riviere is a professor in the Department of Radiology and the Committee on Medical Physics at the University of Chicago. He received his AB degree in physics from Harvard University and his PhD from the Graduate Programs in Medical Physics at the University of Chicago. His research interests include tomographic reconstruction in computed tomography, x-ray fluorescence computed tomography, and computational microscopy.

Carl R. Crawford is president of Csuptwo, a consulting company for medical imaging and homeland security. Previously at Analogic, he developed the reconstruction and explosive detection algorithms for a computerized tomographic (CT) scanner deployed in airports. At General Electric Medical Systems he developed technologies for helical scanning and physiological motion compensation. At Elscint he developed technology for cardiac CT scanners. He has a PhD in electrical engineering from Purdue University.

© 2021 Society of Photo-Optical Instrumentation Engineers (SPIE)
Patrick J. La Rivière and Carl R. Crawford "From EMI to AI: a brief history of commercial CT reconstruction algorithms," Journal of Medical Imaging 8(5), 052111 (6 October 2021). https://doi.org/10.1117/1.JMI.8.5.052111
Received: 28 June 2021; Accepted: 25 September 2021; Published: 6 October 2021