Learning stochastic object model from noisy imaging measurements using AmbientGANs (4 March 2019)
Abstract
The objective optimization of image-derived statistics, including the test statistic of an observer for specific decision tasks, requires a characterization of all sources of variability in the measured data. To accomplish this, it is necessary to establish a stochastic object model (SOM) that describes the variability within a group of objects to be imaged. In order for the SOM to be realistic, it is desirable to establish it by use of experimental image data, as opposed to establishing it in a non-data-driven manner. Deep learning methods that employ generative adversarial networks (GANs) hold promise for learning SOMs whose generated images match the distribution of the training image data. However, because experimental data recorded by an imaging system represent noisy and indirect measurements of the object, conventional GANs cannot be directly employed for this task. Recently, an augmented GAN architecture named AmbientGAN was proposed that can characterize a distribution of images from noisy and indirect measurements of them, given knowledge of the measurement operator. In this work, for the first time, we investigate AmbientGANs for establishing SOMs by use of noisy imaging measurements. A canonical tomographic imaging system that is described by a two-dimensional Radon transform model is investigated. The AmbientGAN is evaluated by performing binary signal detection tasks that employ the generated images and true images.
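The measurement model underlying this setup — noisy, indirect data formed by a known operator — can be illustrated with a minimal numerical sketch. Below, the 2D Radon transform is approximated by rotating the object and summing along one axis; the operator implementation, the noise level, and the toy disk object are illustrative assumptions, not the authors' simulation setup.

```python
import numpy as np
from scipy.ndimage import rotate

def radon_measure(obj, angles_deg, noise_std=0.05, rng=None):
    """Noisy indirect measurement g = H(f) + n, where H is a crude
    discrete 2D Radon transform (rotate-and-sum approximation).
    This is a sketch, not the paper's exact measurement operator."""
    rng = np.random.default_rng(rng)
    # One parallel-beam projection per view angle.
    sinogram = np.stack([
        rotate(obj, a, reshape=False, order=1).sum(axis=0)
        for a in angles_deg
    ])
    # Additive i.i.d. Gaussian measurement noise (assumed model).
    return sinogram + noise_std * rng.standard_normal(sinogram.shape)

# Toy object: a centered disk, a stand-in for one sample from the SOM.
n = 64
yy, xx = np.mgrid[:n, :n]
disk = ((xx - n / 2) ** 2 + (yy - n / 2) ** 2 < (n / 4) ** 2).astype(float)

angles = np.linspace(0.0, 180.0, 60, endpoint=False)
g = radon_measure(disk, angles, noise_std=0.05, rng=0)
print(g.shape)  # one projection per view angle: (60, 64)
```

In the AmbientGAN setting, it is this `g` (rather than the object itself) that the discriminator compares against real measured data, with the generator's output passed through the same known operator.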
© (2019) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Weimin Zhou, Sayantan Bhadra, Frank Brooks, and Mark A. Anastasio "Learning stochastic object model from noisy imaging measurements using AmbientGANs", Proc. SPIE 10952, Medical Imaging 2019: Image Perception, Observer Performance, and Technology Assessment, 109520M (4 March 2019); https://doi.org/10.1117/12.2512633
