Underwater image restoration based on perceptually optimized generative adversarial network (30 June 2020)

Underwater images are degraded by the complexity of the underwater environment and by scattering and absorption in the water medium. The degradation manifests as low contrast, color distortion, and blur, which make underwater vision tasks difficult. A perceptually optimized cycle-consistent generative adversarial network (CycleGAN-VGG) is proposed to restore distorted underwater images. Because the method adopts the CycleGAN framework, it does not require pairs of distorted underwater images and corresponding ground-truth images for training. Inspired by perceptual loss, we combine perceptual loss, cycle-consistency loss, and adversarial loss. This multiterm objective guarantees that the output retains the content and structure of the input while its color resembles that of ground-truth images. In objective assessments, our method scores significantly higher than competing methods on colorfulness and contrast metrics, confirming that it effectively restores the color of underwater scenes, enhances image details, and outperforms other state-of-the-art methods.
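The multiterm objective described in the abstract can be sketched as a weighted sum of the three loss terms. The following is a minimal illustrative sketch only: the weights `lambda_cyc` and `lambda_perc`, and the use of mean-squared error as a stand-in for each term, are assumptions for demonstration, not values or definitions taken from the paper.

```python
import numpy as np

def mse(a, b):
    """Mean-squared error between two arrays (illustrative stand-in for a loss term)."""
    return float(np.mean((np.asarray(a) - np.asarray(b)) ** 2))

def combined_loss(adv_term, cycle_term, perc_term,
                  lambda_cyc=10.0, lambda_perc=1.0):
    """Weighted sum of adversarial, cycle-consistency, and perceptual losses.

    lambda_cyc and lambda_perc are hypothetical weights chosen for this
    sketch; the abstract does not state the paper's actual weighting.
    """
    return adv_term + lambda_cyc * cycle_term + lambda_perc * perc_term

# Toy example with arrays standing in for images and VGG-style features.
rng = np.random.default_rng(0)
x = rng.random((4, 4))          # "input" underwater image
x_rec = x + 0.1                 # "reconstruction" after a full CycleGAN cycle
feat_out = rng.random(8)        # features of the restored output (stand-in)
feat_ref = rng.random(8)        # features of a reference image (stand-in)

adv = 0.5                       # placeholder adversarial loss value
cyc = mse(x, x_rec)             # cycle-consistency term
perc = mse(feat_out, feat_ref)  # perceptual term
total = combined_loss(adv, cyc, perc)
```

The cycle-consistency term pulls the round-trip reconstruction back toward the input, while the perceptual term compares feature representations rather than raw pixels, which is what allows the network to match color and appearance without paired training data.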

© 2020 SPIE and IS&T 1017-9909/2020/$28.00
Peng Wang, Haixiu Chen, Weihua Xu, and Suqin Jin "Underwater image restoration based on perceptually optimized generative adversarial network," Journal of Electronic Imaging 29(3), 033020 (30 June 2020).
Received: 19 February 2020; Accepted: 16 June 2020; Published: 30 June 2020
