In recent years, the field of image super-resolution has mainly focused on the single-image super-resolution (SISR) task, which estimates a high-resolution (HR) image from a single low-resolution (LR) input. Due to the ill-posedness of the SISR problem, these methods are limited to recovering high-frequency details by learning image priors. Multi-frame super-resolution (MFSR), in contrast, makes it possible to reconstruct rich details by exploiting the spatial and temporal differences between images. With the increasing popularity of array camera technology, this key advantage makes MFSR an important problem for practical applications. We propose a new architecture for multi-frame image super-resolution. Our network takes multiple noisy images as input and generates a denoised, super-resolved RGB image as output. First, we align the input frames by estimating dense per-pixel optical flow between them and construct an adaptive fusion module to merge the information from all frames. Then we build a feature fusion network that simultaneously fuses the deep features of the multiple LR images and the internal features of the initial high-resolution estimate. For evaluation on real-world data, we use the BurstSR dataset, which contains real images captured by smartphones and high-resolution SLR cameras, to demonstrate the effectiveness of the proposed multi-frame super-resolution algorithm.
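The abstract describes a three-stage burst pipeline: flow-based alignment, adaptive frame fusion, and feature fusion against an initial HR estimate before reconstruction. Below is a minimal PyTorch-style sketch of that flow of data only; every module (the flow estimator, the weight network, the encoder, and the reconstruction head) is a hypothetical single-layer stand-in, and the layer configuration, reference-frame choice, and upsampling factor are assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def warp(frame, flow):
    """Backward-warp a frame (B, 3, H, W) with a dense flow field (B, 2, H, W)."""
    b, _, h, w = frame.shape
    ys, xs = torch.meshgrid(
        torch.arange(h, device=frame.device),
        torch.arange(w, device=frame.device),
        indexing="ij",
    )
    grid = torch.stack((xs, ys), dim=0).float()           # base pixel coordinates
    coords = grid.unsqueeze(0) + flow                      # add per-pixel offsets
    coords_x = 2.0 * coords[:, 0] / (w - 1) - 1.0          # normalize to [-1, 1]
    coords_y = 2.0 * coords[:, 1] / (h - 1) - 1.0
    grid_norm = torch.stack((coords_x, coords_y), dim=-1)  # (B, H, W, 2)
    return F.grid_sample(frame, grid_norm, align_corners=True)


class BurstSRNet(nn.Module):
    """Hypothetical sketch: align -> adaptively fuse -> feature-fuse -> reconstruct."""

    def __init__(self, scale=4, feat=64):
        super().__init__()
        self.flow_net = nn.Conv2d(6, 2, 3, padding=1)     # per-pair flow (stand-in)
        self.weight_net = nn.Conv2d(3, 1, 3, padding=1)   # adaptive fusion weights
        self.encoder = nn.Conv2d(3, feat, 3, padding=1)   # shared feature extractor
        self.fusion = nn.Conv2d(2 * feat, feat, 3, padding=1)
        self.upsample = nn.Sequential(                    # residual reconstruction head
            nn.Conv2d(feat, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),
        )
        self.scale = scale

    def forward(self, burst):                             # burst: (B, N, 3, H, W)
        b, n, c, h, w = burst.shape
        ref = burst[:, 0]                                  # first frame as reference
        aligned, weights = [], []
        for i in range(n):
            frame = burst[:, i]
            flow = self.flow_net(torch.cat((ref, frame), dim=1))
            warped = warp(frame, flow) if i > 0 else frame
            aligned.append(warped)
            weights.append(self.weight_net(warped))
        # Adaptive fusion: per-pixel softmax weights over the aligned frames.
        weights = torch.softmax(torch.stack(weights, dim=1), dim=1)
        fused_lr = (torch.stack(aligned, dim=1) * weights).sum(dim=1)

        # Initial HR estimate whose internal features are fused with
        # the deep LR features before residual reconstruction.
        init_hr = F.interpolate(fused_lr, scale_factor=self.scale,
                                mode="bicubic", align_corners=False)
        lr_feat = self.encoder(fused_lr)                              # (B, F, H, W)
        hr_feat = F.avg_pool2d(self.encoder(init_hr), self.scale)     # back to (H, W)
        feat = self.fusion(torch.cat((lr_feat, hr_feat), dim=1))
        return init_hr + self.upsample(feat)               # denoised, super-resolved RGB
```

As a usage example, `BurstSRNet()(torch.randn(2, 8, 3, 48, 48))` would map an 8-frame noisy burst to a `(2, 3, 192, 192)` output under the assumed 4x scale; the real network would replace each stand-in convolution with its dedicated sub-network.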