Learning a deep fully connected neural network from a single image for edit propagation
Abstract

We introduce a simple but effective deep fully connected neural network (FNN) that solves edit propagation as a multiclass pixel-level classification task. We construct the feature space from three-dimensional normalized RGB vectors (one-dimensional for grayscale images) and spatial coordinates. Our deep FNN-based model consists of four stages: color feature extraction, spatial feature extraction, feature combination, and classifier estimation. We train the model using only the features within the regions labeled by the user's strokes in a single image. Our method then directly outputs the edit propagation result after a single forward pass, without any refinement step. It automatically determines the importance of each image feature across the whole image by jointly considering the feature vectors from the user strokes. Extensive experiments demonstrate that the proposed algorithm achieves superior performance over state-of-the-art methods while remaining simple and efficient.
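The abstract's four-stage pipeline (color branch, spatial branch, feature combination, softmax classifier) can be sketched as a small fully connected network trained only on stroke-labeled pixels. The sketch below is a minimal illustration, not the paper's implementation: the hidden size, learning rate, epoch count, and plain gradient-descent training are assumed values chosen for readability.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with max-subtraction for numerical stability."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

class EditPropagationFNN:
    """Illustrative two-branch FNN: a color branch (normalized RGB),
    a spatial branch (normalized x, y), concatenated features, and a
    softmax classifier over edit labels. Hyperparameters are assumptions."""
    def __init__(self, n_classes, hidden=32, seed=0):
        rng = np.random.default_rng(seed)
        self.Wc = rng.normal(0, 0.1, (3, hidden))       # color branch weights
        self.Ws = rng.normal(0, 0.1, (2, hidden))       # spatial branch weights
        self.Wo = rng.normal(0, 0.1, (2 * hidden, n_classes))
        self.bc = np.zeros(hidden)
        self.bs = np.zeros(hidden)
        self.bo = np.zeros(n_classes)

    def forward(self, rgb, xy):
        hc = np.maximum(0, rgb @ self.Wc + self.bc)     # color feature extraction
        hs = np.maximum(0, xy @ self.Ws + self.bs)      # spatial feature extraction
        h = np.concatenate([hc, hs], axis=1)            # feature combination
        return softmax(h @ self.Wo + self.bo), (hc, hs, h)

    def fit(self, rgb, xy, labels, epochs=500, lr=0.5):
        """Train on stroke-labeled pixels only (softmax cross-entropy)."""
        Y = np.eye(self.bo.size)[labels]                # one-hot stroke labels
        for _ in range(epochs):
            P, (hc, hs, h) = self.forward(rgb, xy)
            dZ = (P - Y) / len(labels)                  # output-layer gradient
            dH = dZ @ self.Wo.T
            self.Wo -= lr * (h.T @ dZ)
            self.bo -= lr * dZ.sum(axis=0)
            dhc = dH[:, :hc.shape[1]] * (hc > 0)        # ReLU gradient, color
            dhs = dH[:, hc.shape[1]:] * (hs > 0)        # ReLU gradient, spatial
            self.Wc -= lr * (rgb.T @ dhc)
            self.bc -= lr * dhc.sum(axis=0)
            self.Ws -= lr * (xy.T @ dhs)
            self.bs -= lr * dhs.sum(axis=0)

    def predict(self, rgb, xy):
        """One forward pass over all pixels propagates the edit labels."""
        P, _ = self.forward(rgb, xy)
        return P.argmax(axis=1)
```

After training on the handful of stroke pixels, a single forward pass over every pixel's 5-D feature vector (normalized RGB plus coordinates) assigns each pixel an edit label, mirroring the no-refinement propagation described above.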

© 2020 SPIE and IS&T 1017-9909/2020/$28.00
Xujie Li and Yandan Wang "Learning a deep fully connected neural network from a single image for edit propagation," Journal of Electronic Imaging 29(3), 033002 (7 May 2020). https://doi.org/10.1117/1.JEI.29.3.033002
Received: 24 December 2019; Accepted: 22 April 2020; Published: 7 May 2020
Journal article, 14 pages.

