Open Access
1 May 2005
Color filter array demosaicking with local color distribution linearity
Yuanjie Zheng, Stephen Lin, Jie Yang
Abstract
We propose a novel demosaicking method based on the linearity property of local color distributions. With the proposed technique, the color filter array can be demosaicked with fewer "confetti"-type errors and fringe artifacts than with many current demosaicking methods. Furthermore, edge details are well preserved.

1.

Introduction

Due to hardware limitations, the single-chip CCD or CMOS solid state sensor array in digital cameras does not measure a complete triplet of red, green, and blue color values for each pixel in an image. Instead, it captures a sparsely sampled image of each of the color planes with a sensor whose surface is covered with a color filter array (CFA). To produce a full RGB image from these subsampled color values, CFA demosaicking is then used to reconstruct the original colors.

The Bayer array1 shown in Fig. 1 is one of the many typical CFA patterns used in digital still cameras. A variety of methods have been proposed for demosaicking such a pattern. The simplest one is linear interpolation, which does not maintain edge information well. More advanced methods2,3,4 perform CFA interpolation in a manner that preserves edge details.
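For reference, the simplest scheme can be sketched in a few lines of Python (our own illustrative code, not taken from the letter): each missing green sample is replaced by the average of its available 4-neighbors, with the color plane assumed to hold zeros at unsampled positions.

```python
import numpy as np

def bilinear_fill(channel, mask):
    """Fill missing samples of one color plane by averaging the available
    4-neighbors (plain linear interpolation; no edge handling).
    channel: float array, zero where unsampled; mask: True where sampled."""
    padded = np.pad(channel, 1)
    pmask = np.pad(mask.astype(float), 1)
    # Sum of the four neighbors, and count of those actually sampled
    acc = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
           padded[1:-1, :-2] + padded[1:-1, 2:])
    cnt = (pmask[:-2, 1:-1] + pmask[2:, 1:-1] +
           pmask[1:-1, :-2] + pmask[1:-1, 2:])
    out = channel.astype(float).copy()
    missing = ~mask
    out[missing] = acc[missing] / np.maximum(cnt[missing], 1)
    return out

# Green samples of a Bayer mosaic lie on a checkerboard
mask = (np.indices((4, 4)).sum(axis=0) % 2) == 0
green = np.where(mask, 100.0, 0.0)
filled = bilinear_fill(green, mask)  # every pixel becomes 100.0
```

This plain averaging blurs across edges, which is exactly the weakness the edge-directed methods discussed below address.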

Fig. 1

Sample Bayer pattern.


A property of many local edge regions is the linearity of their color distributions in RGB space,5 which also holds for homogeneous regions. We capitalize on this linearity property of local color distributions to produce a novel demosaicking method that yields fewer demosaicking artifacts while preserving edge details better than many current demosaicking methods.

2.

Linearity Property of Local Color Distributions

As described in Ref. 5, because of the limited spatial resolution of the image array, the image-plane area of an edge pixel will generally image portions of both regions that bound the edge. For an edge pixel that lies between two regions having distinct RGB color vectors I1 and I2, its measured RGB color vector I0 should be a linear combination of the bounding region colors:

Eq. (1)

I0 = α·I1 + (1 − α)·I2
where α is a value in the interval [0,1]. According to this local linearity property, I0 should be located on the line segment between I1 and I2 in 3-D RGB space. The linearity property also suggests that local changes in the three color components should be consistent with one another, expressed as

Eq. (2)

(r0 − r1)/(r2 − r0) = (g0 − g1)/(g2 − g0) = (b0 − b1)/(b2 − b0)
where rk, gk, and bk (k = 0, 1, 2) denote respectively the red, green, and blue values of Ik.
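As a quick numerical check of Eq. (2), a color formed as a convex blend of two endpoint colors yields the same component-wise ratio, equal to (1 − α)/α, in every channel (a small Python sketch; the function and variable names are ours):

```python
# Verify the local linearity property of Eq. (2): a blend
# I0 = a*I1 + (1-a)*I2 gives equal ratios (c0-c1)/(c2-c0)
# across the R, G, and B channels.

def blend(i1, i2, a):
    """Convex combination of two RGB triples."""
    return tuple(a * c1 + (1 - a) * c2 for c1, c2 in zip(i1, i2))

def channel_ratios(i0, i1, i2):
    """Return (c0 - c1)/(c2 - c0) for each color channel."""
    return tuple((c0 - c1) / (c2 - c0) for c0, c1, c2 in zip(i0, i1, i2))

i1 = (200.0, 40.0, 10.0)   # color of region 1
i2 = (20.0, 160.0, 250.0)  # color of region 2
i0 = blend(i1, i2, 0.3)    # edge pixel mixing 30% of region 1

r = channel_ratios(i0, i1, i2)
# all three ratios equal (1 - 0.3)/0.3 ≈ 2.333
```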

In this work, only three consecutive pixels on a line in the CCD array tessellation are regarded as complying with the linearity property. For example, in Fig. 1, I21, I22, and I23 should be linear with regard to 4-connectivity, and I11, I22, and I33 should be linear in the sense of 8-connectivity.

3.

Linearity in Demosaicking

The linearity property shown in Eq. (2) describes expected relationships among the color components of neighboring pixels. Missing components can be determined by incorporating the linearity property into the demosaicking problem.

The green channel is first interpolated. Referring to Fig. 1, we estimate G34 of a red CFA pixel by first computing α1=|G35−G33|, α2=|G44−G24|, β1=|B43−B25|, and β2=|B45−B23|. These quantities are used to determine whether pixel I34 is located on a vertical, horizontal, or diagonal edge. The following estimates are then used for the missing green pixel value:

Eq. (3)

G34 =
  (G33 + G35)/2  if α1 = MP
  (G24 + G44)/2  if α2 = MP
  (G24 + G33)/2  if (β1 = MP) & (|Bavg1 − B23| < |Bavg1 − B45|)
  (G35 + G44)/2  if (β1 = MP) & (|Bavg1 − B45| < |Bavg1 − B23|)
  (G24 + G35)/2  if (β2 = MP) & (|Bavg2 − B25| < |Bavg2 − B43|)
  (G33 + G44)/2  if (β2 = MP) & (|Bavg2 − B43| < |Bavg2 − B25|)
where MP = min(α1, α2, β1, β2), Bavg1 = (B25 + B43)/2, and Bavg2 = (B23 + B45)/2. In Eq. (3), the last four cases correspond to diagonal edges. For example, a diagonal edge running from lower left to upper right is handled by the third and fourth cases. For this kind of edge, I34 is first grouped with either the upper-left or lower-right triangle formed by the edge in its 8-neighborhood, depending on which triangle has the more similar blue value. G34 is then estimated from the known green values in the selected triangle. The green channel value for a blue CFA pixel can be interpolated similarly.
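The case analysis of Eq. (3) translates directly into code. Below is a minimal sketch for the red CFA pixel I34 (function and argument names are ours; ties in the minimum are resolved in the order listed, which the letter does not specify):

```python
def estimate_g34(g24, g33, g35, g44, b23, b25, b43, b45):
    """Edge-directed estimate of the missing green value at a red
    CFA pixel, following the case analysis of Eq. (3)."""
    a1 = abs(g35 - g33)   # horizontal green gradient
    a2 = abs(g44 - g24)   # vertical green gradient
    b1 = abs(b43 - b25)   # diagonal: lower-left / upper-right
    b2 = abs(b45 - b23)   # diagonal: upper-left / lower-right
    mp = min(a1, a2, b1, b2)
    b_avg1 = (b25 + b43) / 2
    b_avg2 = (b23 + b45) / 2
    if a1 == mp:          # horizontal edge: average left/right greens
        return (g33 + g35) / 2
    if a2 == mp:          # vertical edge: average up/down greens
        return (g24 + g44) / 2
    if b1 == mp:          # diagonal edge: pick the more similar triangle
        if abs(b_avg1 - b23) < abs(b_avg1 - b45):
            return (g24 + g33) / 2   # upper-left triangle
        return (g35 + g44) / 2       # lower-right triangle
    if abs(b_avg2 - b25) < abs(b_avg2 - b43):
        return (g24 + g35) / 2       # upper-right triangle
    return (g33 + g44) / 2           # lower-left triangle

# A flat horizontal edge: left/right greens agree, so G34 = 100
estimate_g34(50, 100, 100, 50, 30, 30, 30, 30)  # -> 100.0
```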

After demosaicking the green color plane, the blue and red values of green CFA pixels are then estimated using the linearity property as follows, using I44 as an example.

Eq. (4)

B44 =
  (B45 + TB·B43)/(1 + TB)  if (TB ≠ −1) & (TB ≠ Inf)
  (G44/G43)·B43            if (TB = −1) & (G43 ≠ 0)
  B43                      if (TB = −1) & (G43 = 0)
  B43                      if TB = Inf
where
TB =
  (G45 − G44)/(G44 − G43)  if G44 ≠ G43
  Inf                      if G44 = G43
R44 can be determined analogously to Eq. (4) from the known green and red values of I54, I44, and I34.
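A sketch of the Eq. (4) computation, written generically so it applies to either chroma channel at a green CFA pixel (names are ours; for B44 the arguments would be B43, B45 and G43, G44, G45):

```python
INF = float("inf")

def interp_from_linearity(c_left, c_right, g_left, g_mid, g_right):
    """Estimate the missing chroma value at a green CFA pixel from its
    two known neighbors, using the linearity ratio T of Eq. (4).
    c_left/c_right: known blue (or red) neighbors; g_*: green values."""
    if g_mid == g_left:
        t = INF
    else:
        t = (g_right - g_mid) / (g_mid - g_left)
    if t == INF:
        return c_left  # degenerate slope: copy the left neighbor
    if t == -1:
        # the linearity formula would divide by zero;
        # fall back to a green-channel ratio
        return (g_mid / g_left) * c_left if g_left != 0 else c_left
    return (c_right + t * c_left) / (1 + t)

# Greens 10, 20, 40 give T = 2; the estimate lands on the line
# through the neighboring chroma samples:
interp_from_linearity(100, 160, 10, 20, 40)  # -> 120.0
```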

Linearity is also used to estimate the missing red values for blue CFA pixels, and the blue values for red CFA pixels. Using I54 as an example, the blue value of a red CFA pixel is interpolated as

Eq. (5)

B54 = (B54H + B54V)/2
where B54H is estimated from pixels I53, I54, and I55 with the method of Eq. (4), and B54V is determined similarly from pixels I44, I54, and I64.
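Eq. (5) thus averages a horizontal and a vertical Eq. (4)-style estimate. A brief sketch (names are ours; for simplicity the helper handles only the regular case T ≠ −1, G44 ≠ G43, and the blue values at the neighboring green pixels are assumed to have been recovered in the previous step):

```python
def linear_estimate(c_left, c_right, g_left, g_mid, g_right):
    """Eq. (4)-style estimate of the middle chroma value
    (regular case only: assumes T != -1 and g_mid != g_left)."""
    t = (g_right - g_mid) / (g_mid - g_left)
    return (c_right + t * c_left) / (1 + t)

def estimate_b54(b53, b55, b44, b64, g53, g54, g55, g44, g64):
    """Eq. (5): average the horizontal and vertical Eq. (4)
    estimates of the blue value at the red CFA pixel I54."""
    b_h = linear_estimate(b53, b55, g53, g54, g55)  # from I53, I54, I55
    b_v = linear_estimate(b44, b64, g44, g54, g64)  # from I44, I54, I64
    return (b_h + b_v) / 2
```

Averaging the two directional estimates damps the error when one of the two pixel triples happens to straddle an edge.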

4.

Results

In our experiments, all test images are sampled with the Bayer CFA pattern and then reconstructed in RGB color space by each of the demosaicking methods under comparison.

In Fig. 2, we display the results of the Hamilton method,3 the Gunturk method,2 bilinear interpolation, and our method on a real color image. For clarity, we highlight a patch of the image and show it at a larger scale. Bilinear interpolation produces many “confetti”-type artifacts. Fringe artifacts, also known as zipper artifacts, are evident in the results of the Gunturk method. For this image, the Hamilton method performs as well as ours, both producing far fewer artifacts.

Fig. 2

Demosaicked results by some current demosaicking methods and ours on a real image.


More than 50 real images were tested in our experiments, and in most cases we found our method to be less susceptible to edge artifacts than the selected state-of-the-art demosaicking methods.2,3,4 At the same time, our method preserves edge details well. Some of the test images and demosaicking results are available on our webpage.6

Acknowledgments

This work was done while the first author was an intern at Microsoft Research Asia.

REFERENCES

1. 

B. E. Bayer, “Color imaging array,” U.S. Patent No. 3971065 (1976).

2. 

B. K. Gunturk, Y. Altunbasak, and R. M. Mersereau, “Color plane interpolation using alternating projections,” IEEE Trans. Image Process. 11(9), 997–1013 (2002).

3. 

J. F. Hamilton and J. E. Adams, “Adaptive color plane interpolation in single sensor color electronic camera,” U.S. Patent No. 5629734 (1997).

4. 

R. Ramanath, W. E. Snyder, and G. L. Bilbro, “Demosaicking methods for Bayer color arrays,” J. Electron. Imaging 11(3), 306–315 (2002).

5. 

S. Lin, J. Gu, S. Yamazaki, and H.-Y. Shum, “Radiometric calibration from a single image,” Proc. IEEE Computer Society Conference on Computer Vision and Pattern Recognition, pp. 938–945 (2004).

6. 

www.pami.sjtu.edu.cn/people/yjzheng/research-Demosaicking.htm
©(2005) Society of Photo-Optical Instrumentation Engineers (SPIE)
Yuanjie Zheng, Stephen Lin, and Jie Yang "Color filter array demosaicking with local color distribution linearity," Optical Engineering 44(5), 050501 (1 May 2005). https://doi.org/10.1117/1.1906084
Published: 1 May 2005
