Texture features based on local Fourier histogram: self-compensation against rotation
Ahsan Ahmad Ursani, Kidiyo Kpalma, and Joseph Ronsin
Journal of Electronic Imaging 17(3), 030503 (1 July 2008). Open Access.
Abstract
We present a method of introducing rotation invariance in texture features based on a local Fourier histogram (LFH) computed using a 1-D discrete Fourier transform (DFT). To compensate for image rotation, a local image-gradient angle at each image pixel is obtained from one of the 1-D DFT coefficients themselves. The rotation invariance is established both analytically and empirically. The rotation-compensated features extracted from the same texture image oriented at different angles exhibit very high cross correlation. Therefore, the proposed texture features are expected to yield very high accuracies for a variety of image data and applications. The improved LFH-based features outperform the earlier version of the features and features based on Gabor filters in texture recognition on 8560 images from the Brodatz album.

1. Introduction

Texture features play an important role in several image-processing applications, ranging from computer vision and medical image processing to remote sensing and content-based image retrieval. Almost all texture-processing applications require rotation invariance in the texture features, which we achieve here in a very simple and cost-effective manner. Reference 1 categorizes the wide range of texture features proposed to date into two broad categories and compares them: features that use a large bank of filters or wavelets, and features that use immediate pixel-neighborhood properties. It shows that the latter outperform the former. Hence, we set out to improve a feature set from the latter category. Reference 2 extracts texture features using a 1-D discrete Fourier transform (DFT) of the circular neighborhood around a pixel: it computes a 1-D DFT of the 8-pixel sequence around each image pixel and uses the magnitudes of the DFT coefficients to extract texture features. More recent work3 extracts similar texture features from the square neighborhood and calls the result a local Fourier histogram (LFH)-based feature set. The LFH-based feature set was shown to perform better than texture features extracted from a large bank of Gabor filters,4 which are computationally more expensive than the LFH-based features. In this work, we augment the LFH-based feature set by also using the phases of the DFT coefficients as texture features. The improvement suggested herein applies equally to the texture features extracted from the circular neighborhood.2 Since the phases are sensitive to image rotation, we also present a method to make them rotation invariant. This incurs no additional computational cost, yet improves performance.

The following sections explain how the LFH-based features are extracted, how the local image gradient angle is determined from the features themselves, and how the image gradient angle is used to compensate the features against rotation. Results are presented before concluding the paper.

2. Method of Extracting DFT-Based Texture Features

The texture features proposed in Ref. 3 are extracted in the spatial domain by taking a 1-D DFT of the 8-pixel sequence x0 through x7, hereafter called x, around a central pixel, as shown in Fig. 1. We use the local image gradient at the central pixel to compensate the extracted features for the effects of image rotation.

Fig. 1

9-pixel neighborhood in the spatial domain.


When moving a 3×3-pixel window across a texture image, the 1-D DFT of x is computed as

Eq. 1

X_k=\sum_{n=0}^{7}x_n\exp\left(-\frac{\pi i}{4}kn\right),
where 0 ≤ k ≤ 7, Xk represents the k'th Fourier coefficient, and xn represents the n'th value in x. From the computed DFT, histograms of the absolute values of the first five DFT coefficients, i.e., X0 through X4, were used for texture description in Ref. 3.
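As a concrete illustration (not the authors' code), the NumPy sketch below evaluates Eq. 1 at every interior pixel and accumulates normalized histograms of |X0| through |X4|; note that np.fft.fft uses the same sign convention as Eq. 1. The neighbor ordering (counterclockwise, starting east) and the bin count are illustrative assumptions.

```python
import numpy as np

def lfh_magnitude_features(img, n_bins=16):
    """Normalized histograms of |X_0|..|X_4| from the 8-neighbor DFT of Eq. 1.

    The neighbor ordering (counterclockwise, starting east) and the number of
    bins are illustrative assumptions, not specifications from the paper.
    """
    img = np.asarray(img, dtype=np.float64)
    # 8 circularly ordered neighbors of every interior pixel (Fig. 1).
    x = np.stack([
        img[1:-1, 2:],   # x0: east
        img[:-2, 2:],    # x1: north-east
        img[:-2, 1:-1],  # x2: north
        img[:-2, :-2],   # x3: north-west
        img[1:-1, :-2],  # x4: west
        img[2:, :-2],    # x5: south-west
        img[2:, 1:-1],   # x6: south
        img[2:, 2:],     # x7: south-east
    ], axis=-1)
    X = np.fft.fft(x, axis=-1)           # eight-point 1-D DFT at each pixel
    feats = []
    for k in range(5):                   # |X_0| through |X_4|
        mag = np.abs(X[..., k]).ravel()
        hist, _ = np.histogram(mag, bins=n_bins,
                               range=(0.0, float(mag.max()) + 1e-9))
        feats.append(hist / max(hist.sum(), 1))
    return np.concatenate(feats)
```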

The phases of the DFT coefficients X1 through X3 were also proposed as features in Ref. 3, but only for applications that do not deal with image rotation. The phase features were otherwise excluded because, unlike the magnitudes, the phases of the DFT coefficients are sensitive to image rotation. Reference 2 likewise proposes only the magnitudes of the DFT coefficients as texture features. We propose using the histograms of the phases of X2 and X3 after appropriately compensating them with the local image-gradient angle.

2.1. Local Image Gradient

Traditionally, as a good compromise between cost and accuracy, 3×3-pixel edge-detection operators such as the Sobel operator (SO) and the Prewitt operator (PO) are often used to estimate the local image gradient at a given pixel. Below are the general 3×3 edge-detection operators, in which the value of b varies from 1, as in the PO, to 2, as in the SO:

Eq. 2

S_X=\begin{bmatrix}-1 & 0 & 1\\ -b & 0 & b\\ -1 & 0 & 1\end{bmatrix},\qquad S_Y=\begin{bmatrix}-1 & -b & -1\\ 0 & 0 & 0\\ 1 & b & 1\end{bmatrix},
where SX and SY are convolved with a texture image to obtain two gradient images, GX and GY, respectively. The local image-gradient angle δ is calculated as

Eq. 3

\delta=\tan^{-1}\!\left(\frac{G_Y}{G_X}\right).
Convolving the edge-detection operators of Eq. 2 with the 3×3-pixel neighborhood of Fig. 1 gives GY and GX, which are substituted into Eq. 3, giving

Eq. 4

\tan\delta=\frac{-x_1-b\,x_2-x_3+x_5+b\,x_6+x_7}{b\,x_0+x_1-x_3-b\,x_4-x_5+x_7}.
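A minimal sketch of Eqs. 2-4, assuming the image y axis points downward (the usual image convention); the gradient images are obtained by summing shifted copies of the image rather than by calling a convolution routine:

```python
import numpy as np

def gradient_angle(img, b=np.sqrt(2)):
    """Local image-gradient angle from the generalized 3x3 operators of Eq. 2.

    b = 1 gives the Prewitt operator, b = 2 the Sobel operator.  The row/column
    sign conventions (image y axis pointing down) are assumptions.
    """
    img = np.asarray(img, dtype=np.float64)
    # G_X: right column minus left column, weighted (1, b, 1).
    gx = (img[:-2, 2:] - img[:-2, :-2]
          + b * (img[1:-1, 2:] - img[1:-1, :-2])
          + img[2:, 2:] - img[2:, :-2])
    # G_Y: bottom row minus top row, weighted (1, b, 1).
    gy = (img[2:, :-2] - img[:-2, :-2]
          + b * (img[2:, 1:-1] - img[:-2, 1:-1])
          + img[2:, 2:] - img[:-2, 2:])
    return np.arctan2(gy, gx)            # Eq. 3, one angle per interior pixel
```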
However, the local image-gradient angle can also be obtained from the phase of the first DFT coefficient X1 of x. Substituting k=1 in Eq. 1 gives

Eq. 5

\tan\angle X_1=\frac{-x_1-\sqrt{2}\,x_2-x_3+x_5+\sqrt{2}\,x_6+x_7}{\sqrt{2}\,x_0+x_1-x_3-\sqrt{2}\,x_4-x_5+x_7}.

Equations 4 and 5 are exactly the same if b = √2 and very similar otherwise, because √2 falls between the usual values of 1 and 2. For instance, the histograms of the local image-gradient angle obtained from ∠X1 and from the SO (b = 2) for image D87 of the Brodatz album (BA) have a cross-correlation coefficient (XCC) of 0.97. In addition, if we consider the ∠X1 image as a noisy version of the SO-derived image, the signal-to-noise ratio (SNR) is 69 dB, verifying that the former is a very close approximation of the latter. All other images of the album were tested, and similar correlation coefficients and SNR values were found between the two estimates of the image gradient. Hence, instead of computing the local image-gradient angle with a 2-D edge-detection operator, we use ∠X1 to compensate the phases of the two other DFT coefficients, X2 and X3, against the effects of image rotation. It can now be said that δ = ∠X1.
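The equivalence of Eqs. 4 and 5 for b = √2 can be checked numerically on an arbitrary 3×3 patch; the neighbor ordering is the same assumption as in the earlier sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
patch = rng.uniform(0, 255, (3, 3))      # arbitrary 3x3 neighborhood

# Neighbor sequence x0..x7, counterclockwise from east (assumed ordering).
x = np.array([patch[1, 2], patch[0, 2], patch[0, 1], patch[0, 0],
              patch[1, 0], patch[2, 0], patch[2, 1], patch[2, 2]])

b = np.sqrt(2)                           # Eq. 4 with b = sqrt(2) ...
gy = -x[1] - b * x[2] - x[3] + x[5] + b * x[6] + x[7]
gx = b * x[0] + x[1] - x[3] - b * x[4] - x[5] + x[7]

X1 = np.fft.fft(x)[1]                    # ... equals the phase of X_1 (Eq. 5)
assert np.isclose(np.arctan2(gy, gx), np.angle(X1))
```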

2.2. Effects of Image Rotation on Fourier Coefficients

Consider an image rotated by an arbitrary angle, with the center of rotation exactly in the middle of the image. The angle of rotation at any other point Pxy on the image would be different from what it is at the center of rotation. Let the angle of rotation at point P00 (see Fig. 1) be ψ, corresponding to a shift of the string x by m places (ψ = m × 45 deg). This shift of x changes nothing but the phases of the resulting DFT coefficients. Equation 6 states the shift property of the DFT:

Eq. 6

F[(x_{n+m})]_k=F[(x_n)]_k\exp\left(\frac{\pi i}{4}km\right),
where F[(xn)]k represents the k'th coefficient of the DFT of (xn), and F[(xn+m)]k represents the k'th coefficient of the DFT of the string (xn+m), i.e., the string (xn) shifted by m places. Equation 6 shows that any displacement in the time or space domain causes a phase shift given by

Eq. 7

\Delta\theta_k=\frac{\pi}{4}km
in the Fourier domain, where Δθk represents the phase shift in Xk. The phase shift in X1 is given by

Eq. 8

\Delta\theta_1=\frac{\pi}{4}m=\psi.
Intuitively, the change in the local image-gradient angle δ equals the angle of rotation ψ at point P00, which causes an equal change in ∠X1. Comparing Eqs. 7 and 8 gives the phase shift in Xk as

Eq. 9

\Delta\theta_k=k\,\Delta\theta_1.
Therefore, the phases ∠X2 and ∠X3 are compensated against rotation by subtracting multiples of the local image-gradient angle δ, as in Eq. 10. For k ∈ {2, 3},

Eq. 10

\phi_k=\angle X_k-k\,\angle X_1,
where ϕk represents the rotation-compensated phase of Xk, and ∠X1 replaces δ.
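The self-compensation can be verified numerically on any 8-sample sequence: under a cyclic shift, the phase of Xk moves by k times the phase shift of X1 (Eqs. 6-9), so the differences of Eq. 10 do not move at all. The following is a sketch under the conventions adopted above:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 255, 8)               # 8-neighbor sequence at some pixel
X = np.fft.fft(x)
phi = np.angle(X[2:4]) - np.array([2, 3]) * np.angle(X[1])   # Eq. 10

for m in range(8):                       # shift by m places = m * 45 deg
    Xs = np.fft.fft(np.roll(x, -m))      # np.roll(x, -m)[n] == x[(n + m) % 8]
    # Shift property (Eqs. 6-7): X_k picks up a phase of (pi/4) * k * m.
    assert np.allclose(Xs, X * np.exp(1j * np.pi / 4 * np.arange(8) * m))
    # Compensated phases (Eq. 10) are unchanged; compare on the unit circle
    # to ignore multiples of 2*pi.
    phis = np.angle(Xs[2:4]) - np.array([2, 3]) * np.angle(Xs[1])
    assert np.allclose(np.exp(1j * phi), np.exp(1j * phis))
```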

3. Experimental Results

3.1. Rotation Invariance of the Phase Features

All the images from the BA were rotated by 30, 45, 60, and 90 deg, and histograms of ϕ2 and ϕ3 were computed at each orientation. Table 1 shows the XCC, used as a similarity measure, between the histograms at 0 deg and those at 30, 45, 60, and 90 deg, averaged over all the images from the BA. As an example, Figs. 2 and 3 show the histograms of ϕ2 and ϕ3, respectively, for image D87 from the BA. All the histograms appear the same and exhibit no left or right shift, indicating that the two phases are highly rotation invariant. We also experimented with the features extracted from the circular neighborhood suggested in Ref. 2 and found that they perform worse than those extracted from the square neighborhood.
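In outline, this experiment can be reproduced with the sketches above by building the compensated-phase histograms of Eq. 10 for each image and correlating them; the bin count and the Pearson form of the cross-correlation coefficient are assumptions, not specifications from the paper.

```python
import numpy as np

def phase_histograms(img, n_bins=36):
    """Normalized histograms of the compensated phases phi_2 and phi_3 (Eq. 10)."""
    img = np.asarray(img, dtype=np.float64)
    x = np.stack([img[1:-1, 2:], img[:-2, 2:], img[:-2, 1:-1], img[:-2, :-2],
                  img[1:-1, :-2], img[2:, :-2], img[2:, 1:-1], img[2:, 2:]],
                 axis=-1)                 # same neighbor ordering as before
    X = np.fft.fft(x, axis=-1)
    hists = []
    for k in (2, 3):
        # phi_k wrapped into (-pi, pi] by working on the unit circle.
        phi = np.angle(X[..., k] * np.exp(-1j * k * np.angle(X[..., 1])))
        h, _ = np.histogram(phi, bins=n_bins, range=(-np.pi, np.pi))
        hists.append(h / max(h.sum(), 1))
    return hists

def xcc(h1, h2):
    """Cross-correlation coefficient between two histograms (Pearson form)."""
    a, b = h1 - h1.mean(), h2 - h2.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
```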

Fig. 2

Histograms of ϕ2 for image D87 at four different orientations: θ = 0, 30, 45, and 60 deg.


Fig. 3

Histograms of ϕ3 for image D87 at four different orientations: θ = 0, 30, 45, and 60 deg.


Table 1

XCC between the histograms of ϕ2 and ϕ3 corresponding to images oriented at 0 deg and those oriented at 30, 45, 60, and 90 deg, averaged over all the images from the Brodatz album.

Cross-correlation coefficient
Orientation (deg)      30       45       60       90
Phase feature ϕ2     0.979    0.988    0.984    0.987
Phase feature ϕ3     0.918    0.907    0.923    0.997

3.2. Texture Recognition

Each of the 107 texture images from the BA was oriented at 0, 30, 45, 60, and 90 deg. Then, 16 subimages measuring 128×128 pixels were cropped from each of the 107×5 images, giving a total of 8560 images.4 Recognition was performed on this set using the LFH-based feature set without phase features, with phase features, and using texture features based on 30 Gabor filters.4,5 Reference 6 is a more recent work that proposes exactly the same filters but with a new distance metric that cannot be used for rotation-invariant recognition or retrieval. Table 2 presents the overall and orientation-wise texture recognition results, showing that the LFH-based features with phases perform best in terms of both accuracy and rotation variance (RV).4
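The letter does not spell out the classifier, so the sketch below assumes a simple nearest-neighbor rule with an L1 distance between concatenated histogram features; it is meant only to indicate how the feature vectors would be compared.

```python
import numpy as np

def classify_nearest(query_feat, gallery_feats, gallery_labels):
    """Label a query by its nearest gallery feature vector (L1 distance).

    The classifier and the distance are illustrative assumptions; any other
    histogram distance could be substituted.
    """
    d = np.abs(np.asarray(gallery_feats) - np.asarray(query_feat)).sum(axis=1)
    return gallery_labels[int(np.argmin(d))]
```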

Table 2

Recognition rates (% accuracy) versus orientation for the 8560 Brodatz images; RV = rotation variance.

                    Orientation (deg)
Feature set           0      30     45     60     90    Avg.    RV
LFH without ϕ        71.6   68.5   68.5   68.3   72.4   69.9   2.82
LFH with ϕ           76.2   73.3   74.9   74.2   74.2   74.6   1.52
Gabor                68.8   65.7   62.8   64.4   69.4   66.2   4.27

4. Effect of Noise

Reference 4 found that the LFH-based texture features exhibit less noise immunity than the features based on Gabor filters. However, our latest results show that the LFH-based features perform even better when extracted from images quantized to only 32 gray levels. Considering this, we expect the proposed features to be more noise resistant than they were without image quantization, as in Ref. 4.
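A uniform requantization to 32 gray levels, assumed here as the simplest reading of the quantization step, would precede feature extraction as follows:

```python
import numpy as np

def quantize_to_32_levels(img):
    """Map an 8-bit image onto 32 uniformly spaced gray levels (assumed scheme)."""
    return (np.asarray(img, dtype=np.uint8) // 8).astype(np.float64)
```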

5. Conclusion

The earlier LFH-based feature set does not use the phases of the DFT coefficients as texture features, because the phases are sensitive to image orientation. To introduce rotation invariance, we showed that the extraction of phase features can be guided by the local image gradient. This is achieved by simply subtracting multiples of the local image-gradient angle, itself obtained from the 1-D DFT, so that the features become self-compensating. This computationally simple and inexpensive method proved useful in making the LFH-based texture features robust against image rotation. The new feature set, including the phase features, exhibits greater rotation invariance and yields higher recognition rates than the one without phase features.

References

1. M. Varma and A. Zisserman, "Texture classification: are filter banks necessary?," 691–698 (2003).

2. H. Arof and F. Deravi, "Circular neighborhood and 1-D DFT features for texture classification and segmentation," IEE Proc. Vision Image Signal Process. 145, 167–172 (1998). https://doi.org/10.1049/ip-vis:19981915

3. F. Zhou, J.-F. Feng, and Q.-Y. Shi, "Texture feature based on local Fourier transform," 610–613 (2001).

4. A. A. Ursani, K. Kpalma, and J. Ronsin, "Texture features based on Fourier transform and Gabor filters: an empirical comparison," 67–72 (2007).

5. B. S. Manjunath and W. Y. Ma, "Texture features for browsing and retrieval of image data," IEEE Trans. Pattern Anal. Mach. Intell. 18(8), 837–842 (1996). https://doi.org/10.1109/34.531803

6. P. Wu, B. S. Manjunath, S. Newsam, and H. D. Shin, "A texture descriptor for browsing and similarity retrieval," Signal Process. Image Commun. 16, 33–43 (2000). https://doi.org/10.1016/S0923-5965(00)00016-3
© 2008 Society of Photo-Optical Instrumentation Engineers (SPIE). https://doi.org/10.1117/1.2965439