Elastic registration for airborne multispectral line scanners
Feng Li, Chuanrong Li, Lingli Tang, Yi Guo
Abstract
The multispectral line scanner is one of the most popular payloads for aerial remote sensing (RS) applications. Scanners with a large field of view (FOV) improve the efficiency of Earth observation. Small-volume instruments with a short focal length and a large FOV, however, may introduce a problem: different nonlinear warping and local transformations exist between bands. The alignment accuracy of bands is a criterion impacting product quality in RS. A band-to-band elastic image registration method is proposed to solve this problem. Rather than ignoring the intensity variation and carrying out an intensity-based registration between bands directly, we construct feature images and use them to conduct an intensity-based elastic image registration. In this method, the idea of the inverse compositional algorithm is employed and expanded to deal with local warping, and a smoothness constraint is added to the procedure. Experimental results show that the proposed band-to-band registration method works well both visually and quantitatively. The performance of the method also suggests potential applications for other new types of airborne multispectral imagers.

1.

Introduction

Remote sensing (RS) is developing toward high spatial, temporal, and spectral resolution. Multispectral images from line scanners are the main type of data acquired by RS sensors in Earth observation. Unlike space missions, platforms in aerial applications suffer more from effects such as pitch, yaw, and roll. Boresight error consists of yaw, pitch, and roll errors. For an airborne line scanner, i.e., a pushbroom scanner, pitch error makes the whole line of data look forward or backward instead of at nadir. Roll error shifts the whole line of data to the left or the right. Yaw error rotates the whole line of data relative to the flight direction. If the boresight error is constant during the flight, it is easy to correct. In reality, however, the boresight error varies over time because of air turbulence, wind, and platform vibration. These irregular effects cause a problem for airborne multispectral line scanners: the raw RS multispectral bands may not be innately registered. The alignment accuracy of multispectral bands from airborne scanners is a criterion impacting product quality in RS, and many researchers have investigated the band-to-band registration problem.

There are dozens of intensity-based image registration methods, but none of them can be used to solve the multispectral image registration problem in a straightforward manner, because the different spectral characteristics of the ground lead to intensity variation between bands. In particular, visible and infrared images of the same scene can be regarded as totally different images in terms of intensity. There are some algorithms1–3 that solve the band-to-band registration problem for the Enhanced Thematic Mapper Plus and moderate-resolution imaging spectroradiometer (MODIS) sensors. In general, the scale-invariant feature transform (SIFT) algorithm4 gives excellent matching results for visible image pairs. For infrared and visible image pairs, however, the number of mismatched SIFT keypoints increases rapidly, which decreases the SIFT-based registration accuracy drastically; the authors of Ref. 5 therefore proposed a new scale restriction criterion for keypoint matching with better performance. There are also algorithms based on mutual information6 and statistical learning techniques7 that solve the band-to-band rigid/global registration problem. A correlation and Hough transform-based method for automatic image registration was proposed in Ref. 8; its authors claim that the method can globally register a pair of images with different spectral content. Common elastic image registration methods9,10 are based on the intensity of the images. Since the intensity of the same region varies with the spectral characteristics of the ground, those elastic methods often fail. Moreover, the high computational cost of Periaswamy's method9,10 limits its application to RS images. A new and efficient elastic band-to-band registration method is therefore needed.

Recently, a new type of multispectral line scanner developed by the Changchun Institute of Optics in China was built from four off-axis aspheric reflectors in order to achieve a wide field of view (FOV) with a short focal length. The new multispectral scanner achieves an FOV of 61.93 deg and an IFOV of 0.18 mrad with four bands: blue (420 to 520 nm), green (520 to 600 nm), red (630 to 690 nm), and near-infrared (760 to 900 nm). A picture of the multispectral line scanner is shown in Fig. 1. The physical dimensions of the scanner are about 150×220×200 mm. This kind of scanner is vulnerable to different nonlinear warping and inconsistent local transformations between bands, caused by air turbulence, the vibration of the airborne platform, and the systematic design of the multispectral line scanner. As mentioned earlier, common global image registration methods are not suitable for this problem. One example image taken by this scanner is shown in Fig. 2; each band suffers different nonlinear warping, although the flight was carried out under good weather conditions. Based on our survey, most band-to-band registration methods can only solve the misalignment globally, and they fail to register bands with the local warps shown in Fig. 2. This problem is not unique to this particular multispectral line scanner. Another new type of spectrometer, abbreviated as IRIS, has also been proposed;11,12 in the IRIS design, all multispectral bands are projected onto a single conventional detector array, and this kind of spectrometer may suffer the same problem when used for airborne applications.

Fig. 1

Picture of the multispectral line scanner.


Fig. 2

An example of captured bands with the new multispectral line scanner. (a) Blue band, (b) green band, and (c) red band.


In this paper, a new elastic band-to-band registration method is proposed to solve the above problem. The rest of this paper is organized as follows. Feature images are introduced in Sec. 2.1, and the new elastic image registration method is proposed in Sec. 2.2. Section 3 reviews the overall structure and implementation details of the method. In Sec. 4, we demonstrate the high registration accuracy achieved between bands with the new method on real flight data. Finally, conclusions are provided in Sec. 5.

2.

Methodology

2.1.

Feature Images

Nonrigid warps between multispectral bands require elastic registration methods that consider both intensity variation and local warping. Human eyes are far more sensitive to edges than to smooth areas of a scene; this is why humans can easily recognize the same region even though its intensity differs across wavelength ranges. Mimicking this edge sensitivity, we introduce a gradient operator to explore the similarities within multispectral images. We define feature images as follows:

Eq. (1)

$$F_k(x,y)=\sqrt{\left(\frac{\partial I_k(x,y)}{\partial x}\right)^2+\left(\frac{\partial I_k(x,y)}{\partial y}\right)^2},$$
where $I_k(x,y)$ denotes the $k$'th channel of the multispectral image. After normalizing the feature image of each band, the edges of the image are emphasized. The feature images of the bands in Fig. 2 are shown in Fig. 3. Rather than carrying out registration between bands directly, we use the feature images constructed for each band to conduct an intensity-based elastic image registration. Before the registration step, a histogram-matching step is applied to the reference feature image so that its histogram matches that of the target feature image.

Fig. 3

An example of feature images of the bands shown in Fig. 2.

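For concreteness, a minimal Python sketch of the feature-image construction in Eq. (1) follows. The min-max normalization to [0, 1] is our assumption; the authors' own MATLAB implementation (linked in Sec. 3) may differ in detail.

```python
import numpy as np

def feature_image(band):
    """Gradient-magnitude feature image of one band, per Eq. (1),
    normalized to [0, 1] (the exact normalization is assumed)."""
    band = band.astype(np.float64)
    gy, gx = np.gradient(band)          # derivatives along the two axes
    f = np.sqrt(gx ** 2 + gy ** 2)
    return (f - f.min()) / (f.max() - f.min() + 1e-12)
```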

2.2.

Feature-Image-Based Elastic Image Registration

Because the feature images summarize the major transitions of the regions and expose the similarity between bands, we align the corresponding feature images instead of the multispectral images themselves with a new elastic image registration method. We denote the reference and source feature images as $T(x,y)$ and $S(u,v)$, respectively.

In this paper, the idea of the inverse compositional algorithm13,14 is adopted and expanded to deal with both rigid and local warping with high efficiency. To improve efficiency, we warp the reference image toward the source image so that the Hessian matrix can be computed outside of the iterations, which avoids the clumsy step-by-step warping with a Taylor series expansion used in Ref. 9.

2.2.1.

Global affine model

We use a seven-dimensional vector to describe the warp and the brightness variations between the source and reference images. The motion between them can be modeled by an affine transform

Eq. (2)

$$T(x,y)=S(u,v)+m_7=S(m_1x+m_2y+m_5,\; m_3x+m_4y+m_6)+m_7,$$
where $m_1$, $m_2$, $m_3$, and $m_4$ are the linear affine parameters, $m_5$ and $m_6$ are the translation parameters, and $m_7$ denotes the variation of intensity between the two images; $u=m_1x+m_2y+m_5$ and $v=m_3x+m_4y+m_6$.

Equation (2) can also be written as $T(x,y)=S[W(x,y;M)]+m_7$, where

Eq. (3)

$$W(x,y;M)=\begin{bmatrix} m_1 & m_2 & m_5 \\ m_3 & m_4 & m_6 \\ 0 & 0 & 1 \end{bmatrix}\begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$$
expresses the affine warp of the coordinate frame, and $M$ is the affine parameter matrix
$\begin{bmatrix} m_1 & m_2 & m_5 \\ m_3 & m_4 & m_6 \\ 0 & 0 & 1 \end{bmatrix}$.

In general, the process of image registration is to keep the source image aligned with the reference image; therefore, warping the source image toward the reference image step-by-step in each iteration is very common. In contrast, in our method, the reference image is warped toward the source image step-by-step and the inverse of the final warping parameters is applied to the source image to complete the registration.

We set $\tilde{m}=[\tilde{m}_1,\tilde{m}_2,\tilde{m}_3,\tilde{m}_4,\tilde{m}_5,\tilde{m}_6,m_7]^T$ and

$$\Delta M=\begin{bmatrix} \tilde{m}_1 & \tilde{m}_2 & \tilde{m}_5 \\ \tilde{m}_3 & \tilde{m}_4 & \tilde{m}_6 \\ 0 & 0 & 1 \end{bmatrix},$$
where $\Delta M=M^{-1}$. Therefore, by warping the reference image $T(u,v)$ with the parameter set $\Delta M$, the reference image will be aligned with the source image $S(u,v)$. Then, we have

Eq. (4)

$$T[W(u,v;\Delta M)]=T(\tilde{m}_1u+\tilde{m}_2v+\tilde{m}_5,\; \tilde{m}_3u+\tilde{m}_4v+\tilde{m}_6)=S(u,v)+m_7.$$

The cost function is as follows:

Eq. (5)

$$E(\tilde{m})=\sum_{u,v\in\Omega}\left[T(\tilde{m}_1u+\tilde{m}_2v+\tilde{m}_5,\; \tilde{m}_3u+\tilde{m}_4v+\tilde{m}_6)-S(u,v)-m_7\right]^2,$$
where Ω denotes the image region.

The goal is to minimize the above cost function to calculate ΔM, which is the inverse of the affine parameter matrix M. If T[W(u,v;ΔM)] is expanded using a first-order truncated Taylor series, then

Eq. (6)

$$E(\tilde{m})\approx\sum_{u,v\in\Omega}\left[T+(\tilde{m}_1u+\tilde{m}_2v+\tilde{m}_5-u)T_u+(\tilde{m}_3u+\tilde{m}_4v+\tilde{m}_6-v)T_v-S-m_7\right]^2.$$

$T$ and $S$ denote the lexicographically ordered vectors of $T(x,y)$ and $S(x,y)$ of length $N^2$, where $N^2$ denotes the number of pixels in an image. $T_u$ and $T_v$ are the spatial derivatives of $T(u,v)$, also written as lexicographically ordered vectors of length $N^2$. The error function can then be simplified as follows:

Eq. (7)

$$E(\tilde{m})=\|k-c^T\tilde{m}\|_2^2,$$
where $k$ and $c$ are given by

Eq. (8)

$$k=S-T+uT_u+vT_v$$
and

Eq. (9)

$$c=(uT_u,\; vT_u,\; uT_v,\; vT_v,\; T_u,\; T_v,\; e)^T,$$
where $e$ denotes a vector whose components are all 1. The size of $c$ is $7\times N^2$.

The error function in Eq. (7) can then be minimized by differentiating $E(\tilde{m})$ with respect to the unknowns $\tilde{m}$:

Eq. (10)

$$\frac{dE(\tilde{m})}{d\tilde{m}}=-2c\left[k-c^T\tilde{m}\right].$$

Then

Eq. (11)

$$\tilde{m}=[cc^T]^{-1}[ck].$$

Note that $\tilde{m}$ in Eq. (11) is the warping parameter set for $T(x,y)$ to be aligned with $S(u,v)$, so the affine transform matrix $M$ for $S(u,v)$ is obtained as follows:

Eq. (12)

$$M=\Delta M^{-1}.$$

The initial M is a 3×3 identity matrix. Initially, m7 is set to 0 and is updated at each iteration. After several iterations, the source image will be warped by M to match the reference image.

Note that, compared with Periaswamy's method, $[cc^T]$ (the Hessian matrix of the reference image) is a constant matrix: nothing in it depends on the iteratively warped $S$, so it can be precomputed. The only additional step is to evaluate Eq. (12) at each iteration. Compared with the computation of $[cc^T]$, the cost of this step, i.e., $M=\Delta M^{-1}$, is negligible, because $\Delta M$ is only a $3\times3$ matrix. With $N^2$ the number of pixels in the reference image, the saving in each iteration is $O(7^2N^2)$. In this way, the efficiency of Periaswamy's algorithm is improved for the global affine warp.
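As an illustration, a single-level sketch of this inverse-compositional affine step in Python/NumPy follows. It is our reading of Eqs. (8)-(12), not the authors' released code; boundary handling, convergence tests, and the image-axis convention (first axis taken as $u$/$x$) are simplifying assumptions.

```python
import numpy as np
from scipy.ndimage import affine_transform

def global_affine(T, S, n_iter=15):
    """Estimate the 3x3 affine matrix M warping source S onto
    reference T (Eqs. 8-12). T, S: equal-size float images."""
    x, y = np.mgrid[0:T.shape[0], 0:T.shape[1]].astype(np.float64)
    Tu, Tv = np.gradient(T)                    # spatial derivatives of T
    u, v, Tu, Tv = [a.ravel() for a in (x, y, Tu, Tv)]
    c = np.stack([u*Tu, v*Tu, u*Tv, v*Tv, Tu, Tv, np.ones(T.size)])  # Eq. (9)
    H_inv = np.linalg.inv(c @ c.T)             # 7x7 Hessian, precomputed once
    M = np.eye(3)
    for _ in range(n_iter):
        Sw = affine_transform(S, M, order=1)   # S resampled in T's frame
        k = Sw.ravel() - T.ravel() + u*Tu + v*Tv           # Eq. (8)
        m = H_inv @ (c @ k)                                # Eq. (11)
        dM = np.array([[m[0], m[1], m[4]],
                       [m[2], m[3], m[5]],
                       [0.0,  0.0,  1.0]])
        M = M @ np.linalg.inv(dM)              # compose increments, Eq. (12)
    return M
```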

To deal with large distortions between the source and reference images, a coarse-to-fine scheme is adopted. A Gaussian pyramid, as shown in Fig. 4, is built for both images, and the global affine parameters are calculated starting from the coarsest level.

Fig. 4

The registration starts from coarser layers to finer layers in the image pyramid.

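The pyramid itself can be sketched as follows; the number of levels and the Gaussian kernel width are not specified in the paper and are assumed here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaussian_pyramid(img, levels=4):
    """Blur-and-decimate pyramid (Fig. 4), returned coarsest level
    first, since the registration starts at the coarsest layer."""
    pyr = [np.asarray(img, dtype=np.float64)]
    for _ in range(levels - 1):
        pyr.append(gaussian_filter(pyr[-1], sigma=1.0)[::2, ::2])
    return pyr[::-1]
```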

2.2.2.

Local shift model

The two images are coarsely aligned after the global affine transform described above; local warping between local regions now needs to be considered. As in many other methods, the two images are divided into many boxes so that the local warping can be handled individually. The smaller the box size, the better complex nonrigid warping can be approximated by a translation. A three-dimensional vector is therefore adopted to describe the translation and brightness variation between a source image box and the corresponding reference image box.

Let $B_T(x,y)$ and $B_S(u,v)$ denote the reference and source image boxes, respectively. The motion between them can be modeled by a shift transform

Eq. (13)

$$B_T(x,y)=B_S(x+m_5,\; y+m_6)+m_7=B_S(u,v)+m_7,$$
where $u=x+m_5$ and $v=y+m_6$. The above equation can then be rewritten as

Eq. (14)

$$B_T(u-m_5,\; v-m_6)=B_S(u,v)+m_7.$$

We can either shift the source box a particular amount to align it with the reference box, or shift the reference box in the inverse direction by the same amount to keep them aligned. The latter strategy is adopted to improve the efficiency as described in Sec. 2.2.1. Then, the cost function is defined by

Eq. (15)

$$E_{box}(m)=\sum_{u,v\in\Omega}\left[B_T(u-m_5,v-m_6)-B_S(u,v)-m_7\right]^2,$$
where $\Omega$ denotes the box and $m=[m_5,m_6,m_7]^T$.

The goal is to minimize this expression to calculate the translation parameters and the intensity variation. $B_T(u-m_5,v-m_6)$ is expanded using a first-order truncated Taylor series, thus

Eq. (16)

$$E_{box}(m)\approx\|B_T-m_5B_{T_u}-m_6B_{T_v}-B_S-m_7\|_2^2,$$
where $B_{T_u}$ and $B_{T_v}$ are the spatial derivatives of $B_T(u,v)$ in lexicographically ordered vector form, and $B_T$ and $B_S$ denote the lexicographically ordered vectors of $B_T(x,y)$ and $B_S(x,y)$, respectively. The error function can be simplified as follows:

Eq. (17)

$$E_{box}(m)=\|k-c^Tm\|_2^2,$$
where k and c are given by

Eq. (18)

$$k=B_T-B_S$$
and

Eq. (19)

$$c=[B_{T_u},\; B_{T_v},\; e]^T,$$
where $e$ denotes a vector whose components are all 1.

The cost function in Eq. (17) can be minimized by differentiating $E_{box}(m)$ with respect to the unknowns $m$:

Eq. (20)

$$\frac{dE_{box}(m)}{dm}=-\sum_{u,v\in\Omega}2c\left[k-c^Tm\right].$$

Then

Eq. (21)

$$m=[cc^T]^{-1}[ck].$$

As we can see, the Hessian matrix $[cc^T]$ is constant: nothing in it depends on the shifted source box, so its inverse can be precomputed once and stored in memory for use in every iteration. In contrast, the vector $[ck]$ has to be recalculated in each iteration. After a few iterations, the $m$ that minimizes $E_{box}(m)$ in Eq. (17) between the two boxes is obtained.
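A per-box sketch of Eqs. (18)-(21) follows. The iterative re-shifting of the source box and the accumulation of $m$ are our interpretation of this subsection, and bilinear resampling via SciPy is an implementation choice.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def box_shift(BT, BS, n_iter=5):
    """Estimate m = [m5, m6, m7] aligning source box BS with
    reference box BT (Eqs. 18-21). BT, BS: equal-size float boxes."""
    BTu, BTv = np.gradient(BT)
    c = np.stack([BTu.ravel(), BTv.ravel(), np.ones(BT.size)])  # Eq. (19)
    H_inv = np.linalg.inv(c @ c.T)      # 3x3 Hessian, precomputed once
    m = np.zeros(3)
    for _ in range(n_iter):
        # resample BS at (x + m5, y + m6), cf. Eq. (13)
        BSw = nd_shift(BS, (-m[0], -m[1]), order=1, mode='nearest')
        k = BT.ravel() - BSw.ravel() - m[2]     # residual form of Eq. (18)
        m += H_inv @ (c @ k)            # only [ck] changes per iteration
    return m
```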

2.2.3.

Smoothness constraint

Usually, the motion parameters vary smoothly over most parts of the image. We assume that neighboring pixel locations have constant or smoothly varying motion parameters. This provides a very useful constraint for obtaining meaningful local alignment parameters. During every local registration iteration introduced above, a smoothness constraint is applied to the parameters to avoid large differences between neighboring parameter sets and to prevent intensity variation alone from driving the estimated movement.

A smoothness constraint similar to those in Refs. 9 and 15 is set up. The error function at every pixel is described as follows:

Eq. (22)

$$E_{pix}(m)=E_{xy}(m)+E_{sm}(m),$$
where $E_{xy}(m)$ is defined as in Eq. (17), centered at $(x,y)$:

Eq. (23)

$$E_{xy}(m)=[k-c^Tm]^2$$
with $k$, $c$, and $m$ as in Eq. (17). The new term $E_{sm}(m)$ denotes the smoothness constraint, as in Ref. 9:

Eq. (24)

$$E_{sm}(m)=\sum_{e=5}^{7}\zeta_e\left[\left(\frac{\partial m_e}{\partial x}\right)^2+\left(\frac{\partial m_e}{\partial y}\right)^2\right],$$
where $\zeta_e$ is a prespecified positive constant that adjusts the relative weight given to the smoothness constraint on the parameter $m_e$: the larger $\zeta_e$, the more weight is placed on smoothness. By differentiating Eq. (22) with respect to the model parameters and setting the gradient to zero, we minimize the error function in Eq. (22) as follows:

Eq. (25)

$$\frac{dE_{pix}(m)}{dm}=\frac{dE_{xy}(m)}{dm}+\frac{dE_{sm}(m)}{dm}=0.$$

The derivative of Exy(m) is

Eq. (26)

$$\frac{dE_{xy}(m)}{dm}=-2c\left[k-c^Tm\right].$$

To compute the derivative $dE_{sm}(m)/dm$, we follow Refs. 9 and 15 and use the horizontal and vertical first differences of $m_e(x,y)$:

Eq. (27)

$$\frac{\partial m_e(x,y)}{\partial x}=m_e(x,y)-m_e(x+1,y),\qquad \frac{\partial m_e(x,y)}{\partial y}=m_e(x,y)-m_e(x,y+1),$$
where $m_e(x,y)$ is the $e$'th registration parameter at coordinate $(x,y)$.

Then, Eq. (24) can be written as

Eq. (28)

$$E_{sm}[m_e(x,y)]=\zeta_e\left\{[m_e(x,y)-m_e(x+1,y)]^2+[m_e(x,y)-m_e(x,y+1)]^2\right\}.$$

Therefore, $dE_{sm}[m_e(x,y)]/dm_e(x,y)$ can be approximated as

Eq. (29)

$$\frac{dE_{sm}[m_e(x,y)]}{dm_e(x,y)}=2\zeta_e\left\{[m_e(x,y)-m_e(x+1,y)]+[m_e(x,y)-m_e(x,y+1)]\right\}=4\zeta_e\left[m_e(x,y)-\bar{m}_e(x,y)\right],$$
where $\bar{m}_e(x,y)=[m_e(x+1,y)+m_e(x,y+1)]/2$ is the local average of $m_e$ at coordinate $(x,y)$.

Using vector notation, the derivative of $E_{sm}(m)$ can be written with the discrete approximations of Ref. 9 as

Eq. (30)

$$\frac{dE_{sm}(m)}{dm}=2L(m-\bar{m}),$$
where $\bar{m}$ is the component-wise average of $m$ over a small spatial neighborhood and $L$ is a $3\times3$ diagonal matrix with diagonal elements $\zeta_e$ and zeros off the diagonal. $L$ weights the error of the local shift registration relative to the departure from smoothness: $\zeta_e$ should be small if the parameter estimate from $E_{xy}$ is accurate, and large if that estimate is corrupted by intensity variation. Setting $dE_{xy}(m)/dm + dE_{sm}(m)/dm = 0$, we have

Eq. (31)

$$m^{(j+1)}=\left(cc^T+L\right)^{-1}\left(ck+L\bar{m}^{(j)}\right).$$

At each iteration $j$, $\bar{m}^{(j)}$ is estimated from the current $m^{(j)}$. The initial $m^{(0)}$ is estimated from Eq. (21).

In our experimental tests, $\zeta_e=1$, and the size of the local box $\Omega$ depends on the particular images. The smaller the box size, the better elastic warping can be corrected, but the weaker the constraint that is applied: a larger box means more pixels constrain the mean-square-error minimization, so the result shows more rigid characteristics, while a smaller box brings more freedom but also more error, because the matrix $[cc^T]$ may become ill-conditioned or non-invertible. There is a natural tradeoff in choosing the box size.
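A sketch of the smoothness iteration of Eq. (31) is given below. The per-pixel regressors $c$ and residuals $k$ come from the local shift model; a 3×3 window for the neighborhood average is our assumption, and the pixel-by-pixel solve is written for clarity rather than speed.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def smooth_iterate(c, k, m, zeta=1.0, n_iter=10):
    """Eq. (31): m_(j+1) = (cc^T + L)^-1 (ck + L m_bar_(j)).
    c: (3, H, W) per-pixel regressors, k: (H, W) residuals,
    m: (3, H, W) float initial parameters from Eq. (21)."""
    L = zeta * np.eye(3)                       # diagonal weight matrix
    for _ in range(n_iter):
        # component-wise neighborhood average m_bar (3x3 window assumed)
        m_bar = np.stack([uniform_filter(p, size=3) for p in m])
        for ij in np.ndindex(k.shape):         # independent 3x3 solves
            ci = c[(slice(None),) + ij]        # 3-vector at this pixel
            A = np.outer(ci, ci) + L
            b = ci * k[ij] + L @ m_bar[(slice(None),) + ij]
            m[(slice(None),) + ij] = np.linalg.solve(A, b)
    return m
```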

3.

Overview and Implementation Details

Here, the overall structure of the new elastic image registration for multispectral images is reviewed. First, the feature images of the multispectral bands are created. One feature image is selected as the reference and the others as source images. Then, an image pyramid is constructed for each of the two input feature images (source and reference). The registration algorithm starts from the coarsest layer: the global affine image registration is applied to the coarsest source and reference feature images. After a certain number of global iterations, the parameter set $\Delta M$ is calculated; because $\Delta M=M^{-1}$, $M$ is obtained and used to warp the source image toward the reference image at the current layer. The warped source image and the reference image then enter the local shift registration procedure, in which the two images are divided into boxes and the shift parameters are calculated for each coordinate position.

After each local shift registration iteration, a number of smoothness-constraint iterations follow to suppress unexpected errors. Once the local shift registration and smoothing procedures finish, the warping parameters of that layer are interpolated by a factor of 2 for the next finer layer, until the resolution of the original source and reference images is reached. To avoid the artifacts of bilinear interpolation in each warping step of the source image pyramid, an accumulated warp is applied to the original source image layer at each level. For example, if 15 global iterations were adopted, there would be 15 steps warping the source image layer toward the reference image layer at each scale, and bilinear interpolation would be applied 15 times in sequence to the same source image layer, causing severe blur artifacts. Therefore, rather than continuously warping and interpolating the same source image layer, the warping parameter set is accumulated step-by-step, and the overall warp is applied to the original source image layer in a single step at the end for each layer, as sketched below.
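The following sketch illustrates the accumulated-warp idea for the global stage; a list of per-iteration increments $\Delta M$ is assumed as input, and the same principle applies to the local displacement fields.

```python
import numpy as np
from scipy.ndimage import affine_transform

def warp_once(src, increments):
    """Compose all per-iteration affine increments dM (Eq. 12) and
    resample the original source layer a single time, so bilinear
    interpolation blur is not accumulated over the iterations."""
    M = np.eye(3)
    for dM in increments:
        M = M @ np.linalg.inv(dM)      # accumulate, do not resample
    return affine_transform(src, M, order=1)   # one interpolation only
```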

Based on the feature images of the reference and the source, the per-pixel translations $m_5$ and $m_6$ are calculated between the reference feature image and the source feature image. Finally, we warp (i.e., a grid step plus a bilinear interpolation step) the source band toward the reference band with the precalculated translation values $m_5$ and $m_6$. This operation is carried out for each band in turn, which completes the whole multispectral band-to-band registration.
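The final per-band warp can be sketched as below, where m5 and m6 are the per-pixel translation fields estimated on the feature images; treating them as offsets along the two image axes is our assumption.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_band(band, m5, m6):
    """Resample a source band at the grid displaced by (m5, m6):
    the grid step plus bilinear interpolation (order=1)."""
    x, y = np.mgrid[0:band.shape[0], 0:band.shape[1]].astype(np.float64)
    coords = np.stack([x + m5, y + m6])     # where to sample the source
    return map_coordinates(band, coords, order=1, mode='nearest')
```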

To help the readers better understand the whole process of our method, a flow chart is shown in Fig. 5. Moreover, MATLAB code and other details of our algorithm can be found at http://code.google.com/p/b2bregistration.

Fig. 5

Flow chart of our proposed registration method.


4.

Experimental Results

To evaluate the performance of the proposed method, we tested it with real flight data. An unmanned aerial vehicle (UAV) was retrofitted to carry multiple payloads: the new wide-view multispectral line scanner introduced above, a hyperspectral camera, and a panchromatic array camera. The flight was conducted at a testing site (for calibration and validation) near Baotou, China, on September 3, 2011. The total flight time was over 10 h at three different flight altitudes. One of the test data sets acquired by the new multispectral line scanner, without geometric correction, is shown in Fig. 2. The true color image in Fig. 6(a) shows the misalignment between bands. Setting the red band [Fig. 2(c)] as the reference, we register the blue band [Fig. 2(a)] and the green band [Fig. 2(b)] to it. The true color image after the elastic registration is shown in Fig. 6(b); it shows no misalignment between bands. Note that the true color image still suffers from some geometric distortion, because geometric distortion exists in the red band, which is the reference.

Fig. 6

The true color images before and after the elastic registration with the test data shown in Fig. 2. (a) Before registration and (b) after registration.


To decrease the geometric distortion in the reference image, we took images captured by the array camera in another flight track as the reference, and a geometric correction step was applied. In our flight test, the payloads were mounted on an airborne photoelectric stabilized platform. Given the position and orientation system (POS) parameters and the compensation parameters from the stabilized platform, a geometric correction step can be applied to the raw image data to decrease the geometric error caused by the UAV attitude. Images captured by the panchromatic array camera generally suffer fewer local warps from platform vibration than those from line scanner cameras, because each line of scanner data may be exposed under different POS and stabilized-platform parameters. A further test image captured by the array camera on the same day but in another track of the flight is shown in Fig. 7(a), with a size of 1024×1024 pixels. An enlarged region of (a) is shown in Fig. 7(b). Images of the same region of the multispectral data after geometric correction are shown in Figs. 7(c)–7(f) for blue, green, red, and near-infrared, respectively. From Figs. 7(c)–7(f), we can see that some nonlinear local warps remain between bands, although they are much smaller than those in the data shown in Fig. 2. Setting the image from the array camera as the reference band, the same region of the registered blue, green, red, and near-infrared images is shown in Figs. 7(g)–7(j), respectively. For visual comparison, the true color images of the same region before and after the elastic registration procedure are shown in Figs. 7(k) and 7(l), respectively. The true color image of Fig. 7(l) contains less misalignment between the registered bands and fewer geometric distortions. In this experiment, Ω=40×40 was selected as the box size for solving the local warps.

Fig. 7

An example by registering an image captured by an array camera onboard. (a) Whole image from array camera, (b) subregion of (a), (c) blue band, (d) green band, (e) red band, (f) near-infrared band, (g) registered blue band, (h) registered green band, (i) registered red band, (j) registered near-infrared band, (k) true color image (before), and (l) after registration.


To demonstrate the performance of our band-to-band registration method quantitatively, we evaluated the misalignment between the registered bands and the panchromatic image before and after registration. We randomly selected 11 testing points on edges or corners across the whole images so that the points are identifiable in all bands; points in smooth regions cannot be uniquely identified either by eye or by the evaluation method. After selecting the testing points, a paraboloid surface fit method16 was adopted to evaluate the registration accuracy at the subpixel level. The mean horizontal and vertical misalignments of the 11 points between each band and the reference (the panchromatic image captured by the onboard array camera) are shown in Table 1. Both the visual and the numerical comparisons show that our elastic band-to-band registration method works well for solving the nonlinear warping and local translations between bands.

Table 1

Numerical comparison results before and after the registration.

Bands                 Horizontal shift average (pixels)    Vertical shift average (pixels)
                      Before         After                 Before         After
Blue band             4.79           0.20                  0.71           0.09
Green band            3.55           0.11                  1.02           0.08
Red band              3.32           0.09                  0.66           0.07
Near-infrared band    4.03           0.21                  0.59           0.25
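For reference, a minimal sketch of paraboloid-style subpixel peak localization in the spirit of Ref. 16 is shown below: separable one-dimensional parabola fits through the discrete peak of a similarity surface C. Assumptions: the peak does not lie on the image border, and this is our simplification, not the authors' exact procedure.

```python
import numpy as np

def subpixel_peak(C):
    """Subpixel location of the maximum of a similarity surface C
    via quadratic fits to the 3-point neighborhoods of its peak."""
    i, j = np.unravel_index(np.argmax(C), C.shape)
    di = 0.5 * (C[i-1, j] - C[i+1, j]) / (C[i-1, j] - 2*C[i, j] + C[i+1, j])
    dj = 0.5 * (C[i, j-1] - C[i, j+1]) / (C[i, j-1] - 2*C[i, j] + C[i, j+1])
    return i + di, j + dj       # row and column of the refined peak
```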

5.

Conclusions

In this paper, a new elastic band-to-band registration method is proposed to solve local warps between bands. Gradient-based feature images are constructed as intermediate products to assist the registration. Instead of registering the multispectral bands directly, we apply a new intensity-based elastic registration method to the feature images to calculate the corresponding warping parameters. Experimental results on real data show that the proposed method corrects local warps in band-to-band registration with satisfactory performance. With a growing number of real applications of new multispectral line scanners with large FOV and short focal length, this band-to-band registration method should attract further attention.

Acknowledgments

Feng Li acknowledges the support of NSFC under Grant 41371415. Yi Guo is supported by the CSIRO Minerals Down Under Research Flagship and Capability Development Fund.

References

1. 

J. Barker and J. Seiferth, "Landsat thematic mapper band-to-band registration," in Int. Geoscience and Remote Sensing Symp. (IGARSS'96): Remote Sensing for a Sustainable Future, 1600–1602 (1996).

2. 

K. Yang et al., "MODIS band-to-band registration," in Proc. IEEE Int. Geoscience and Remote Sensing Symp. (IGARSS 2000), 887–889 (2000).

3. 

R. Eastman, J. Le Moigne and N. Netanyahu, "Research issues in image registration for remote sensing," in IEEE Conf. on Computer Vision and Pattern Recognition (CVPR'07), 1–8 (2007).

4. 

D. Lowe, "Distinctive image features from scale-invariant keypoints," Int. J. Comput. Vision 60(2), 91–110 (2004). http://dx.doi.org/10.1023/B:VISI.0000029664.99615.94

5. 

Z. Yi, C. Zhiguo and X. Yang, "Multi-spectral remote image registration based on SIFT," Electron. Lett. 44(2), 107–108 (2008). http://dx.doi.org/10.1049/el:20082477

6. 

J. Kern and M. Pattichis, "Robust multispectral image registration using mutual-information models," IEEE Trans. Geosci. Remote Sens. 45(5), 1494–1505 (2007). http://dx.doi.org/10.1109/TGRS.2007.892599

7. 

T. Kim and M. Choi, "Fast multi-spectral image registration based on a statistical learning technique," Proc. SPIE 7810, 78100C (2010). http://dx.doi.org/10.1117/12.860437

8. 

H. Gonçalves et al., "CHAIR: automatic image registration based on correlation and Hough transform," Int. J. Remote Sens. 33(24), 7936–7968 (2012). http://dx.doi.org/10.1080/01431161.2012.701345

9. 

S. Periaswamy and H. Farid, "Elastic registration in the presence of intensity variations," IEEE Trans. Med. Imaging 22(7), 865–874 (2003). http://dx.doi.org/10.1109/TMI.2003.815069

10. 

S. Periaswamy and H. Farid, "Elastic registration with partial data," in Biomedical Image Registration, 102–111, Springer, Berlin, Heidelberg (2003).

11. 

A. Harvey et al., "Spectral imaging in a snapshot," Proc. SPIE 5694, 110–119 (2005). http://dx.doi.org/10.1117/12.604609

12. 

A. Gorman, D. Fletcher-Holmes and A. Harvey, "Generalization of the Lyot filter and its application to snapshot spectral imaging," Opt. Express 18(6), 5602–5608 (2010). http://dx.doi.org/10.1364/OE.18.005602

13. 

S. Baker and I. Matthews, "Lucas-Kanade 20 years on: a unifying framework," Int. J. Comput. Vision 56(3), 221–255 (2004). http://dx.doi.org/10.1023/B:VISI.0000011205.11775.fd

14. 

G. Ye et al., "Efficient multi-image registration with illumination and lens distortion correction," in IEEE Int. Conf. on Image Processing (ICIP 2005), III-1108–III-1111 (2005).

15. 

B. Horn, Robot Vision, MIT Press, Cambridge, MA (1986).

16. 

S. S. Gleason, M. A. Hunt and W. B. Jatko, "Subpixel measurement of image features based on paraboloid surface fit," Proc. SPIE 1386, 135–144 (1991). http://dx.doi.org/10.1117/12.25387

Biography

Feng Li received his PhD degree from the University of New South Wales in 2009. He joined CSIRO in the same year and developed algorithms for the applications of compressive sensing to radio astronomy. Currently, he is working at the Academy of Opto-electronics, Chinese Academy of Sciences. His research interests are in general image processing, image restoration, and compressive sensing.

ChuanRong Li is a professor at the Academy of Opto-electronics, Chinese Academy of Sciences. His research is mainly on remote sensing data processing, geometric correction, image processing, and remote sensing data calibration and validation.

LingLi Tang is a professor at the Academy of Opto-electronics, Chinese Academy of Sciences. Her research is mainly on remote sensing data preprocessing, SAR data processing, image processing, and remote sensing data calibration and validation.

Yi Guo obtained a BE degree in electrical engineering from North China University of Technology in 1998 and a PhD degree from the University of New England in 2008. He has been a research scientist with the Commonwealth Scientific and Industrial Research Organisation (CSIRO) since 2008. His research focuses on machine learning and computational statistics for big data.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Feng Li, Chuanrong Li, Lingli Tang, and Yi Guo "Elastic registration for airborne multispectral line scanners," Journal of Applied Remote Sensing 8(1), 083614 (13 June 2014). https://doi.org/10.1117/1.JRS.8.083614
Published: 13 June 2014