Robust global motion estimation method for aerial imagery
1 September 2005
Abstract
A robust global motion model estimation method is proposed by fitting the optical flow field for aerial imagery. Considering the outlier-sensitive defect of traditional least-squares regression technique, we put forward a simple optical flow valuation mechanism to choose a small set of reliable flows for fitting. Since flow outliers that are unfit for fitting are almost completely removed, the final global motion estimation result is highly improved. Experimental results show the robustness and applicability of our method.
Xu and An: Robust global motion estimation method for aerial imagery

1. Introduction

In aerial imagery, estimating the global motion, i.e., the apparent motion induced by the airborne moving observer, is a key part of many aerial surveillance applications,1 such as image stabilization, motion detection, mosaicking, and coding. Generally, the global motion of aerial imagery can be represented by a 2-D parametric model, such as an affine or planar model, and can be estimated directly by a parametric optimization process.2 Furthermore, block-based methods, feature-based algorithms, and frequency-domain methods have also been widely used for global motion estimation.3, 4 However, many of these methods are either time-consuming or not robust to uncertainties or outliers. RANSAC has been an ideal solution for eliminating outliers, but the random nature of the algorithm makes its direct use inefficient.5

Since global motion is usually dominant in aerial imagery as compared to small independent or local motions and other distracting ones, it is natural to obtain the global motion model indirectly by fitting the optical flow field of the aerial imagery. However, because the standard linear regression technique used in fitting is sensitive to outliers, either uncertainties6 or independent motions7 in the optical flow field can ruin the final fitting result. Therefore, instead of fitting the whole optical flow field, we prefer to choose a small set of reliable flow components for fitting so that the negative effect of flow outliers is greatly reduced.

2. Optical Flow Estimation and Motion Model Fitting

Our optical flow computation is based on two well-known assumptions: brightness constancy and the flow smoothness constraint. To save computation, we run our algorithm under a coarse-to-fine hierarchical framework as proposed in Ref. 2. The multiscale implementation allows for both computational efficiency and estimation of large motions. In addition, if the algorithm stops at a middle layer of the image pyramid, e.g., layer n/2+1 of an n-layered pyramid, distracting motions and noise are also filtered to some extent, which benefits global motion model fitting.
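As an illustration, a minimal numpy sketch of the pyramid construction might look like the following; the 2×2 box filter stands in for proper Gaussian smoothing, and the function name is ours, not the letter's:

```python
import numpy as np

def gauss_pyramid(img, n_layers):
    """Build a simple n-layer pyramid by repeated 2x downsampling.
    (A 2x2 box filter stands in for a Gaussian kernel here.)"""
    layers = [img.astype(float)]
    for _ in range(n_layers - 1):
        p = layers[-1]
        h, w = (p.shape[0] // 2) * 2, (p.shape[1] // 2) * 2
        p = p[:h, :w]
        layers.append(0.25 * (p[0::2, 0::2] + p[1::2, 0::2]
                              + p[0::2, 1::2] + p[1::2, 1::2]))
    return layers  # layers[0] is full resolution, layers[-1] the coarsest

# Stopping the coarse-to-fine flow computation one layer short of full
# resolution (half resolution, as in both experiments reported below)
# smooths out small distracting motions and noise before model fitting.
pyr = gauss_pyramid(np.zeros((256, 256)), 5)
half_res = pyr[1]  # 128 x 128
```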

Now we discuss how to use the estimated optical flow field to fit a global motion model. Assume that the motion model has the form:

u(x, y) = Φ a_u,   v(x, y) = Φ a_v,   (1)

where (u, v) is the motion vector. For an affine motion model, we have Φ = (x, y, 1), a_u^T = (a_1, a_2, a_3), and a_v^T = (a_4, a_5, a_6). Using the least-squares regression technique, the motion model parameters can be derived as follows:

(a_u, a_v) = (Φ^T Φ)^(−1) Φ^T (u, v),   (x, y) ∈ D_I,   (2)

where D_I denotes the whole image plane.
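The least-squares fit of Eq. 2 can be sketched in a few lines of numpy; the function name and the synthetic flow field below are our own illustration, not the authors' code:

```python
import numpy as np

def fit_affine(u, v):
    """Least-squares affine fit of a dense flow field (Eq. 2).
    u, v: HxW arrays holding the two flow components."""
    h, w = u.shape
    y, x = np.mgrid[0:h, 0:w]
    Phi = np.stack([x.ravel(), y.ravel(), np.ones(h * w)], axis=1)  # (x, y, 1)
    a_u, *_ = np.linalg.lstsq(Phi, u.ravel(), rcond=None)
    a_v, *_ = np.linalg.lstsq(Phi, v.ravel(), rcond=None)
    return a_u, a_v  # (a_1, a_2, a_3), (a_4, a_5, a_6)

# Synthetic affine flow: u = 0.1x - 0.2y + 3, v = 0.05x + 0.1y - 1
y, x = np.mgrid[0:64, 0:64]
u = 0.1 * x - 0.2 * y + 3.0
v = 0.05 * x + 0.1 * y - 1.0
a_u, a_v = fit_affine(u, v)
```

On noise-free affine input the fit recovers the generating parameters exactly; the next section addresses what happens when outliers contaminate the field.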

While this approach provides a simple mechanism for global motion estimation, it is unfortunately of limited use because it is sensitive to outliers, which correspond to either uncertainties or independent motions of the optical flow field. Therefore, to get a more accurate motion model, outliers have to be removed, which leads to the following refined method.

3. Optical Flow Valuation and Outlier Removal

To evaluate optical flows and remove flow outliers unfit for motion model fitting, we first divide the optical flow field into an array of non-overlapping regions and derive a set of motion hypotheses by fitting each region separately. Suppose a_ui^T, a_vi^T constitute the i'th hypothesis; then for each region we have:

(a_ui, a_vi) = (Φ^T Φ)^(−1) Φ^T (u_i, v_i),   (x, y) ∈ R_i,   (3)

where each region is indexed by the variable i and the summation is applied within each region R_i. Obviously, many of the hypotheses will be incorrect because of optical flow estimation inaccuracies, independent motions, and other non-interesting motions. The reliability of a hypothesis is indicated by its residual error σ_i^2, which can be calculated as follows:

σ_i^2 = (1/N_i) Σ_{(x,y)∈R_i} ‖V_i − V_ai‖^2,   (4)

where V_i = (u_i, v_i) is the estimated flow vector, V_ai is the flow vector derived from the motion hypothesis, and N_i is the number of pixels in the analysis region. Then σ_i^2 can be used as a criterion to evaluate whether a flow region is fit for motion model fitting. Note that to reflect the distracting effect of independent motions, at least one dimension of the dividing regions should be set slightly larger than that of the possible independent motion area. Given a prescribed threshold, flow regions with greater residual errors can then be rejected as outliers.
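A minimal sketch of the region-wise hypothesis fitting and residual computation (Eqs. 3 and 4), with hypothetical names and a synthetic flow field of our own:

```python
import numpy as np

def region_residuals(u, v, rh, rw):
    """Fit an affine hypothesis to each non-overlapping rh x rw region
    (Eq. 3) and return its residual error sigma_i^2 (Eq. 4)."""
    h, w = u.shape
    sigmas = []
    for r0 in range(0, h - rh + 1, rh):
        for c0 in range(0, w - rw + 1, rw):
            ys, xs = np.mgrid[r0:r0 + rh, c0:c0 + rw]
            Phi = np.stack([xs.ravel(), ys.ravel(),
                            np.ones(rh * rw)], axis=1)
            ui = u[r0:r0 + rh, c0:c0 + rw].ravel()
            vi = v[r0:r0 + rh, c0:c0 + rw].ravel()
            a_u, *_ = np.linalg.lstsq(Phi, ui, rcond=None)
            a_v, *_ = np.linalg.lstsq(Phi, vi, rcond=None)
            # sigma_i^2: mean squared norm of V_i - V_ai over the region
            err = (ui - Phi @ a_u) ** 2 + (vi - Phi @ a_v) ** 2
            sigmas.append(err.sum() / (rh * rw))
    return np.array(sigmas)

# A globally affine flow, with one region corrupted by an independent motion
rng = np.random.default_rng(0)
y, x = np.mgrid[0:64, 0:64]
u = 0.1 * x + 2.0
v = -0.1 * y + 1.0
u[16:32, 16:32] += rng.normal(0, 2.0, (16, 16))  # simulated moving object
sigmas = region_residuals(u, v, 16, 16)
```

In this toy setup the corrupted region yields a residual error orders of magnitude above the others, which is exactly the separation the threshold in Sec. 4 exploits.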

4. Selection of Optical Flows for Fitting

A troublesome problem during optical flow valuation and outlier removal is how to choose an appropriate threshold, which varies with the region size. In fact, if the region size is set larger, a higher threshold should be chosen, since the residual error defined in Eq. 4 will increase, and vice versa. Empirically, for aerial imagery with a dominant global motion, selecting a small proportion (e.g., 5%) of reliable optical flows is sufficient to arrive at an accurate motion model. Therefore, instead of trying to find a proper threshold for outlier removal, we prefer to select a small set of reliable flows with relatively small residual errors so that the flow outliers are excluded. Since the chosen proportion criterion is usually constant across different aerial imagery, the bothersome threshold selection work can be avoided. The flow valuation and selection algorithm is outlined below:

  • (i) Set an initial threshold Ω = Ω_0 (usually assigned a very small value) and a search step ΔΩ. Set the proportion criterion to β.

  • (ii) Divide the optical flow field into non-overlapping regions and compute the residual errors σ_i^2, i = 1, 2, …, M, according to Eq. 4, where M is the number of regions.

  • (iii) Find the flow regions that satisfy σ_i^2 < Ω, and denote the number of selected regions by N.

  • (iv) If N ≥ Mβ, stop searching; Ω is the final threshold. Otherwise, set Ω = Ω + ΔΩ and go to step (iii).
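The four steps above can be sketched as follows; the names and example values are ours, and a real implementation would also guard against a β that the search cannot satisfy:

```python
import numpy as np

def select_regions(sigmas, beta, omega0, d_omega):
    """Steps (i)-(iv): grow the threshold Omega from Omega_0 in increments
    of Delta-Omega until at least a proportion beta of the M regions
    falls below it; return the final threshold and the selected indices."""
    M = len(sigmas)
    omega = omega0
    while np.sum(sigmas < omega) < M * beta:   # steps (iii) and (iv)
        omega += d_omega
    return omega, np.flatnonzero(sigmas < omega)

# Six regions; with beta = 0.5 the three most reliable ones are selected.
sigmas = np.array([0.02, 1.5, 0.01, 3.0, 0.03, 0.9])
omega, selected = select_regions(sigmas, beta=0.5, omega0=0.005, d_omega=0.005)
```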

Once the threshold Ω is determined, reliable flow regions are extracted and flow outliers are rejected at the same time. Finally, by fitting the selected optical flows again, a more accurate global motion model is obtained. Since only a small number of flows participate in the fitting, algorithm efficiency is further improved.

5. Experimental Results

We first test the performance of our algorithm by registering the two aerial images shown in Fig. 1. Obviously, there are both large scale and rotation changes between Fig. 1a and Fig. 1b. Figures 1c and 1d demonstrate two optical flow selection results (denoted by the areas outlined in black) obtained with the proposed algorithm under two different proportion criteria. The optical flow field is calculated under a five-layered Gauss pyramid, and the computation proceeds only to the fourth layer, so the actual size of the optical flow field is 128×128. The same region size of 16×16 is used in both cases, and the proportion criteria are set to 5% and 30%, respectively. Accordingly, the final threshold is determined as 0.0965 in Fig. 1c, where three flow regions are selected, while 19 regions are selected in Fig. 1d with the threshold determined as 1.5084. By fitting the selected optical flows to an affine model, we get a_u^T = (1.2791, 0.2644, 8.2940), a_v^T = (0.2769, 1.2855, 78.5839) in the first case and a_u^T = (1.1744, 0.1175, 9.6717), a_v^T = (0.1739, 1.1751, 52.0925) in the second. It is evident from the registration results shown in Figs. 1e and 1f that the global motion estimation in Fig. 1e is much more accurate: fewer but more precise optical flows were chosen in Fig. 1c for motion model fitting, whereas the involvement of more, likely imprecise, flows in Fig. 1d leads to the inaccurate fitting result in Fig. 1f.

Fig. 1

Example of aerial image registration. (a) The first image, 256×256. (b) The second image, 256×256. (c) Optical flow selection result 1. (d) Optical flow selection result 2. (e) Registration result 1. (f) Registration result 2.


Figure 2 shows another example of our algorithm, applied to independent motion detection. To detect the moving truck from the two consecutive frames shown in Fig. 2a and Fig. 2b, the apparent background or global motion has to be compensated. For optical flow selection and outlier removal, the region size is set as 16×12, the proportion criterion is 5%, and the final determined threshold is 0.0409, with five regions selected. It is clear from Fig. 2c that the regions containing the independent motion of the moving truck have been successfully excluded. In fact, the multiscale optical flow computation proceeds only to the third layer of a four-layered Gauss pyramid, indicating that the real size of the optical flow field in Fig. 2c is 160×120. By fitting the selected optical flows to an affine motion model, we get a_u^T = (1.0019, 0.0026, 7.6282), a_v^T = (0.0044, 1.0002, 5.5330). Figure 2d shows the residual motion image8 after the global motion has been compensated, from which we see the moving truck can be easily detected.
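A possible sketch of the global motion compensation behind the residual motion image, under the assumption that (u, v) = (Φ a_u, Φ a_v) denotes the mapped pixel coordinate in the second frame (consistent with the near-identity parameters reported above); names and the nearest-neighbour sampling are our simplifications:

```python
import numpy as np

def residual_motion(frame1, frame2, a_u, a_v):
    """Warp frame1 by the fitted global affine motion and return the
    absolute difference with frame2; independently moving objects
    remain as large residuals."""
    h, w = frame1.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # mapped coordinates (u, v) = (Phi a_u, Phi a_v), rounded and clipped
    xw = np.clip(np.rint(a_u[0] * xs + a_u[1] * ys + a_u[2]), 0, w - 1).astype(int)
    yw = np.clip(np.rint(a_v[0] * xs + a_v[1] * ys + a_v[2]), 0, h - 1).astype(int)
    return np.abs(frame2 - frame1[yw, xw])

# Sanity check with a pure 2-pixel horizontal translation
f1 = np.random.default_rng(1).random((16, 16))
f2 = f1[:, np.clip(np.arange(16) + 2, 0, 15)]
res = residual_motion(f1, f2, (1.0, 0.0, 2.0), (0.0, 1.0, 0.0))
```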

Fig. 2

Example of independent motion detection. (a) The first frame, 320×240. (b) The second frame, 320×240. (c) Optical flow selection result. (d) Residual motion image after global motion compensation.


6. Conclusions

Exploiting the characteristics of aerial imagery, a robust global motion estimation algorithm based on fitting the optical flow field is proposed in this letter. Since optical flow outliers are almost completely removed by choosing a small proportion of reliable flows for motion model fitting, both the accuracy and the robustness of global motion estimation are greatly increased. Experimental results on both aerial image registration and independent motion detection show the effectiveness of our algorithm.

References

1.  R. Kumar and H. Sawhney, "Aerial video surveillance and exploitation," Proc. IEEE 89(10), 1518–1539 (2001).

2.  J. Bergen, P. Anandan, K. Hanna, and R. Hingorani, "Hierarchical model-based motion estimation," Proc. ECCV, pp. 237–252 (1992).

3.  E. E. Kang, I. Cohen, and G. Medioni, "Fast and robust 2D parametric image registration," Tech. Report IRIS-03-421, Institute for Robotics and Intelligent Systems, University of Southern California (2003).

4.  S. Kumar, M. Biswas, and T. Q. Nguyen, "Global motion estimation in frequency and spatial domain," Proc. IEEE ICASSP, pp. 333–336 (2004).

5.  A. Lacey, N. Pinitkarn, and N. Thacker, "An evaluation of the performance of RANSAC algorithms for stereo camera calibration," Proc. BMVC, pp. 646–655 (2000).

6.  L. Gaucher and G. Medioni, "Accurate motion flow estimation with discontinuities," Proc. IEEE ICCV, pp. 695–702 (1999).

7.  R. Pless, T. Brodsky, and Y. Aloimonos, "Independent motion: The importance of history," Proc. IEEE CVPR 2, 92–97 (1999).

8.  M. Irani, B. Rousso, and S. Peleg, "Computing occluding and transparent motions," Int. J. Comput. Vis. 12(1), 5–16 (1994).

© (2005) Society of Photo-Optical Instrumentation Engineers (SPIE)
Dong Xu, Jinwen An, "Robust global motion estimation method for aerial imagery," Optical Engineering 44(9), 090501 (1 September 2005). https://doi.org/10.1117/1.2042479