Fast, simple, and good pan-sharpening method
9 August 2013
Abstract
Pan-sharpening of optical remote sensing multispectral imagery aims to include spatial information from a high-resolution image (high frequencies) into a low-resolution image (low frequencies) while preserving the spectral properties of the low-resolution image. From a signal processing view, a general fusion filtering framework (GFF) can be formulated, which is well suited to the fusion of multiresolution and multisensor data such as optical-optical and optical-radar imagery. To reduce computation time, a simple and fast variant of GFF, the high-pass filtering method (HPFM), is proposed, which performs filtering in the signal domain and thus avoids time-consuming FFT computations. A new joint quality measure, combining a spectral and a spatial measure through a proper normalization of their value ranges, is proposed for quality assessment. The quality and speed of six pan-sharpening methods—component substitution (CS), Gram-Schmidt (GS) sharpening, Ehlers fusion, Amélioration de la Résolution Spatiale par Injection de Structures (ARSIS), GFF, and HPFM—were evaluated on WorldView-2 satellite remote sensing data. Experiments showed that the HPFM method outperforms all the other fusion methods used in this study, even its parent method GFF. Moreover, it is more than four times faster than the GFF method and competitive in speed with the CS and GS methods.

1. Introduction

Multiresolution image fusion, also known as pan-sharpening, aims to include spatial information from a high-resolution image, e.g., a panchromatic or synthetic aperture radar (SAR) image, into a low-resolution image, e.g., a multispectral or hyperspectral image, while preserving the spectral properties of the low-resolution image. A large number of algorithms and methods to solve this problem were introduced during the last two decades. They can be divided into two large groups: those based on a linear spectral transformation followed by a component substitution (CS)1 and those based on a spatial frequency decomposition, usually performed by means of high-pass filtering (HPF)2,3 or multiresolution analysis.4,5,6 It is sometimes quite difficult to navigate among all these methods, though some classification attempts have already been made.6,7,8 We propose to look at these methods from a signal processing view. This type of analysis allowed us to recognize similarities and differences of various methods quite easily and thus to perform a systematic classification of most known multiresolution image fusion approaches and methods.9 Moreover, this analysis resulted in a general fusion filtering framework (GFF) for multiresolution image fusion. State-of-the-art methods are sometimes quite time consuming, which restricts their use in practice. To reduce computation time, a simple and fast variant of the GFF method, further called the high-pass filtering method (HPFM), is proposed in this paper. It performs filtering in the signal domain and thus avoids time-consuming FFT computations.

In parallel to the development of pan-sharpening methods, many attempts have been undertaken to quantitatively assess their quality, usually using measures originating from image processing such as the mean square error, cross-correlation (CC), the structural similarity (SSIM) index,10 Wald’s protocol,11 and, more recently, joint measures: the product of a spectral measure based on SSIM and a spatial measure based on CC12 and the quality with no reference (QNR) measure.13 Some comparisons are presented in Refs. 14 and 15. Recently a statistical evaluation of the most popular pan-sharpening quality assessment measures was performed in Refs. 16, 17, and 18. To measure two different image properties such as spectral and spatial quality, at least two measures are needed, which makes the task of ranking different fusion methods not easy. Following these results, a new joint quality measure (JQM) based on both spectral and spatial quality measures is proposed. The measure is enhanced by a practical normalization of the individual measures and is successfully used for quantitative fusion quality assessment in this paper.

The paper is organized as follows. First, four pan-sharpening methods—GFF, HPFM, CS, and the Ehlers fusion method—are introduced. Then, a new JQM based on both spectral and spatial quality measures is described. Finally, experiments with very-high-resolution WorldView-2 (WV-2) satellite optical remote sensing data are presented, followed by discussion and conclusions.

2. Methods

In this section the following four pan-sharpening methods are described: GFF, HPFM, CS, and the Ehlers fusion method.

2.1. General Fusion Filtering

Here the GFF method is briefly introduced in order to better understand the rationale behind the new image fusion method introduced in Sec. 2.2. For a detailed description of the GFF method, see Ref. 9.

Let us denote by msk a low-resolution image, which can be a multispectral/hyperspectral or any other image, with bands k = 1, …, n (n ∈ {1, 2, …}), and by pan a high-resolution image, e.g., a panchromatic band, the intensity image of a synthetic aperture radar (SAR), or any other image. Many existing multiresolution methods or algorithms can be seen as an implementation of a general fusion framework:

  • Low-resolution image interpolation: msik = I(msk);

  • Fusion: msfk = F(msik, pan);

  • Histogram matching: msfk = M(msfk, msk).

Each band is processed independently; further indices are omitted intentionally for the sake of clarity. The first and third steps can be included in the fusion step, depending on the method. Usually, I is a bilinear (BIL) or cubic convolution (CUB) interpolation, and F is a linear or other function of the images. In the following we formulate the GFF fusion method, including interpolation and fusion in one step.
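The three framework steps I, F, and M can be sketched in a few lines of NumPy. This is our own minimal illustration, not the author's Xdibias implementation: the bilinear interpolator and the moment-based histogram matching are simple stand-ins, and the fusion function F is passed in as a callable.

```python
import numpy as np

def interpolate_bilinear(band, factor):
    """Step I: upsample a low-resolution band by an integer factor."""
    h, w = band.shape
    ys = np.linspace(0, h - 1, h * factor)
    xs = np.linspace(0, w - 1, w * factor)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    a, b = band[np.ix_(y0, x0)], band[np.ix_(y0, x1)]
    c, d = band[np.ix_(y1, x0)], band[np.ix_(y1, x1)]
    return (1 - wy) * ((1 - wx) * a + wx * b) + wy * ((1 - wx) * c + wx * d)

def match_moments(fused, reference):
    """Step M: simple stand-in for histogram matching -- match the mean and
    standard deviation of the fused band to the original low-resolution band."""
    z = (fused - fused.mean()) / (fused.std() + 1e-12)
    return z * reference.std() + reference.mean()

def pan_sharpen(ms, pan, fuse):
    """General framework: I, F, M applied independently per band.
    ms has shape (bands, h, w); pan has shape (h*factor, w*factor)."""
    factor = pan.shape[0] // ms.shape[1]
    out = [match_moments(fuse(interpolate_bilinear(b, factor), pan), b) for b in ms]
    return np.stack(out)
```

With fuse = lambda msi, p: msi the framework degenerates to pure interpolation, which is the baseline evaluated in Sec. 5.1.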

In order to preserve the spectral properties of a low-resolution image ms, one should add only high-frequency information from the high-resolution image pan, thus preventing mixing of the low-frequency information of pan into the low-resolution image. The natural way to do this is in the spectral or Fourier domain (signal processing view).

First, both images are transformed into spectral/Fourier space: MS = FFT(ms) and PAN = FFT(pan). Then, high-frequency components are extracted from PAN (solid line, Fig. 1) and added to the zero-padded spectrum of MS (dotted line, Fig. 1). The formula is written as

Eq. (1)

MSF = ZP(W·MS) + PAN·HPF,
where ZP stands for zero padding, W(f) = α + (1 − α)·cos(2πf/PBWLR) is a Hamming window with α = 0.54 (aliasing/ringing avoidance), PBWLR is the processing bandwidth of the low-resolution image, and HPF is a high-pass filter. The cutoff frequency of the HPF allows us to control the amount of detail added to the low-resolution image. Equivalently, we can rewrite Eq. (1) for a low-pass filter (LPF) as

Eq. (2)

MSF = ZP(W·MS) + PAN·(1 − LPF).

Fig. 1

Addition of spectra of high-resolution (HR, solid line) and low-resolution (LR, dotted line) images. PBW stands for processing bandwidth, f for frequency, and fcutoff_HR for cutoff frequency of a high-pass filter.


Finally, the inverse Fourier transform delivers a fused image with an enhanced spatial resolution

Eq. (3)

msf = FFT⁻¹(MSF).

Thus the GFF method performs image fusion in the Fourier domain. Due to the known equivalence of multiplication in the spectral domain and convolution in the signal domain, and assuming that interpolation is performed in the signal domain, we can rewrite Eqs. (1) to (3) as

Eq. (4)

msf = msi + pan*hpf = msi + pan − pan*lpf.

Equation (4) defines the fusion function F introduced above and helps to better understand the relation of the proposed method to already known methods.9 Note that Eq. (4) is not exactly the same as GFF due to the different interpolation methods used.

The GFF method was first introduced in Ref. 9, where its similarities and differences to the high-pass filtering method of Ref. 2 are discussed.

Thus the only parameter to be selected is the cutoff frequency fcutoff_HR controlling the amount of filtering. In this paper a Gaussian filter is used:

Eq. (5)

LPF = exp[−(1/2)·(f/fcutoff_HR)²].

Of course, any other filter, e.g., Butterworth, can be used. Our experience showed no significant influence of the filter type on the fusion quality. Thus the GFF method depends only on the filtering parameter fcutoff_HR. We will see in Sec. 4.4 that the optimal value of fcutoff_HR is 0.15 for WV-2 satellite data, which is well supported by other studies (e.g., Ref. 19) and existing experience. Further studies on more data are planned. Moreover, we note that a band-dependent filtering parameter can be easily implemented and can increase the quality of pan-sharpening, as already known from other studies.20
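A minimal NumPy sketch may clarify the Fourier-domain pipeline of Eqs. (1) to (3) with the Gaussian filter of Eq. (5). It is a schematic approximation, not the original implementation: the radial frequency grid, the spectrum scaling factor after padding, and the window clipping are our assumptions.

```python
import numpy as np

def gaussian_lpf(shape, f_cutoff):
    """Gaussian low-pass filter of Eq. (5) on a normalized frequency grid."""
    fy = np.fft.fftfreq(shape[0])[:, None]
    fx = np.fft.fftfreq(shape[1])[None, :]
    f = np.sqrt(fy ** 2 + fx ** 2)
    return np.exp(-0.5 * (f / f_cutoff) ** 2)

def gff_fuse_band(ms_band, pan, f_cutoff=0.15, alpha=0.54):
    """Eqs. (1)-(3): zero-pad the Hamming-windowed MS spectrum into the
    PAN-sized grid and add the high-pass filtered PAN spectrum."""
    H, W = pan.shape
    h, w = ms_band.shape
    # centered (fftshifted) spectrum of the low-resolution band
    MS = np.fft.fftshift(np.fft.fft2(ms_band))
    # Hamming window W(f) = alpha + (1 - alpha) * cos(2*pi*f), f in LR units
    fy = (np.arange(h) - h // 2)[:, None] / h
    fx = (np.arange(w) - w // 2)[None, :] / w
    f = np.minimum(np.sqrt(fy ** 2 + fx ** 2), 0.5)
    Wnd = alpha + (1 - alpha) * np.cos(2 * np.pi * f)
    # ZP: embed the windowed spectrum in the center of the full-size grid;
    # the (H*W)/(h*w) factor keeps unnormalized FFT amplitudes comparable
    ZP = np.zeros((H, W), dtype=complex)
    ZP[(H - h) // 2:(H + h) // 2, (W - w) // 2:(W + w) // 2] = \
        MS * Wnd * (H * W) / (h * w)
    HPF = 1.0 - gaussian_lpf((H, W), f_cutoff)
    MSF = np.fft.ifftshift(ZP) + np.fft.fft2(pan) * HPF
    return np.real(np.fft.ifft2(MSF))
```

Since the HPF is exactly zero at DC and the window is one there, the mean (overall brightness) of the low-resolution band is preserved by construction.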

2.2. High-Pass Filtering Method

As seen from Eq. (4), it is possible to avoid the Fourier transform in the GFF method in order to reduce computation time. Thus a simple and fast variant of the GFF method, further called HPFM, can be derived as follows:

  1. Instead of zero padding in the spectral domain, an interpolation of the multispectral image in the signal domain is performed.

  2. A high-pass filter HPF or a low-pass filter LPF with the desired properties is built in the spectral domain and then transformed into the signal domain (hpf or lpf, respectively).

  3. Eq. (4) is applied for image fusion in the signal domain using convolution with the designed filter.

  4. Finally, histogram matching is performed.

The proposed method differs from the method introduced in Ref. 2 in the way the filter is constructed and the histogram matching is performed. Usually a simple boxcar filter is used in the signal domain, which makes fine control of the amount of filtering difficult. For the HPFM the same filters as for the GFF method, e.g., Eq. (5), can be used. Note that due to the addition/subtraction operations in Eq. (4), the HPFM can be interpreted as an additive model. The multiplicative model can be written as

Eq. (6)

msf = (msi/(pan*lpf))·pan.

Thus the HPFM method depends on the filtering parameter fcutoff_HR, the model, and the interpolation method.
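The four HPFM steps can be sketched as follows in NumPy. This is a hedged illustration, not the original Xdibias code: the kernel size, the edge padding, and the small stabilizing constant in the multiplicative model are our choices.

```python
import numpy as np

def spatial_kernel(f_cutoff=0.15, size=15):
    """Step 2: build the Gaussian LPF of Eq. (5) on a frequency grid and
    transform it into a small spatial convolution kernel (odd size)."""
    f1 = np.fft.fftfreq(size)
    fy, fx = np.meshgrid(f1, f1, indexing="ij")
    lpf_spec = np.exp(-0.5 * (fy ** 2 + fx ** 2) / f_cutoff ** 2)
    kernel = np.fft.fftshift(np.real(np.fft.ifft2(lpf_spec)))
    return kernel / kernel.sum()

def convolve_same(img, kernel):
    """Naive 'same'-size convolution with edge replication."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(kh):
        for dx in range(kw):
            out += kernel[dy, dx] * padded[dy:dy + img.shape[0],
                                           dx:dx + img.shape[1]]
    return out

def hpfm_band(msi, pan, f_cutoff=0.15, model="additive"):
    """Step 3: Eq. (4) (additive) or Eq. (6) (multiplicative) fusion of one
    interpolated MS band with the panchromatic band, in the signal domain."""
    pan_low = convolve_same(pan, spatial_kernel(f_cutoff))
    if model == "additive":
        return msi + pan - pan_low            # msi + pan*hpf
    return msi / (pan_low + 1e-12) * pan      # multiplicative variant
```

hpfm_band expects msi already interpolated to the pan grid (step 1); the final histogram matching (step 4) is omitted here for brevity.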

2.3. Component Substitution

The CS method is one of the simplest and perhaps oldest image fusion methods. Here a short description following a recent enhancement of the intensity-hue-saturation transformation method1 is given. Under the assumption that msi and pan are highly correlated, one can calculate the intensity, or mean, as

Eq. (7)

i = (1/n)·Σ_{k=1}^{n} msik,
where n is the number of multispectral bands.

Now CS fusion under the assumption of Eq. (7) can be written as

Eq. (8)

msf = msi − i + pan.

As for the HPFM, this fusion can be seen as an additive model. The multiplicative model can be written as

Eq. (9)

msf = (msi/i)·pan.

Thus only the model and the interpolation method need to be selected for this method.
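Eqs. (7) to (9) amount to only a few lines. This sketch assumes the interpolated bands are stacked in a (bands, H, W) array; the small constant guarding the division is our addition.

```python
import numpy as np

def cs_fuse(msi, pan, model="additive"):
    """Component substitution: Eq. (7) intensity as the band mean,
    then Eq. (8) (additive) or Eq. (9) (multiplicative) substitution.
    msi is the interpolated MS cube of shape (bands, H, W)."""
    i = msi.mean(axis=0)                # Eq. (7)
    if model == "additive":
        return msi - i + pan            # Eq. (8): msi - i + pan
    return msi / (i + 1e-12) * pan      # Eq. (9): (msi/i) * pan
```

If pan happens to equal the intensity i, both models return msi unchanged, which illustrates why the method relies on the high correlation assumption.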

2.4. Ehlers Fusion

For a detailed description of the Ehlers fusion method, see Refs. 3 and 19. Its relation to the GFF method is discussed in Ref. 9. The Ehlers method depends on three parameters: the interpolation method (cubic convolution was used, as recommended by the authors), the cutoff frequency fcutoff_HR for high-pass filtering of the panchromatic image, and the cutoff frequency fcutoff_I for low-pass filtering of the intensity calculated according to Eq. (7). For the Ehlers fusion in this paper, the Gaussian filter [Eq. (5)] was again used for both types of filtering. Note that our own implementation of the method, further called EhlersX, in the Xdibias software environment was used; its filtering is implemented in the spectral domain. Validation against the original Ehlers MATLAB implementation produced comparable results. As for the other filtering methods, GFF and HPFM, the optimal parameter values for fcutoff_HR and fcutoff_I are 0.15 for WV-2 data, which coincides well with the experience of the authors of the method.

3. Data

WV-2 satellite remote sensing data over the city of Munich in southern Germany were used in our experiments. For scene details see Table 1.

Table 1

Scene parameters for WorldView-2 data over Munich city.

Parameter | Value
Image date | July 12, 2010
Image time (local) | 10:30:17
Mode | PAN + MS
Look angle | 5.2 deg left
Product | L2A
Resolution PAN (m) | 0.5
Resolution MS (m) | 2.0

4. Quality Measures

The quality of pan-sharpening is usually measured by spectral and/or spatial quality measures to cover both attributes of a processing result. Measures calculated for the whole image are called global. Window-based measures are calculated in selected areas or, e.g., using a sliding window, and can distinguish image parts of different quality. The latter measures are outside the scope of this paper.

4.1. Spectral Quality

Many spectral quality measures have already been proposed in the literature, e.g., Refs. 5 and 15. A recent comparison17,18 showed that the correlation (CORR) between the original spectral bands and the corresponding low-pass filtered and subsampled pan-sharpened bands is one of the best. It measures the spectral quality, or spectral preservation, of a pan-sharpening method for individual bands or, by averaging, for all bands:

Eq. (10)

CORR = (1/n)·Σ_k CC(msk, fmsk*lpf).

It has high values (optimal value 1) for good spectral preservation and low values for poor preservation. This measure alone cannot assess the overall quality of a fusion result, because it is calculated only at the reduced image resolution/scale.

4.2. Spatial Quality

The same investigation17,18 showed a preference for the SSIM between the original panchromatic band and the pan-sharpened bands for spatial quality assessment:

Eq. (11)

SSIM = (1/n)·Σ_k SSIM(pan, fmsk).

It exhibits high values (optimal value 1) for high spatial quality and low values for low spatial quality. Note that due to the different spectral widths of the multispectral and panchromatic bands, the CC as proposed in Ref. 12 may not be sufficient because of possible differences in mean and standard deviation. SSIM accounts for such differences much better.
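The two measures of Eqs. (10) and (11) can be sketched as follows. For brevity we use a single-window (global) SSIM, whereas the standard SSIM10 averages the statistic over local sliding windows, so values from this sketch will differ from those reported in Tables 3 and 4.

```python
import numpy as np

def cc(a, b):
    """Cross-correlation coefficient CC of two images."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def ssim_global(x, y, data_range=255.0, k1=0.01, k2=0.03):
    """Single-window (global) SSIM with the usual k1, k2 constants."""
    c1, c2 = (k1 * data_range) ** 2, (k2 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()
    return float(((2 * mx * my + c1) * (2 * cxy + c2))
                 / ((mx ** 2 + my ** 2 + c1) * (vx + vy + c2)))

def corr_measure(ms_bands, fused_degraded):
    """Eq. (10): mean CC between original MS bands and the low-pass
    filtered, subsampled pan-sharpened bands."""
    return float(np.mean([cc(m, f) for m, f in zip(ms_bands, fused_degraded)]))

def ssim_measure(pan, fused_bands):
    """Eq. (11): mean SSIM between the panchromatic band and fused bands."""
    return float(np.mean([ssim_global(pan, f) for f in fused_bands]))
```

Both measures reach their optimal value of 1 when the compared images are identical, as the text above notes.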

4.3. Joint Quality Measure

In an ideal case, a pan-sharpening method should exhibit both high spectral and high spatial quality measure values. In practice this is not possible: for the GFF method (and the other filtering methods as well), different parameters (amounts of filtering) lead to different image qualities. A larger high-pass filtering parameter value leads to a higher spectral quality while reducing the spatial quality, and vice versa [see Fig. 2(b)]. None of the known separate quality measures can fulfill this requirement as a sole measure.14 Thus a JQM could be helpful for an optimal parameter selection, for finding the best trade-off between spectral and spatial quality, or for finding the best method for a particular application.

Fig. 2

Joint quality measure JQM (a) and separate quality measures (b) CORR and SSIM of HPFM pan-sharpening method with filtering parameters ranging from 0.05 to 0.7 for WorldView-2 Munich data.


One could think of a simple average or product12 of both measures: one for spectral and one for spatial quality (in that particular case, SSIM for spectral quality and CC for spatial quality). A quality measure derived in such a way can easily be biased due to the different value ranges of the separate measures. Moreover, CC for spatial quality can be insufficient for data exhibiting different spectral properties, as already stated in Sec. 4.2.

In this paper we propose a new JQM based on CORR for spectral quality10 and SSIM for spatial quality,11 the measures resulting from the discussion in Secs. 4.1 and 4.2.

Due to the different ranges of the two measures [SSIM is usually lower than CORR, as can be seen in Fig. 2(b)], we propose to normalize one of the measures before averaging (producing a joint measure) using a linear scaling transform

Eq. (12)

SSIMnorm = (SSIM − SSIMmin)/(SSIMmax − SSIMmin)·(CORRmax − CORRmin) + CORRmin,
where SSIMmin stands for the minimum of all SSIM values; the other minimum and maximum values are defined similarly. For example, these values can be calculated from the results of the HPFM method using different filtering parameters (see Sec. 4.4). We note that mean and standard deviation values (standardization) could be used for normalization instead of the extreme range values, at the risk of errors due to an insufficient number of samples.

Now averaging the spectral and the normalized spatial measure

Eq. (13)

JQM = (CORR + SSIMnorm)/2
delivers a much more meaningful JQM, which is more suitable for parameter selection or comparison of different methods.

4.4. How to Normalize Quality Measures?

For the normalization as proposed in Eq. (12), we need to define four parameters: the minimum and maximum values of the two variables CORR and SSIM. These can be derived in different ways, e.g., from existing experience (subjective) or from experiments with many different methods exhibiting different fusion quality and thus covering the whole range of spectral and spatial qualities. We follow the latter approach with the following optimizations. Due to the diversity of content in an image (remote sensing scene), we propose a data-driven estimation of the extreme values. To reduce computation time, we propose to use a single filtering-based method, e.g., HPFM, with two different parameters producing the best and worst qualities: a small parameter value (0.05) for high spatial and at the same time low spectral quality, and a large parameter value (0.7) for low spatial and at the same time high spectral quality [see Fig. 2(b)]. Thus only two runs of the very fast HPFM method deliver the required four parameters. The JQM suggests a parameter value around 0.15 as optimal for HPFM [Fig. 2(a)].

4.5. How to Use JQM for Comparison of Different Methods?

The JQM introduced in Eq. (13) requires knowledge of the normalized SSIMnorm, which is not available for all methods. As shown in the previous section, this is not even necessary, because the extreme values can be derived using a reference method, e.g., HPFM. Then we can rewrite Eq. (13) using Eq. (12) as

Eq. (14)

JQM = (CORR + A·SSIM + B)/2,
where

Eq. (15)

A = (CORRmax − CORRmin)/(SSIMmax − SSIMmin),

Eq. (16)

B = −SSIMmin·(CORRmax − CORRmin)/(SSIMmax − SSIMmin) + CORRmin.

Applying the extreme value ranges as proposed above (Sec. 4.4) results in the A and B values listed in Table 2, which are data dependent but very easy and fast to derive. Thus the proposed JQM [Eq. (14)] can be used for estimating the quality of any pan-sharpening method on these data.

Table 2

Estimated values of A and B according to Eqs. (15) and (16) from the proposed extreme range values used for calculation of JQM.

Parameter | Value
A | 0.6786
B | 0.42
CORRmin | 0.9508
CORRmax | 1.0
SSIMmin | 0.7822
SSIMmax | 0.8547
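As a quick check, the A and B values of Table 2 follow directly from the four extreme values via Eqs. (15) and (16):

```python
# Reproducing A and B of Table 2 from the extreme values, Eqs. (15) and (16).
corr_min, corr_max = 0.9508, 1.0
ssim_min, ssim_max = 0.7822, 0.8547

A = (corr_max - corr_min) / (ssim_max - ssim_min)                         # Eq. (15)
B = -ssim_min * (corr_max - corr_min) / (ssim_max - ssim_min) + corr_min  # Eq. (16)

print(round(A, 4), round(B, 4))  # prints: 0.6786 0.42
```

Note the leading minus sign in Eq. (16); it is what makes Eq. (14) algebraically identical to Eqs. (12) and (13).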

An additional margin of 10% added to these extreme values (to avoid out-of-range values) ensures an appropriate comparison or quality assessment of other pan-sharpening methods. The proposed normalization of quality measures is thus data dependent, which means it should be performed individually for each image/scene. This is not a great drawback, because the normalization needs to be estimated only once per scene, and a fast fusion method such as HPFM with different parameter settings can be used.

Here we have to note that there exists one more joint measure, QNR (Ref. 13), which is not included in this study but is one of the topics of a forthcoming paper.

5. Experiments

In this section we compare different pan-sharpening methods using the JQM as proposed in the previous section. Two more well-known fusion methods were added to the comparison: an Amélioration de la Résolution Spatiale par Injection de Structures (ARSIS) (Ref. 21) variant, the À Trous wavelet transform model for wavelet fusion4 (implementation of Ref. 22), and Gram-Schmidt (GS) spectral sharpening implemented in ENVI software with an averaging method for the low-resolution file calculation and bilinear resampling. Results of the quality assessment for the various methods are presented in the next two sections using the values of A and B from Table 2. Note that for the quality assessment only bands spectrally overlapping with the panchromatic band are used to assure physically justified results.8 Thus the following three bands were excluded from further analysis: coastal, NIR1, and NIR2.

5.1. Comparison of Interpolation Methods

In many papers, in addition to the assessment of fusion methods, quality measures for interpolation alone (resampling of the low-resolution image to the high-resolution grid) are also reported. A comparison of the four most popular (fast enough for operational applications) interpolation methods is presented in Table 3 and visualized in Fig. 3.

Table 3

Quality measures of various interpolation methods for WorldView-2 Munich data.

Interpolation method | CORR | SSIM | JQM
Nearest neighbor | 0.9702 | 0.7860 | 0.9618
Zero padding | 0.9781 | 0.7542 | 0.9550
Bilinear | 0.9948 | 0.7659 | 0.9673
Cubic convolution | 0.9934 | 0.7420 | 0.9585

Fig. 3

Joint quality measure JQM (a) and separate quality measures (b) CORR and SSIM of different interpolation methods: (1) nearest neighbor, (2) zero padding, (3) bilinear interpolation, and (4) cubic interpolation for WorldView-2 Munich data.


The BIL and CUB interpolation methods exhibit the best spectral quality (CORR), which is supported by existing experience. More surprising are the spatial quality (SSIM) results: here the best is nearest neighbor, followed by BIL. The JQM suggests BIL as the interpolation method for pan-sharpening, which agrees well with existing experience. Note that SSIM is designed for quality assessment of fusion methods, whereas an interpolation method is not a fusion method and contains no information from the panchromatic band. Thus the SSIM and JQM results for interpolation methods should be treated cautiously and cannot be compared directly with the results of the next section.

5.2. Comparison of Pan-Sharpening Methods

A comparison of the quality and speed of six pan-sharpening methods (some of them with different parameters) is presented in Table 4 and visualized in Fig. 4.

Table 4

Quality measures and computation time of various pan-sharpening methods for WorldView-2 Munich data on Intel Core 2 Quad CPU Q9450 at 2.66 GHz.

# | Method | Model | Interpolation method | fcutoff_HR | fcutoff_I | CORR | SSIM | JQM | Time (s)
1 | GFF | - | Zero padding | 0.15 | - | 0.9782 | 0.8362 | 0.9828 | 85.1
2 | HPFM | Additive | Bilinear | 0.15 | - | 0.9866 | 0.8337 | 0.9862 | 19.8
3 | HPFM | Additive | Cubic convolution | 0.15 | - | 0.9873 | 0.8318 | 0.9859 | 22.1
4 | HPFM | Multiplicative | Bilinear | 0.15 | - | 0.9872 | 0.8359 | 0.9872 | 20.3
5 | HPFM | Multiplicative | Cubic convolution | 0.15 | - | 0.9878 | 0.8346 | 0.9871 | 22.7
6 | HPFM | Additive | Bilinear | 0.05 | - | 0.9608 | 0.8447 | 0.9770 | 19.8
7 | HPFM | Additive | Bilinear | 0.7 | - | 0.9956 | 0.7922 | 0.9766 | 19.8
8 | CS | Additive | Bilinear | - | - | 0.9406 | 0.8207 | 0.9588 | 19.3
9 | CS | Multiplicative | Bilinear | - | - | 0.9358 | 0.8310 | 0.9598 | 19.7
10 | EhlersX | - | Cubic convolution | 0.15 | 0.15 | 0.9450 | 0.8491 | 0.9706 | 160.4
11 | ARSIS | - | - | - | - | 0.9501 | 0.8663 | 0.9790 | 134.8
12 | GS | - | Bilinear | - | - | 0.9453 | 0.8192 | 0.9606 | 18.5

Fig. 4

Joint quality measure JQM (a) and separate quality measures (b) CORR and SSIM of 12 pan-sharpening methods for WorldView-2 Munich data (see method number in Table 4).


We see that the HPFM method [all four parameter settings (methods 2 to 5)] outperforms its parent GFF method (method 1). This is because zero-padding interpolation smooths the multispectral image to a much greater extent than, e.g., bilinear interpolation (compare the CORR values in Table 3). The multiplicative model is better than the additive model, whereas the difference between the BIL and CUB interpolation methods is negligible. The optimal filtering parameter 0.15 [see Fig. 2(a)] was used for all filtering methods in this paper; existing experience supports this value even for other data and sensors. The two parameter settings 0.05 (method 6) and 0.7 (method 7) of HPFM were used to derive the normalizing constants A and B, because their results exhibit extreme spectral and spatial qualities, as seen in Fig. 4(b). Their JQM is comparable with that of the ARSIS method, known for its high spatial quality. The lowest JQM, as expected, is exhibited by both CS methods and GS sharpening. Ehlers fusion falls between this group and ARSIS. These observations are fully supported by existing experience and visual analysis (Fig. 5).

Fig. 5

Multispectral bilinear interpolated (bands: 5, 3, 2) and GFF, HPFM, CS, Ehlers, and ARSIS pan-sharpened images of WorldView-2 Munich data.


Additionally, the computation time of the methods is presented in Table 4 for a multispectral image of size 1024×1024, a panchromatic image of size 4096×4096, and all eight bands of the WV-2 Munich data. The proposed HPFM pan-sharpening method is more than four times faster than its parent GFF fusion method. Moreover, its speed is comparable with that of the classical CS and GS methods. The Ehlers and ARSIS fusion methods are about two times slower than GFF and thus less suitable for operational applications.

6. Conclusions

A simplified version of the GFF method—a fast, simple, and good HPFM—is introduced, with potential for operational remote sensing applications. It performs filtering in the signal domain, thus avoiding time-consuming FFT computations.

A new JQM based on both spectral and spatial quality measures (carefully selected from previous studies) is used to assess the fusion quality of six pan-sharpening methods—GFF, HPFM (with different parameter settings), CS, GS, Ehlers fusion, and ARSIS—on very-high-resolution WV-2 satellite optical remote sensing data. Only spectral bands whose spectrum overlaps with that of the panchromatic band were used for the quality assessment to ensure a physically consistent evaluation. The proposed JQM allows a comfortable ranking of different methods using a sole quality measure.

Experiments showed that the HPFM pan-sharpening method exhibits the best fusion quality among the popular methods tested (even better than its parent method GFF) and at the same time requires more than four times less computation time than the GFF method. Thus the HPFM method, being competitive in speed with known fast methods such as CS and GS while exhibiting much higher quality, is a good candidate for operational applications. The new JQM allowed a correct ranking of the different pan-sharpening methods, consistent with existing experience and visual analysis, and is thus a suitable quality measure both for selecting the parameters of a particular fusion method and for comparing different methods.

Acknowledgments

We would like to thank DigitalGlobe and European Space Imaging for the collection and provision of the WorldView-2 scene over the city of Munich.

References

1. T. M. Tu et al., “A new look at IHS-like image fusion methods,” Inform. Fusion 2(3), 177–186 (2001). http://dx.doi.org/10.1016/S1566-2535(01)00036-7

2. J. Hill et al., “A local correlation approach for the fusion of remote sensing data with different spatial resolution in forestry applications,” in Proc. of Int. Archives of Photogrammetry and Remote Sensing, 167–174 (1999).

3. S. Klonus and M. Ehlers, “Image fusion using the Ehlers spectral characteristics preservation algorithm,” GISci. Rem. Sens. 44(2), 93–116 (2007). http://dx.doi.org/10.2747/1548-1603.44.2.93

4. B. Aiazzi et al., “Context-driven fusion of high spatial and spectral resolution images based on oversampled multiresolution analysis,” IEEE Trans. Geosci. Rem. Sens. 40(10), 2300–2312 (2002). http://dx.doi.org/10.1109/TGRS.2002.803623

5. L. Alparone et al., “Comparison of pansharpening algorithms: outcome of the 2006 GRS-S data-fusion contest,” IEEE Trans. Geosci. Rem. Sens. 45(10), 3012–3021 (2007). http://dx.doi.org/10.1109/TGRS.2007.904923

6. B. Aiazzi et al., “A comparison between global and context-adaptive pansharpening of multispectral images,” IEEE Geosci. Rem. Sens. Lett. 6(2), 302–306 (2009). http://dx.doi.org/10.1109/LGRS.2008.2012003

7. Z. Wang et al., “A comparative analysis of image fusion methods,” IEEE Trans. Geosci. Rem. Sens. 43(6), 1391–1402 (2005). http://dx.doi.org/10.1109/TGRS.2005.846874

8. C. Thomas et al., “Synthesis of multispectral images to high spatial resolution: a critical review of fusion methods based on remote sensing physics,” IEEE Trans. Geosci. Rem. Sens. 46(5), 1301–1312 (2008). http://dx.doi.org/10.1109/TGRS.2007.912448

9. G. Palubinskas and P. Reinartz, “Multi-resolution, multi-sensor image fusion: general fusion framework,” in Proc. of Joint Urban Remote Sensing Event, 313–316 (2011).

10. Z. Wang et al., “Image quality assessment: from error visibility to structural similarity,” IEEE Trans. Image Process. 13(4), 600–612 (2004). http://dx.doi.org/10.1109/TIP.2003.819861

11. L. Wald, T. Ranchin, and M. Mangolini, “Fusion of satellite images of different spatial resolutions: assessing the quality of resulting images,” Photogramm. Eng. Rem. Sens. 63(6), 691–699 (1997).

12. C. Padwick et al., “WorldView-2 pan-sharpening,” in Proc. American Society for Photogrammetry and Remote Sensing, 13 (2010).

13. L. Alparone et al., “Multispectral and panchromatic data fusion assessment without reference,” Photogramm. Eng. Rem. Sens. 74(2), 193–200 (2008).

14. S. Li, Z. Li, and J. Gong, “Multivariate statistical analysis of measures for assessing the quality of image fusion,” Int. J. Image Data Fusion 1(1), 47–66 (2010). http://dx.doi.org/10.1080/19479830903562009

15. Q. Du et al., “On the performance evaluation of pan-sharpening techniques,” IEEE Geosci. Rem. Sens. Lett. 4(4), 518–522 (2007). http://dx.doi.org/10.1109/LGRS.2007.896328

16. A. Makarau, G. Palubinskas, and P. Reinartz, “Multiresolution image fusion: phase congruency for spatial consistency assessment,” in Proc. of ISPRS Technical Commission VII Symposium, 383–388 (2010).

17. A. Makarau, G. Palubinskas, and P. Reinartz, “Selection of numerical measures for pan-sharpening assessment,” in Proc. Int. Geoscience and Remote Sensing Symp., 2264–2267 (2012).

18. A. Makarau, G. Palubinskas, and P. Reinartz, “Analysis and selection of pan-sharpening assessment measures,” J. Appl. Rem. Sens. 6(1), 063548 (2012). http://dx.doi.org/10.1117/1.JRS.6.063548

19. S. Klonus, “Optimierung und Auswirkungen von ikonischen Bildfusionsverfahren zur Verbesserung von fernerkundlichen Auswerteverfahren,” Universität Osnabrück (2011).

20. B. Aiazzi et al., “MTF-tailored multiscale fusion of high-resolution MS and Pan imagery,” Photogramm. Eng. Rem. Sens. 72(5), 591–596 (2006).

21. T. Ranchin and L. Wald, “Fusion of high spatial and spectral resolution images: the ARSIS concept and its implementation,” Photogramm. Eng. Rem. Sens. 66(1), 49–61 (2000).

22. M. J. Canty, “IDL extensions for ENVI,” http://mcanty.homepage.t-online.de/software.html (March 2013).

Biography


Gintautas Palubinskas received MS and PhD degrees in mathematics from Vilnius University, Vilnius, Lithuania, in 1981, and from the Institute of Mathematics and Informatics (IMI), Vilnius, Lithuania, in 1991, respectively. His doctoral dissertation was on spatial image recognition. He was a research scientist at the IMI from 1981 to 1997. From 1993 to 1997, he was a visiting research scientist at the German Remote Sensing Data Center, DLR; the Department of Geography, Swansea University, Wales, U.K.; the Institute of Navigation, Stuttgart University, Germany; and the Max-Planck-Institute of Cognitive Neuroscience, Leipzig, Germany. Since 1997, he has been a research scientist at the German Remote Sensing Data Center (later the Remote Sensing Technology Institute), German Aerospace Center DLR. He is the author or coauthor of about 40 papers published in peer-reviewed journals. His current interests are image processing, image fusion, classification, change detection, traffic monitoring, and data fusion for optical and SAR remote sensing applications.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Gintautas Palubinskas "Fast, simple, and good pan-sharpening method," Journal of Applied Remote Sensing 7(1), 073526 (9 August 2013). https://doi.org/10.1117/1.JRS.7.073526