Open Access
8 October 2024

Alien species distribution mapping using satellite and drone fusion on river dikes along the Tone River, Japan
Mony Rith So, Shigehiro Yokota
Abstract

Mapping native grasses and alien species on river dikes is crucial for vegetation management, which influences native species conservation and river dike maintenance. We aimed to improve the distribution mapping of the alien species Solidago altissima and Sorghum halepense on river dikes along the Tone River by fusing satellite and drone images. The methodology fuses pan-sharpened WorldView-3 satellite images with drone images for one sample dike. An object-based approach segmented the images into objects and extracted statistical data to form datasets. Regression models constructed from the sample dike were used to predict fused satellite–drone datasets for the other dikes. A random forest classification model was then applied to map the distribution of the alien species. Our findings highlight the enhanced mapping accuracy achieved with the fused satellite–drone dataset. For Solidago altissima, the fused dataset achieved the highest overall accuracy (OA) of 98.39% and a kappa coefficient of 0.976. Likewise, for Sorghum halepense, the fused dataset achieved the highest OA of 97.78% and a kappa coefficient of 0.936, compared with the pan-sharpened and original satellite datasets. These outcomes underscore the contribution of fused satellite–drone imagery to alien species distribution mapping in the river dike environment.

1. Introduction

River dikes are substantial river structures constructed to safeguard people and assets from flooding. Such dikes are typically managed by local river offices established across Japan by the Ministry of Land, Infrastructure, Transport and Tourism.1 The herbaceous vegetation growing on these dikes plays a vital role in preventing soil erosion,2 which could otherwise lead to dike failure and severe flooding. Maintaining ideal vegetation conditions, such as the presence of native grass species and keeping the dikes visible for safety checks, is essential. However, the spread of invasive alien species complicates dike vegetation management. Solidago altissima and Sorghum halepense are the dominant alien species in this study. S. altissima, the tall goldenrod, dominates several plant communities via allelopathy and reduces biodiversity.3 S. halepense competes with crops and grass species in cropland, grassland, and urban environments, increasing invasion risk.4 Accurately identifying areas on the dikes where native grasses and alien species grow would streamline dike vegetation management. Compared with traditional field surveys, which are geographically limited and require significant time and resources, remote sensing offers a cost-effective, large-scale, and long-term monitoring solution, systematically tracking species distribution over vast areas.5 However, classifying herbaceous vegetation on river dikes using remote sensing is complicated; it demands high spatial resolution and accuracy to identify small, complex-shaped vegetation on the steep slopes of the dikes.6

Satellite remote sensing has become increasingly popular as a feasible alternative for mapping alien species. Unlike traditional methods, this approach can be applied to large and remote areas using satellite data and techniques.7 High spatial resolution multispectral sensors, such as IKONOS, WorldView-2/3, and QuickBird, advance broadband sensor technology for improved alien invasive plant (AIP) detection and mapping. These sensors distinguish among different types of vegetation better than coarser-resolution multispectral sensors.8 For instance, Ngubane et al.9 reported a higher mapping accuracy (91.67%) for "Bracken" fern in Durban, South Africa, using the high spatial resolution WorldView-2, compared with the medium spatial resolution SPOT-5 (72.22%). However, the large pixel sizes of many satellite sensors restrict their application to homogeneous landscapes, leading to the mixed pixel issue. The mixed pixel problem complicates the differentiation of invasive plants from a mixture of other plants and degrades classification precision in remotely sensed images.10 Conflicts arise when labeling mixed pixels that encompass the spectral responses of different land covers within one ground element, leading to confused classification. Therefore, innovative methods are required for large-scale mapping of invasive plants using satellite remote sensing. One image-enhancement technique involves sharpening the multispectral (MS) bands with a panchromatic (pan) band. This pan-sharpening process aims to increase spatial resolution and reduce mixed pixels.11 For instance, Ai et al.12 evaluated five pan-sharpening algorithms to enhance AIP mapping accuracy. Their study demonstrated that pan-sharpening could improve AIP mapping; however, their results still included misclassification errors of around 10%. Therefore, future research should focus on integrating various data sources to enhance AIP mapping.

Drones or unmanned aerial vehicles (UAVs) have been used to identify AIPs,13–15 as these vehicles can capture high-resolution images at low altitudes. For example, Akandil et al.16 developed a method using UAVs equipped with multispectral sensors to accurately detect and assess giant goldenrod coverage. Their study yielded a kappa coefficient of 0.902 and an overall accuracy (OA) of 92.12%. These findings highlight the effectiveness of UAV remote sensing in generating high-resolution maps with high accuracy. However, UAV flight operations are limited by legislation on air traffic and on public safety and privacy.17 In terms of area coverage, satellite remote sensing can cover a much larger area than drones. Thus, combining data from satellites and drones might allow for the acquisition of broad coverage from satellites and detailed information from drones. Additional data from drones might enhance wide-area satellite-based AIP mapping.18

The synergy between satellite and UAV data is essential for studying the dynamics of Earth's surfaces.19 Data fusion combines data from two or more sensors to create a new, more comprehensive dataset. Although it can involve any type of sensor, this study concerns merging data from satellites and UAVs to fully exploit the characteristics of each data source. Satellite and drone data are typically used separately, despite their complementary nature and strong potential synergy. Data fusion is a well-known technique for exploiting multisource synergy; however, in practice, the synergy between satellite and UAV data is application specific, not well established, and requires formalization.20 Moreover, a review by Royimani et al.8 highlighted several research gaps in long-term and large-scale AIP mapping, one of which is exploring the benefits of multisource datasets, or data fusion, for mapping AIPs. Data fusion and multisource datasets are increasingly used for mapping AIPs with remote sensing approaches. However, data fusion for AIP mapping has been dominated by the integration of multispectral satellite datasets with light detection and ranging (LiDAR) data rather than with drone data. Hence, the full potential of fusing satellite and drone datasets for optimal AIP detection and mapping has not yet been fully explored.

This study aims to improve the mapping accuracy of the AIPs S. altissima and S. halepense on river dikes along the Tone River using fused WorldView-3 satellite and drone imagery. The goal is to explore the potential of fusing data from high-resolution satellites and drones to accurately map AIPs on wide-scale river dikes and thereby aid effective vegetation management.

2. Materials and Methods

2.1. Study Area

The river dike environment (Fig. 1) is located in the middle reaches of the Tone River in Ibaraki prefecture, Japan. The environment contains nine river dikes. Dike 3, for example, is 1 km long, 40 m wide, and 7 m high.18 These dikes are crucial for protecting human lives and urban properties from potential flooding and overflow from the Tone River. Their structural maintenance is vital and requires efficient vegetation management along their slopes. Vegetation management on river dikes helps conserve native ecosystems and prevents soil erosion along the dike slopes. In addition, the dikes are surrounded by paddy fields, farmlands, and other agricultural zones.18 Thus, it is essential to prevent AIPs from spreading from the dikes to the surrounding areas, as invasive plants pose a threat to the crops and other agricultural products cultivated by local farmers.

Fig. 1

Study area of the river dike environment.


River dikes are predominantly covered by herbaceous vegetation comprising a variety of plant species. Among these are native grass species and alien species, of which S. altissima and S. halepense are the two dominant alien species18 (Fig. 2). S. altissima grows over 1 m high and has broad leaves that block sunlight from reaching native species and physically obstruct the visibility of the river dike.21 S. halepense has strong competitive abilities and allelopathic potential, affecting the growth and development of nearby plants, causing agricultural losses, and negatively impacting natural biodiversity.22

Fig. 2

Target alien species: (a) Solidago altissima and (b) Sorghum halepense.


2.2. Image Acquisition

Satellite imagery from WorldView-3, a high-resolution imaging and environmental monitoring satellite operated by Maxar Technologies (Westminster, Colorado, United States), was used in this study (Table 1). The WorldView-3 image was captured on August 22, 2018,18 closely aligned with the date of the drone aerial survey. The imagery provides high spatial resolution, with a 1.6-m resolution for the eight spectral bands and a 0.4-m resolution for the panchromatic band. In this study, only the panchromatic and red-green-blue (RGB) bands of the satellite image were used, as they were required for the pan-sharpening and fusion processes.

Table 1

Spectral range of WorldView-3 imagery.

Spectral range | Band number | Band name | Spectral band (nm)
Panchromatic band | — | — | 450 to 800
MS bands in visible near-infrared | 1 | Coastal blue | 400 to 450
 | 2 | Blue | 450 to 510
 | 3 | Green | 510 to 580
 | 4 | Yellow | 585 to 625
 | 5 | Red | 630 to 690
 | 6 | Red edge | 705 to 745
 | 7 | Near-IR1 | 770 to 895
 | 8 | Near-IR2 | 860 to 1040

In this study, a Phantom 4 Advanced drone equipped with an RGB camera was used to conduct aerial surveys over dikes 3 and 5 with eight ground control points. The drone flight conditions18 are detailed in Table 2. Dike 3 was designated as the sample dike because the image fusion between satellite and drone imagery was performed on it, whereas dike 5 served as the representative dike and was used to assess the mapping accuracy that represents the entire study area. The survey of the sample dike yielded 1210 original drone images, and the aerial survey of the representative dike yielded 1455 original drone images.

Table 2

Drone flight conditions.

Height | 35 m
Speed | 12.9 km/h
Angle | 90 deg
Image overlap | X 75%, Y 80%
Shutter interval | 2 s
Acquisition date | August 20, 2018

2.3. Image Analysis

2.3.1. Image pre-processing

Drone images were pre-processed using the Pix4Dmapper software. For the sample dike, RGB spectral bands with 0.1-m ground sampling distance (GSD) and an orthomosaic with 0.01-m GSD were generated. For the representative dike, only an orthomosaic with 0.01-m GSD was generated. In addition, WorldView-3 satellite bands and drone bands underwent pre-processing such as georeferencing, resampling, and pixel matching. The summary of pre-processed imagery used in this study is shown in Table 3.

Table 3

Summary of pre-processed imagery.

Target area | Pre-processed image | Information
Study area | WorldView-3 satellite | 1.6-m RGB bands; 0.4-m panchromatic band
Sample dike (dike 3) | Drone | 0.1-m GSD RGB bands; 0.01-m GSD orthomosaic
Representative dike (dike 5) | Drone | 0.01-m GSD orthomosaic
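For readers who script this step, the sketch below shows one way the resampling and pixel-matching step could be performed with the rasterio package; the study carried out pre-processing in GIS software, so the package choice and file names here are assumptions.

```python
import numpy as np
import rasterio
from rasterio.warp import Resampling, reproject

# Hypothetical file names. Resample a pan-sharpened satellite band onto the drone
# band's 0.1-m grid so the two rasters share the same extent and pixel alignment.
with rasterio.open("pansharpened_red_0p4m.tif") as src, \
        rasterio.open("drone_red_0p1m.tif") as ref:
    matched = np.zeros((ref.height, ref.width), dtype="float32")
    reproject(
        source=rasterio.band(src, 1),
        destination=matched,
        dst_transform=ref.transform,
        dst_crs=ref.crs,
        resampling=Resampling.bilinear,
    )
    profile = ref.profile.copy()
    profile.update(count=1, dtype="float32")

with rasterio.open("pansharpened_red_0p1m.tif", "w", **profile) as dst:
    dst.write(matched, 1)
```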

2.3.2. Image pan-sharpening

Pan-sharpening is a technique used to generate high-resolution multispectral images by merging low-resolution multispectral images with panchromatic ones.23 The process fuses a high-resolution panchromatic raster band with a lower-resolution multiband raster dataset; the outcome is a multiband raster dataset that matches the resolution of the panchromatic raster in the overlapping areas. Numerous pan-sharpening methods have been explored over the past several years. We used the Gram–Schmidt pan-sharpen method to enhance the 1.6-m resolution RGB bands with the 0.4-m panchromatic band of the WorldView-3 satellite image. Since its introduction in 1998, the Gram–Schmidt pan-sharpen method has gained popularity as one of the leading algorithms for pan-sharpening multispectral imagery; it excels in enhancing image sharpness and reducing color distortion, outperforming most other pan-sharpening methods.24
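Purely as a conceptual illustration, the following Python sketch implements a simplified injection-gain variant of Gram–Schmidt-style sharpening on NumPy arrays; it is not the exact Gram–Schmidt implementation used in this study (which was run in standard remote sensing software), and the array shapes and band ordering are assumptions.

```python
import numpy as np

def gs_like_pansharpen(ms, pan):
    """Simplified Gram-Schmidt-style sharpening sketch.
    ms  : array (bands, rows, cols), MS bands already resampled to the pan grid.
    pan : array (rows, cols), panchromatic band on the same grid.
    """
    ms = ms.astype("float64")
    pan = pan.astype("float64")
    # Simulate a low-resolution pan band as the mean of the MS bands.
    sim_pan = ms.mean(axis=0)
    # Match the real pan band's mean and variance to the simulated band so that
    # only spatial detail (not radiometry) is injected.
    pan_matched = (pan - pan.mean()) / pan.std() * sim_pan.std() + sim_pan.mean()
    # Inject the spatial detail into each band with a covariance-based gain.
    detail = pan_matched - sim_pan
    out = np.empty_like(ms)
    for k in range(ms.shape[0]):
        gain = np.cov(ms[k].ravel(), sim_pan.ravel())[0, 1] / sim_pan.var()
        out[k] = ms[k] + gain * detail
    return out
```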

2.3.3. Image fusion

Image fusion was performed band by band between the pan-sharpened RGB bands of the satellite image and the RGB bands of the drone image on the sample dike. The "+" operator of the raster calculator in ArcGIS Pro was used to merge each corresponding band of the 0.4-m RGB pan-sharpened satellite image and the 0.1-m GSD RGB drone image. The statistical range of the pan-sharpened satellite rasters was 0 to 2047 (11 bit), whereas that of the drone rasters was 0 to 255 (8 bit).

A fused band is created using the expression denoted as Eq. (1) and illustrated in Fig. 3. In this equation, the "+" operator signifies the pixel-wise addition of values from two raster datasets. The raster calculator takes the pixel values from the band of the pan-sharpened satellite image and adds them to the corresponding pixel values from the band of the drone image. The result of this addition is stored in a new raster layer, termed the "fused band." Consequently, each pixel in the "fused band" raster has a value equal to the sum of the corresponding pixel values from the pan-sharpened satellite and drone bands.

Fig. 3

Band fusion.


Mathematically, for each pixel (i,j), the operation is represented as follows:

Eq. (1)

Pan-sharpened satellite Band(i,j) + Drone Band(i,j) = Fused Band(i,j).
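The study applied Eq. (1) with the ArcGIS Pro raster calculator; a scripted equivalent of the same pixel-wise addition, sketched here with the rasterio package and hypothetical file names, would be:

```python
import rasterio

# Both rasters must already be co-registered on the same 0.1-m grid.
with rasterio.open("pansharpened_red_0p1m.tif") as sat, \
        rasterio.open("drone_red_0p1m.tif") as drn:
    fused = sat.read(1).astype("float32") + drn.read(1).astype("float32")  # Eq. (1)
    profile = sat.profile.copy()
    profile.update(dtype="float32")

with rasterio.open("fused_red.tif", "w", **profile) as dst:
    dst.write(fused, 1)
```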

2.3.4. Vegetation index image

Vegetation indices, which are linear combinations of bands, are highly effective in mapping AIPs.25,26 In this study, we used the Green–Red Vegetation Index (GRVI) as the vegetation index, as only RGB bands were used.27 The GRVI was computed according to Eq. (2) and was independently extracted from the original satellite, pan-sharpened satellite, and fused satellite–drone images.

Eq. (2)

GRVI = (Green band − Red band) / (Green band + Red band).
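A minimal NumPy sketch of Eq. (2) is given below; the guard for pixels where both bands are zero is an assumption added for numerical safety, not part of the original formula.

```python
import numpy as np

def grvi(green, red):
    """Green-Red Vegetation Index, Eq. (2), computed per pixel."""
    green = green.astype("float32")
    red = red.astype("float32")
    denom = green + red
    with np.errstate(divide="ignore", invalid="ignore"):
        index = (green - red) / denom
    # Assign 0 where both bands are zero (undefined ratio).
    return np.where(denom == 0, 0.0, index)
```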

2.3.5. Object-based image analysis

Beyond conventional pixel-based classification techniques, object-based image analysis (OBIA) categorizes imagery into objects using perception-based and meaningful knowledge.28 OBIA presents several benefits, including multiparameter classification, improved performance with high-resolution data, and sharper images that facilitate advanced image classification.29,30 Therefore, this study adopted OBIA for the classification of alien species distribution.

Using the Definiens Professional 5.0 software, images were segmented into objects according to the configuration18 detailed in Table 4. The choice of scale parameter in the segmentation process is a key factor determining the quality and accuracy of the results in OBIA; it influences the size of the image segments, the landscape representation, and the success of the classification process. In this study, a scale parameter of 5.0 was selected, which resulted in an average object size of 19.28 m². According to Table 5, a scale parameter of 5.0 provides a balance between detail and generalization: it is more detailed than scale parameters of 10.0 and 20.0, which result in larger objects, but less detailed than a scale parameter of 1.0, which produces the smallest objects, equivalent to pixel size. Moreover, the vegetation patches in the study area were approximately 20 m² in size on average; thus, a scale parameter of 5.0 was considered suitable for this study.

Table 4

Image segmentation setting.

Segmentation mode | Multiresolution segmentation
Scale parameter | 5.0
Color | 0.9
Shape | 0.1
Compactness | 0.5
Smoothness | 0.5

Table 5

Scale parameters comparison.

Scale parameter | Average size of objects (m²)
1.0 | 2.56
5.0 | 19.28
10.0 | 86.13
20.0 | 348.80
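The multiresolution segmentation itself was run in Definiens Professional and has no direct open-source equivalent; purely as a conceptual stand-in, the sketch below creates image objects from a fused RGB raster with scikit-image's Felzenszwalb algorithm, whose scale argument loosely plays the role of the scale parameter. The file name and parameter values are assumptions, and this is not the segmentation algorithm used in the study.

```python
import rasterio
from skimage.segmentation import felzenszwalb

# Hypothetical fused RGB raster of the sample dike.
with rasterio.open("fused_rgb.tif") as src:
    img = src.read([1, 2, 3]).transpose(1, 2, 0)  # (rows, cols, 3)

# Larger "scale" -> larger objects, analogous in spirit to the scale parameter.
objects = felzenszwalb(img, scale=200, sigma=0.8, min_size=50)
print("number of objects:", objects.max() + 1)
```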

After the image segmentation process, statistical data were extracted from each band across all images, for each object of the segmented image. The data extraction used the zonal statistics function in Quantum GIS (QGIS). The extracted statistics were the mean, median, standard deviation (stdev), minimum (min), and maximum (max) values. Following data extraction, datasets were prepared so that each image of each dike was represented by its own dataset; a sample dataset is shown in Table 6. Consequently, the sample dike (dike 3) was represented by three datasets: the original satellite, pan-sharpened satellite, and fused satellite–drone datasets. In contrast, the other river dikes were represented by only two datasets: the original satellite and the pan-sharpened satellite datasets.

Table 6

Sample dataset.

Object ID | Red band | Green band | Blue band | GRVI band
First object | Mean, median, stdev, min, and max | Mean, median, stdev, min, and max | Mean, median, stdev, min, and max | Mean, median, stdev, min, and max
… | … | … | … | …
Last object | Mean, median, stdev, min, and max | Mean, median, stdev, min, and max | Mean, median, stdev, min, and max | Mean, median, stdev, min, and max
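Scripted outside QGIS, the per-object statistics of Table 6 could be assembled as sketched below with the rasterstats and pandas packages; the package choice, layer and file names, and column naming are assumptions, not the tooling used in the study.

```python
import pandas as pd
from rasterstats import zonal_stats

# Hypothetical inputs: segmented objects as polygons, one GeoTIFF per band.
bands = {"red": "fused_red.tif", "green": "fused_green.tif",
         "blue": "fused_blue.tif", "grvi": "fused_grvi.tif"}
stats = ["mean", "median", "std", "min", "max"]

columns = {}
for name, path in bands.items():
    zs = zonal_stats("segments.shp", path, stats=stats)  # one dict per object
    for s in stats:
        columns[f"{name}_{s}"] = [z[s] for z in zs]

dataset = pd.DataFrame(columns)  # one row per object, as in Table 6
dataset.to_csv("sample_dike_dataset.csv", index=False)
```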

2.4. Ground Truth

Drones play three important roles: explanation, validation, and completion of satellite data. The validation role involves using drone data as "ground truth" (or "drone truth"), relying on the capability of drones to clarify satellite data.20 In this study, the ground truth data were based on the 0.01-m GSD drone orthomosaics,18 which were acquired for both the sample dike and the representative dike. Using these orthomosaics, the location and proportion of the target AIPs were determined by visually digitizing representative polygons, as depicted in Fig. 4. These polygons were then overlaid with the segmented image to determine the percentage of target alien species within each object. The percentages within each object were classified into five categories,18 as shown in Table 7. These classes served as a benchmark for supervising the classification models and assessing mapping accuracy.

Fig. 4

Alien species location on sample dike (dike 3) and representative dike (dike 5).


Table 7

Classes of alien species.

Class of target alien species | Description of class
Class 0 | 0% of species
Class 1 | 0% to 25% of species
Class 2 | 25% to 50% of species
Class 3 | 50% to 75% of species
Class 4 | 75% to 100% of species
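A geopandas sketch of this overlay and class assignment is given below; the layer names, the object_id field, and the handling of class boundaries are assumptions made for illustration rather than the exact GIS workflow used in the study.

```python
import geopandas as gpd

def species_class(pct):
    """Map an object's cover percentage to the classes of Table 7.
    Boundary handling (<= vs <) is an assumption."""
    if pct == 0:
        return 0
    if pct <= 25:
        return 1
    if pct <= 50:
        return 2
    if pct <= 75:
        return 3
    return 4

segments = gpd.read_file("segments.shp")            # objects from segmentation
species = gpd.read_file("solidago_polygons.shp")    # digitized species polygons

# Area of each object covered by the target species.
inter = gpd.overlay(segments[["object_id", "geometry"]], species, how="intersection")
covered = inter.dissolve(by="object_id").area

segments = segments.set_index("object_id")
segments["pct"] = covered.reindex(segments.index).fillna(0) / segments.area * 100
segments["gt_class"] = segments["pct"].apply(species_class)
```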

2.5. Data Analysis

2.5.1. Regression analysis

This study used simple linear regression to construct models relating the pan-sharpened dataset to the fused satellite–drone dataset of the sample dike. Linear regression was used because it provides a direct relationship between individual variables in the datasets based on the observed data. Although only the sample dike has a fused satellite–drone dataset, the goal of these regression models was to predict the fused satellite–drone dataset for the other dikes in the study area. The effectiveness of the regression models was assessed using the coefficient of determination (R²), which measures the predictive power of a statistical model. Given that R² values range between 0 and 1, any variables whose regression models had an R² value below 0.5 were excluded from subsequent steps.
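A scikit-learn sketch of this step is shown below; the study used its own statistical workflow, so the DataFrame names (pan_df, fused_df, other_pan_df) and the per-variable model structure are assumptions.

```python
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# pan_df       : pan-sharpened dataset of the sample dike (one row per object)
# fused_df     : fused satellite-drone dataset of the same objects
# other_pan_df : pan-sharpened dataset of another dike (all hypothetical DataFrames)
models, kept = {}, []
for col in fused_df.columns:
    x = pan_df[[col]].to_numpy()
    y = fused_df[col].to_numpy()
    reg = LinearRegression().fit(x, y)
    if r2_score(y, reg.predict(x)) >= 0.5:   # drop variables with R2 < 0.5
        models[col] = reg
        kept.append(col)

# Predict the fused dataset of another dike from its pan-sharpened dataset.
predicted_fused = {col: models[col].predict(other_pan_df[[col]]) for col in kept}
```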

2.5.2. Classification analysis

In the remote sensing methodology for identifying invasive species, the choice of classifiers is essential. These classifiers should be capable of leveraging the information contained in the input datasets, be tailored to the objectives of the study, and provide optimal computational efficiency and processing time for classification tasks.8 A review on invasive species identification found that the random forest (RF) algorithm has consistently outperformed other classifiers in terms of statistical accuracy31 and classification speed.32 The RF algorithm works by creating a multitude of decision trees from a random subset of the training data. Each decision tree independently influences the final decision, and the algorithm combines the votes from each tree to arrive at the final prediction or classification.33

In this study, the RF classification algorithm was used for the classification analysis. Classification models were independently constructed from the sample dike using three different datasets: the original satellite, pan-sharpened satellite, and fused satellite–drone datasets. The ground truth of the sample dike was used to supervise these models. The classification analysis was performed using the Orange data mining software. The RF classification models were developed with the parameter settings18 detailed in Table 8. The models were then evaluated using several metrics, including the Matthews correlation coefficient, recall, precision, F1 score, classification accuracy, and area under the receiver operating characteristic (ROC) curve (AUC), and their training performance was compared based on these metrics.

Table 8

Parameter settings within the random forest classification model.

Model building | Number of trees to build | 100
Tree growth | Specific maximum depth | 10
 | Minimum leaf node size | 1
 | Number of features to use for splitting | Auto
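The models were built in Orange; an approximately equivalent scikit-learn sketch with the Table 8 settings is shown below. The mapping of Orange's "Auto" feature setting to max_features="sqrt", the weighted metric averaging, and the input arrays X_train and y_train are assumptions.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import (accuracy_score, f1_score, matthews_corrcoef,
                             precision_score, recall_score)

rf = RandomForestClassifier(
    n_estimators=100,     # number of trees to build (Table 8)
    max_depth=10,         # specific maximum depth
    min_samples_leaf=1,   # minimum leaf node size
    max_features="sqrt",  # assumed equivalent of "Auto"
    random_state=0,
)
rf.fit(X_train, y_train)  # X_train: object statistics; y_train: classes 0-4

y_pred = rf.predict(X_train)
print("CA  :", accuracy_score(y_train, y_pred))
print("F1  :", f1_score(y_train, y_pred, average="weighted"))
print("Prec:", precision_score(y_train, y_pred, average="weighted"))
print("Rec :", recall_score(y_train, y_pred, average="weighted"))
print("MCC :", matthews_corrcoef(y_train, y_pred))
```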

2.5.3. Alien species mapping

The AIP mapping process was performed by applying the RF classification models developed from the sample dike to the prepared datasets (original satellite, pan-sharpened satellite, and fused satellite–drone) of the other river dikes within the study area. Following the mapping process, the mapping accuracy was assessed on the representative dike by comparing the alien species mapping results with the ground truth data; the accuracy assessment and comparison were based on the overall accuracy and the kappa coefficient. Based on these accuracy indices, we compared the mapping performance among the original satellite, pan-sharpened satellite, and fused satellite–drone datasets. Finally, using ArcGIS Pro 3.2, maps predicting the distribution of the target alien species in the study area were produced.
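Overall accuracy and the kappa coefficient can be computed as sketched below with scikit-learn; y_true and y_map are hypothetical arrays holding the ground-truth and mapped classes of the representative dike's objects.

```python
from sklearn.metrics import accuracy_score, cohen_kappa_score

oa = accuracy_score(y_true, y_map) * 100   # overall accuracy in percent
kappa = cohen_kappa_score(y_true, y_map)   # kappa coefficient
print(f"OA = {oa:.2f}%, kappa = {kappa:.3f}")
```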

3. Results

The outcomes of the regression analysis are depicted in Fig. 5. The results revealed that six variables had regression models with R² values below 0.5; consequently, these variables were omitted from the following steps. The regression models of the remaining 14 variables were used to predict the fused satellite–drone datasets of the other dikes by applying these models to their pan-sharpened satellite datasets.

Fig. 5

Regression results (n=1263).


Three classification models (one per dataset) were developed from the sample dike for each target alien species, namely, S. altissima and S. halepense. For S. altissima (Table 9), the training results demonstrated that the fused satellite–drone dataset achieved the highest classification performance on almost all evaluation metrics, followed by the pan-sharpened satellite and the original satellite datasets. A similar pattern was observed for S. halepense (Table 10), where the fused satellite–drone dataset again achieved the highest classification performance on almost all evaluation metrics, followed by the pan-sharpened satellite and the original satellite datasets. These results indicate that the fused satellite–drone dataset consistently outperforms the other two datasets in classifying the distribution of both S. altissima and S. halepense.

Table 9

Evaluation of classification models for Solidago altissima species classification.

Evaluation metric | Fused satellite–drone | Pan-sharpened satellite | Original satellite
Area under ROC curve | 1.000 | 1.000 | 1.000
Classification accuracy | 0.993 | 0.989 | 0.972
F1 | 0.993 | 0.989 | 0.971
Precision | 0.993 | 0.989 | 0.973
Recall | 0.993 | 0.989 | 0.972
Matthews correlation coefficient | 0.989 | 0.981 | 0.954

Table 10

Evaluation of classification models for Sorghum halepense species classification.

Evaluation metric | Fused satellite–drone | Pan-sharpened satellite | Original satellite
Area under ROC curve | 1.000 | 1.000 | 1.000
Classification accuracy | 0.988 | 0.986 | 0.983
F1 | 0.988 | 0.985 | 0.982
Precision | 0.988 | 0.987 | 0.983
Recall | 0.988 | 0.986 | 0.983
Matthews correlation coefficient | 0.908 | 0.893 | 0.868

After the mapping was completed, the mapping accuracy was evaluated on the representative dike by comparing the alien species mapping results with the ground truth data. For S. altissima (Table 11), the mapping results derived from the fused satellite–drone dataset matched the ground truth most closely across all classes. Similarly, for S. halepense (Table 12), the mapping results obtained from the fused satellite–drone dataset were the closest to the ground truth across all classes, although class 0 was predominantly represented. These comparisons indicate that the fused satellite–drone dataset demonstrates superior performance in AIP mapping.

Table 11

Comparison of Solidago altissima species mapping results on representative dike (n=3602).

Class of species | Ground truth, n (%) | Fused satellite–drone, n (%) | Pan-sharpened satellite, n (%) | Original satellite, n (%)
Class 0 | 1842 (51.14) | 1900 (52.75) | 1996 (55.41) | 2029 (56.33)
Class 1 | 685 (19.02) | 670 (18.60) | 680 (18.88) | 688 (19.10)
Class 2 | 397 (11.02) | 381 (10.58) | 349 (9.69) | 361 (10.02)
Class 3 | 394 (10.94) | 373 (10.36) | 358 (9.94) | 322 (8.94)
Class 4 | 284 (7.88) | 278 (7.72) | 219 (6.08) | 202 (5.61)

Table 12

Comparison of Sorghum halepense species mapping results on representative dike (n = 3602).

Class of species | Ground truth, n (%) | Fused satellite–drone, n (%) | Pan-sharpened satellite, n (%) | Original satellite, n (%)
Class 0 | 2833 (78.65) | 2912 (80.84) | 2961 (82.20) | 2974 (82.57)
Class 1 | 392 (10.88) | 376 (10.44) | 361 (10.02) | 351 (9.74)
Class 2 | 188 (5.22) | 160 (4.44) | 158 (4.39) | 155 (4.30)
Class 3 | 114 (3.16) | 92 (2.55) | 76 (2.11) | 79 (2.19)
Class 4 | 75 (2.08) | 62 (1.72) | 46 (1.28) | 43 (1.19)

The accuracy of the target alien species mapping was determined based on OA and the kappa coefficient. The results highlighted that the use of the fused satellite–drone dataset to predict the distribution of S. altissima and S. halepense species resulted in a high mapping accuracy. For the mapping of S. altissima species (refer to Fig. 6), the use of the fused satellite–drone dataset achieved the highest OA of 98.39% and a kappa coefficient of 0.976. This was 9% higher in OA and 0.14 higher in the kappa coefficient compared with the pan-sharpened satellite dataset and 11% higher in OA and 0.17 higher in the kappa coefficient compared with the original satellite dataset. In the case of mapping S. halepense species (see Fig. 7), the use of the fused satellite–drone dataset achieved the highest OA of 97.78% and a kappa coefficient of 0.936. This was 4% higher in OA and 0.12 higher in the kappa coefficient compared with the pan-sharpened satellite dataset and 5% higher in OA and 0.14 higher in the kappa coefficient compared with the original satellite dataset. These results further validate the effectiveness of the fused satellite–drone dataset in improving AIP mapping accuracy.

Fig. 6

Accuracy assessment of Solidago altissima species mapping on representative dike.


Fig. 7

Accuracy assessment of Sorghum halepense species mapping on representative dike.


Finally, based on mapping results from the fused satellite–drone datasets, ArcGIS Pro 3.2 was used to create predictive maps to provide a visual representation of the distribution of the target AIP, S. altissima (refer to Fig. 8) and S. halepense (see Fig. 9). These maps highlight the large coverage of the S. altissima species within the study area, whereas the growth of the S. halepense species was relatively limited. These maps can be crucial for environmental management strategies, as they provide insights into the spatial distribution of these AIPs. This information can guide efforts to control the spread of these species and mitigate their potential impacts on local ecosystems.

Fig. 8

Predicted map of Solidago altissima species distribution.


Fig. 9

Predicted map of Sorghum halepense species distribution.


4. Discussion

The improved mapping accuracy achieved using the fused satellite–drone dataset to predict the coverage of S. altissima and S. halepense can be attributed to the complementary strengths of high-resolution satellite and drone imagery. WorldView-3 satellite imagery provides a broad landscape overview, capturing large-scale patterns and trends; however, its resolution may not be sufficient to detect fine-scale features, such as individual plant species on a river dike. In contrast, drone imagery, with its 0.1-m GSD, can capture these fine-scale features in notable detail, but owing to the limited flight operation of drones, it may not be feasible to use drone imagery to map vast river dike environments. By fusing these two types of imagery, this study leverages both the large-scale coverage of satellite imagery and the fine-scale detail of drone imagery. The fusion results in a dataset that is both comprehensive in coverage and rich in detail, thereby enhancing the accuracy of mapping the coverage of S. altissima and S. halepense. The pan-sharpened satellite dataset, although an improvement over the original satellite dataset in terms of resolution, still lacks the fine-scale detail provided by drone imagery; as such, the fused satellite–drone dataset outperforms the pan-sharpened satellite dataset. The original satellite dataset, with its lower resolution, is the least capable of accurately mapping the coverage of AIPs, resulting in the lowest overall accuracy.

A previous study34 acknowledged the cost- and time-effectiveness of remote sensing as a technique for identifying alien vegetation. This study goes further by not only using high-resolution satellite imagery but also fusing it with drone imagery to improve AIP mapping accuracy. Fusing these data sources provides a comprehensive and detailed dataset, which leads to higher overall accuracy. Moreover, this study contributes to the growing body of literature35 supporting the use of advanced remote sensing techniques and machine learning models to improve the mapping accuracy of alien species distribution; the successful application of the RF classification model in this study adds to this narrative.

The research method exhibits strengths that contribute to the findings of this study. Ai et al.12 described a method to improve the mapping accuracy of an invasive plant (Spartina alterniflora) based on the integration of pan-sharpening and classifier ensemble techniques. The parallel observations between their study and ours underscore the consistent benefits of pan-sharpening techniques in remote sensing applications, particularly for discriminating among closely related species. Moreover, the findings of Elkind et al.,36 who underscored the importance of integrating DigitalGlobe WorldView-2 satellite imagery with UAV data for mapping non-native buffelgrass (Pennisetum ciliare), are closely related to the findings of this study: by leveraging the capabilities of both data sources, their RF classification models for buffelgrass detection over larger areas achieved an average overall accuracy of 93%. Furthermore, OBIA methods improve classification performance when used with machine learning algorithms.37 Therefore, OBIA provided a systematic and effective way to process and analyze the high-resolution satellite and drone imagery, leading to accurate AIP mapping in this study. In addition, RF was successfully applied in this study to map the distribution of alien species on river dikes. Meyer et al.38 used five supervised classification algorithms to estimate the total area covered by the invasive dune species Carpobrotus edulis; among these algorithms, RF emerged as the most effective, demonstrating superior performance across various accuracy metrics. This highlights the efficacy of RF in accurately identifying and classifying invasive species, even in complex environments such as dune ecosystems. Therefore, our findings contribute to the growing body of knowledge on the application of RF in ecological monitoring and invasive species management.

In summary, this study not only aligns with existing studies but also provides new insights and advancements in remote sensing for mapping alien species distribution. It highlights the potential of fusing high-resolution satellite and drone imagery to enhance mapping accuracy, thereby contributing to effective vegetation management on river dikes, particularly the conservation of native species and the control of invasive alien species. However, in terms of generalization, the methodology and findings of this study are so far limited to the river dikes along the Tone River and to two alien species, S. altissima and S. halepense. The effectiveness and reliability of the methodology may vary in different contexts, landscapes, and species.

5. Conclusion

We demonstrated the significant potential of fusing high-resolution satellite and drone imagery for mapping the distribution of the AIPs S. altissima and S. halepense on river dikes along the Tone River. We used a combination of processes, including pan-sharpening of WorldView-3 satellite images, fusion with drone images, object-based image analysis, regression analysis, and RF classification. The key findings underscore the significant contribution of these techniques to enhancing the mapping accuracy of AIP distribution. The fused satellite–drone dataset outperformed the pan-sharpened satellite dataset and the original satellite dataset in terms of overall accuracy and kappa coefficient for both species. This indicates that the fusion of high-resolution satellite and drone imagery provides more accurate and reliable data for mapping the distribution of alien species on river dikes. These findings have important implications for the management of herbaceous vegetation on river dikes: with improved mapping accuracy, it is possible to obtain a comprehensive understanding of the distribution of native grasses and AIPs along expansive river dikes, which can significantly influence the conservation of native species and the maintenance of river dike visibility for safety inspection.

Disclosures

The authors have no relevant financial interests in the paper and no other potential conflicts of interest.

Code and Data Availability

All data in support of the findings of this paper are available upon request from the corresponding author.

Acknowledgments

This study was supported by Japan Society for the Promotion of Science Grants-in-Aid for Scientific Research (JSPS KAKENHI); Grant No. JP20H03015.

References

1. Y. Sasaki, S. Kano and O. Matsuo, "Research and practices on remedial measures for river dikes against soil liquefaction," J. Jpn. Assoc. Earthquake Eng., 4(3), 312–335 (2004). https://doi.org/10.5610/jaee.4.3_312

2. Z. Bátori et al., "River dikes in agricultural landscapes: the importance of secondary habitats in maintaining landscape-scale diversity," Wetlands, 36(2), 251–264 (2016). https://doi.org/10.1007/s13157-016-0734-y

3. H. Rim, M. Lee and U. Song, "Mowing inhibits the invasion of the alien species Solidago altissima and is an effective management strategy," Manage. Biol. Invasions, 14(1), 63–79 (2023). https://doi.org/10.3391/mbi.2023.14.1.03

4. M. Yang et al., "Reconstructed global invasion and spatio-temporal distribution pattern dynamics of Sorghum halepense under climate and land-use change," Plants, 12(17), 3128 (2023). https://doi.org/10.3390/plants12173128

5. K. R. L. Saranya, K. V. Satish and C. S. Reddy, "Remote sensing enabled essential biodiversity variables for invasive alien species management: towards the development of spatial decision support system," Biol. Invasions, 26(4), 943–951 (2024). https://doi.org/10.1007/s10530-023-03240-y

6. N. Miura et al., "Classification of grass and forb species on riverdike using UAV LiDAR-based structural indices," Int. J. Autom. Technol., 15(3), 268–273 (2021). https://doi.org/10.20965/ijat.2021.p0268

7. C. Y. Huang and G. P. Asner, "Applications of remote sensing to alien invasive plant studies," Sensors, 9(6), 4869–4889 (2009). https://doi.org/10.3390/s90604869

8. L. Royimani et al., "Advancements in satellite remote sensing for mapping and monitoring of alien invasive plant species (AIPs)," Phys. Chem. Earth, 112, 237–245 (2019). https://doi.org/10.1016/j.pce.2018.12.004

9. Z. Ngubane et al., "Assessment of the contribution of WorldView-2 strategically positioned bands in Bracken fern (Pteridium aquilinum (L.) Kuhn) mapping," South Afr. J. Geomat., 3(2), 210 (2014). https://doi.org/10.4314/sajg.v3i2.7

10. Y. Zhang et al., "Coastal wetland vegetation classification with a Landsat Thematic Mapper image," Int. J. Remote Sens., 32(2), 545–561 (2011). https://doi.org/10.1080/01431160903475241

11. S. Ashraf, L. Brabyn and B. J. Hicks, "Image data fusion for the remote sensing of freshwater environments," Appl. Geogr., 32(2), 619–628 (2012). https://doi.org/10.1016/j.apgeog.2011.07.010

12. J. Ai et al., "Integrating pan-sharpening and classifier ensemble techniques to map an invasive plant (Spartina alterniflora) in an estuarine wetland using Landsat 8 imagery," J. Appl. Remote Sens., 10(2), 26001 (2016). https://doi.org/10.1117/1.JRS.10.026001

13. K. Anderson and K. J. Gaston, "Lightweight unmanned aerial vehicles will revolutionize spatial ecology," Front. Ecol. Environ., 11(3), 138–146 (2013). https://doi.org/10.1890/120150

14. M. Bryson et al., "Cost-effective mapping using unmanned aerial vehicles in ecology monitoring applications," Springer Tracts Adv. Robot., 79, 509–523 (2014). https://doi.org/10.1007/978-3-642-28572-1_35

15. A. Rango et al., "Using unmanned aerial vehicles for rangelands: current applications and future potentials," (2006).

16. C. Akandil et al., "Mapping invasive giant goldenrod (Solidago gigantea) with multispectral images acquired by unmanned aerial vehicle," J. Digit. Landscape Archit., 2021(6), 245–256 (2021). https://doi.org/10.14627/537705021

17. A. P. Cracknell, "UAVs: regulations and law enforcement," Int. J. Remote Sens., 38(8–10), 3054–3067 (2017). https://doi.org/10.1080/01431161.2017.1302115

18. M. R. So, S. Yokota and N. Miura, "Data fusion of drone and satellite imagery for alien species classification on riverdike," in 44th Asian Conf. Remote Sens. (2023).

19. C. Kuenzer, S. Dech and W. Wagner, "Remote sensing and digital image processing: remote sensing time series," http://www.springer.com/series/6477 (2015).

20. E. Alvarez-Vanhard, T. Corpetti and T. Houet, "UAV & satellite synergies for optical remote sensing applications: a literature review," Sci. Remote Sens., 3, 100019 (2021). https://doi.org/10.1016/j.srs.2021.100019

21. S. Yamada and M. Nemoto, "Effects of bare-ground revegetation techniques using Imperata cylindrica on changes in the plant cover and species richness during early succession," Open J. Ecol., 6(8), 471–483 (2016). https://doi.org/10.4236/oje.2016.68045

22. M. Al Sakran et al., "Effect of aqueous extract of Sorghum halepense (L.) Pers. on germination and growth of some weed species," Int. J. Sci. Res. Publ., 11(1), 404–408 (2020). https://doi.org/10.29322/ijsrp.11.01.2021.p10946

23. J. Ma et al., "Pan-GAN: an unsupervised pan-sharpening method for remote sensing image fusion," Inf. Fusion, 62, 110–120 (2020). https://doi.org/10.1016/j.inffus.2020.04.006

24. T. Maurer, "How to pan-sharpen images using the Gram-Schmidt pan-sharpen method—a recipe," Remote Sens. Spatial Inf. Sci., XL-1/W1, 239–244 (2013). https://doi.org/10.5194/isprsarchives-XL-1-W1-239-2013

25. M. T. Gómez-Casero et al., "Spectral discrimination of wild oat and canary grass in wheat fields for less herbicide application," Agron. Sustain. Dev., 30(3), 689–699 (2010). https://doi.org/10.1051/agro/2009052

26. L. W. Lass et al., "A review of remote sensing of invasive weeds and example of the early detection of spotted knapweed (Centaurea maculosa) and babysbreath (Gypsophila paniculata) with a hyperspectral sensor," Weed Sci., 53, 242–251 (2005). https://doi.org/10.1614/WS-04-044R2

27. T. Motohka et al., "Applicability of Green-Red Vegetation Index for remote sensing of vegetation phenology," Remote Sens., 2(10), 2369–2387 (2010). https://doi.org/10.3390/rs2102369

28. T. Blaschke, "Object based image analysis for remote sensing," ISPRS J. Photogramm. Remote Sens., 65(1), 2–16 (2010). https://doi.org/10.1016/j.isprsjprs.2009.06.004

29. R. L. Cornett and E. G. Ernenwein, "Object-based image analysis of ground-penetrating radar data for archaic hearths," Remote Sens., 12(16), 2539 (2020). https://doi.org/10.3390/rs12162539

30. I. Dronova, "Object-based image analysis in wetland research: a review," Remote Sens., 7(5), 6380–6413 (2015). https://doi.org/10.3390/rs70506380

31. R. L. Lawrence, S. D. Wood and R. L. Sheley, "Mapping invasive plants using hyperspectral imagery and Breiman Cutler classifications (randomForest)," Remote Sens. Environ., 100(3), 356–362 (2006). https://doi.org/10.1016/j.rse.2005.10.014

32. R. Sluiter and E. J. Pebesma, "Comparing techniques for vegetation classification using multi- and hyperspectral images and ancillary environmental data," Int. J. Remote Sens., 31(23), 6143–6161 (2010). https://doi.org/10.1080/01431160903401379

33. L. Breiman, "Random forests," Mach. Learn., 45(1), 5–32 (2001). https://doi.org/10.1023/A:1010933404324

34. L. Rowlinson, M. Summerton and F. Ahmed, "Comparison of remote sensing techniques for alien vegetation mapping," in Proc. 1998 South African Symp. Commun. Signal Process. (COMSIG '98), 475–476 (1998).

35. A. S. Vaz et al., "The many roles of remote sensing in invasion science," Front. Ecol. Evol., 7, 370 (2019). https://doi.org/10.3389/fevo.2019.00370

36. K. Elkind et al., "Invasive buffelgrass detection using high-resolution satellite and UAV imagery on Google Earth Engine," Remote Sens. Ecol. Conserv., 5(4), 318–331 (2019). https://doi.org/10.1002/rse2.116

37. O. S. Azeez et al., "Integration of object-based image analysis and convolutional neural network for the classification of high-resolution satellite image: a comparative assessment," Appl. Sci., 12(21), 10890 (2022). https://doi.org/10.3390/app122110890

38. M. de F. Meyer et al., "Application of a multispectral UAS to assess the cover and biomass of the invasive dune species Carpobrotus edulis," Remote Sens., 15(9), 2411 (2023). https://doi.org/10.3390/rs15092411

Biography

Mony Rith So is currently pursuing his master's degree at the Graduate School of Environmental and Information Studies, Tokyo City University, Japan. His current research focuses on the application of remote sensing and geographic information systems (GIS) for vegetation analysis in river dike environments, aiming to conserve native ecosystems and maintain dike structures. He is also interested in applying remote sensing and GIS techniques to green space conservation and ecosystem management.

Shigehiro Yokota is currently a professor at the Department of Restoration Ecology and Built Environment, Faculty of Environmental Studies, Tokyo City University, Japan. He specializes in urban ecological planning, integrating urban ecology and ecological planning. His research focuses on the planning, evaluation, and management of green infrastructure that balances the water cycle and ecosystems, specifically targeting watersheds and catchment areas.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 International License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Mony Rith So and Shigehiro Yokota "Alien species distribution mapping using satellite and drone fusion on river dikes along the Tone River, Japan," Journal of Applied Remote Sensing 18(4), 044505 (8 October 2024). https://doi.org/10.1117/1.JRS.18.044505
Received: 8 May 2024; Accepted: 12 September 2024; Published: 8 October 2024
KEYWORDS: Satellites, Image fusion, Satellite imaging, Earth observing sensors, Associative arrays, Vegetation, Image segmentation