17 January 2005

A new identification method for artificial objects based on various features
Artificial object identification and image classification are two fundamental problems in remote sensing (RS) information extraction. Many methods, from pixel-based to window-based, have been tried in both domains for years, but accuracy has remained unsatisfactory. Two limitations explain this. First, the processing cell does not correspond to a true target in the real world; second, the features that participate in the identification procedure are far from sufficient to describe the intrinsic characteristics of the object of interest. In the past two years, an object-oriented classification method has been proposed to fill these gaps in conventional classification. On the one hand, a segmentation technique groups pixels into clusters by similarity, forming so-called objects with thematic meaning; on the other hand, these objects are vectorized by integrating GIS (geographical information system) concepts into RS, which makes it possible to describe the various features of each object, such as its shape and its spatial relationships to neighboring objects. In this study, the authors apply this new method to artificial object identification, taking ship extraction as an example and using one spectral feature and eight shape features. Results indicate that object-oriented classification is feasible in practice and opens a new way for artificial object extraction.
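The pipeline the abstract describes can be sketched in miniature: segmentation yields pixel clusters ("objects"), each object is summarized by a spectral feature plus shape features, and a classifier decides whether the object is ship-like. The sketch below is illustrative only; the feature names, the decision rule, and the thresholds are assumptions, not the authors' actual one-spectral-plus-eight-shape feature set.

```python
import math

def shape_features(pixels):
    """Compute simple shape descriptors for one segmented object.

    pixels: set of (row, col) coordinates belonging to the object.
    """
    rows = [r for r, _ in pixels]
    cols = [c for _, c in pixels]
    height = max(rows) - min(rows) + 1
    width = max(cols) - min(cols) + 1
    area = len(pixels)
    # Perimeter: count pixel edges that border a non-object pixel.
    perimeter = 0
    for r, c in pixels:
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if (r + dr, c + dc) not in pixels:
                perimeter += 1
    long_side, short_side = max(height, width), min(height, width)
    return {
        "area": area,
        "perimeter": perimeter,
        "elongation": long_side / short_side,   # ships tend to be elongated
        "rect_fill": area / (height * width),   # how box-like the object is
        "compactness": perimeter ** 2 / (4 * math.pi * area),
    }

def is_ship_like(features, mean_brightness):
    """Toy decision rule combining one spectral and two shape features.

    Thresholds are hypothetical placeholders for a tuned rule set.
    """
    return (mean_brightness > 0.5          # bright object on dark water (assumed)
            and features["elongation"] > 2.0
            and features["rect_fill"] > 0.6)

# A 2x8 elongated pixel blob standing in for a segmented ship object.
ship = {(r, c) for r in range(2) for c in range(8)}
feats = shape_features(ship)
print(feats["elongation"], is_ship_like(feats, mean_brightness=0.8))  # → 4.0 True
```

In an object-oriented workflow these descriptors would be attached as attributes to each vectorized object, so that spectral and shape evidence can be combined per object rather than per pixel.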
© (2005) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Linli Cui, Ping Tang, Zhongming Zhao, and Jun Shi, "A new identification method for artificial objects based on various features", Proc. SPIE 5667, Color Imaging X: Processing, Hardcopy, and Applications (17 January 2005); https://doi.org/10.1117/12.584865