Efficient image retrieval approaches for different similarity requirements
23 December 1999
Proceedings Volume 3972, Storage and Retrieval for Media Databases 2000; (1999); doi: 10.1117/12.373580
Event: Electronic Imaging, 2000, San Jose, CA, United States
Abstract
The amount of pictorial data grows enormously with the expansion of the WWW. Given such a large number of images, it is very important for users to retrieve desired images via an efficient and effective mechanism. In this paper we propose two efficient approaches to facilitate image retrieval by using a simple method to represent the image content. Each image is partitioned into m × n equal-sized sub-images. A color that covers a sufficient number of pixels in a block is extracted to represent its content. In the first approach, the image content is represented by the extracted colors of the blocks, and the spatial information of the images is considered in image retrieval. In the second approach, the colors of the blocks in an image are used to extract objects: a block-level process is proposed to perform the region extraction, and the spatial information of regions is not considered in image retrieval. Our experiments show that these two block-based approaches can speed up image retrieval. Moreover, the two approaches are effective for different requirements of image similarity. Users can choose a proper approach to process their queries based on their similarity requirements.
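The block-based representation described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the image is already quantized to discrete color indices, and the function name, the threshold parameter, and its default value are illustrative choices.

```python
from collections import Counter

def dominant_block_colors(image, m, n, threshold=0.5):
    """Partition `image` (a 2-D list of quantized color indices) into
    m x n equal-sized blocks. For each block, return the color that
    covers at least `threshold` of the block's pixels, or None when
    no single color is dominant enough."""
    height, width = len(image), len(image[0])
    bh, bw = height // m, width // n  # block dimensions
    signature = []
    for bi in range(m):
        row = []
        for bj in range(n):
            # Count the quantized colors inside this block.
            counts = Counter(
                image[y][x]
                for y in range(bi * bh, (bi + 1) * bh)
                for x in range(bj * bw, (bj + 1) * bw)
            )
            color, count = counts.most_common(1)[0]
            row.append(color if count >= threshold * bh * bw else None)
        signature.append(row)
    return signature
```

The resulting m × n grid of extracted colors is the compact signature that both proposed approaches start from; the first compares it block by block (preserving spatial layout), while the second merges adjacent blocks of the same color into regions before matching.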
© (1999) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Chiou-Yann Tsai, Arbee L. P. Chen, Kai Essig, "Efficient image retrieval approaches for different similarity requirements", Proc. SPIE 3972, Storage and Retrieval for Media Databases 2000, (23 December 1999); doi: 10.1117/12.373580; https://doi.org/10.1117/12.373580
PROCEEDINGS, 12 pages


KEYWORDS
RGB color model, Image retrieval, Databases, Quantization, Visualization, Feature extraction, Adaptive optics
