Relevance feedback in content-based image retrieval has been an active research focus for many years. It uses user-labeled information to re-adjust the similarity measure between images and thereby obtain improved retrieval results. In this paper we propose a simple and effective approach to image relevance feedback that uses both positive and negative examples labeled by users to refine the query and update the distance measure dynamically. Our method not only has very low complexity but also adapts well to changes in the user's retrieval interests. Experimental results on a database of 7,000 images, represented by MPEG-7 color and texture descriptors, demonstrate the effectiveness of our algorithm compared with two other existing algorithms.
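
To make the two feedback steps concrete, here is a minimal sketch of one classic way they can be realized (not necessarily the paper's exact formulation): Rocchio-style query refinement from positive and negative examples, and a per-dimension re-weighting of a Euclidean distance based on the variance of the positive examples. Images are assumed to be fixed-length feature vectors (e.g., MPEG-7 color/texture descriptors); the parameter names `alpha`, `beta`, `gamma` are illustrative.

```python
import numpy as np

def refine_query(query, positives, negatives, alpha=1.0, beta=0.75, gamma=0.25):
    """Rocchio-style refinement: move the query toward the centroid of
    positive examples and away from the centroid of negative examples.
    (Illustrative; the paper's own update rule may differ.)"""
    q = alpha * np.asarray(query, dtype=float)
    if len(positives):
        q += beta * np.mean(positives, axis=0)
    if len(negatives):
        q -= gamma * np.mean(negatives, axis=0)
    return q

def update_weights(positives, eps=1e-6):
    """Weight each feature dimension by the inverse of its variance over
    the positive examples: dimensions on which relevant images agree
    contribute more to the distance. Weights are normalized to sum to 1."""
    var = np.var(positives, axis=0)
    w = 1.0 / (var + eps)
    return w / w.sum()

def weighted_distance(a, b, w):
    """Weighted Euclidean distance used to re-rank the database."""
    return float(np.sqrt(np.sum(w * (np.asarray(a) - np.asarray(b)) ** 2)))
```

In each feedback round, the refined query and updated weights would be used together to re-rank the database, so both the query point and the shape of the distance metric adapt to the user's labels.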