Segmentation of an image based on texture can be performed by a set of N Gabor filters that uniformly covers the spatial-frequency domain. The filter outputs, which characterize the frequency and orientation content of the intensity distribution in the vicinity of a pixel, constitute an N-element feature vector. As an alternative to the computationally intensive procedure of segmentation based on the N-element vectors generated at each pixel, we propose an algorithm for selecting the pair of filters that provides maximum discrimination between the two textures constituting the object and its surroundings in an image. Images filtered by the selected filters are nonlinearly transformed to produce two feature maps. The feature maps are smoothed by an intercompetitive and intracooperative interaction process between them. These interactions have proven much superior to simple Gaussian filtering in reducing the effects of spatial variability in the feature maps. A segmented binary image is then generated by a pixel-by-pixel comparison of the two maps. Results of experiments involving several texture combinations show that this procedure is capable of producing clean segmentation.
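The pipeline described above (filter with a selected Gabor pair, apply a nonlinear transform, smooth, compare pixel by pixel) can be sketched as follows. This is an illustrative sketch, not the paper's implementation: all function names and parameter values are assumptions, the saturating nonlinearity is taken to be `tanh`, and plain Gaussian smoothing stands in for the intercompetitive/intracooperative interaction process, which the abstract notes is in fact superior to Gaussian filtering.

```python
import numpy as np

def gabor_kernel(freq, theta, sigma, size=21):
    """Real, even-symmetric Gabor filter: Gaussian envelope times a
    cosine carrier at spatial frequency `freq` and orientation `theta`."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)  # coordinate along the orientation
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    return envelope * np.cos(2.0 * np.pi * freq * xr)

def conv2_same(img, kern):
    """'Same'-size linear convolution via the FFT (zero-padded borders)."""
    s = tuple(np.array(img.shape) + np.array(kern.shape) - 1)
    full = np.fft.irfft2(np.fft.rfft2(img, s) * np.fft.rfft2(kern, s), s)
    ph, pw = kern.shape[0] // 2, kern.shape[1] // 2
    return full[ph:ph + img.shape[0], pw:pw + img.shape[1]]

def feature_map(img, kern, alpha=2.0, smooth_sigma=4.0):
    """Filter the image, apply a saturating nonlinearity to the magnitude,
    then smooth.  NOTE: Gaussian smoothing is a simplification standing in
    for the intercompetitive/intracooperative interaction of the paper."""
    r = np.tanh(alpha * np.abs(conv2_same(img, kern)))
    g = gabor_kernel(0.0, 0.0, smooth_sigma)  # freq = 0 -> plain Gaussian
    return conv2_same(r, g / g.sum())

def segment_two_textures(img, kern_a, kern_b):
    """Binary segmentation by pixel-by-pixel comparison of two feature maps."""
    return feature_map(img, kern_a) > feature_map(img, kern_b)

# Toy example: vertical stripes on the left half, horizontal on the right,
# discriminated by a Gabor pair differing only in orientation.
h, w, f = 64, 128, 0.15
yy, xx = np.mgrid[0:h, 0:w]
img = np.where(xx < w // 2,
               np.sin(2 * np.pi * f * xx),   # variation along x: vertical stripes
               np.sin(2 * np.pi * f * yy))   # variation along y: horizontal stripes
seg = segment_two_textures(img,
                           gabor_kernel(f, 0.0, 4.0),        # tuned to vertical stripes
                           gabor_kernel(f, np.pi / 2, 4.0))  # tuned to horizontal stripes
```

Because each filter's passband is centered on one texture's dominant frequency and orientation, its feature map is large over that texture and near zero over the other, so the pixelwise comparison recovers the two regions up to boundary effects of the smoothing.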