Color is an important feature for object recognition in security and military applications. Unfortunately, color is sensitive to environmental operating conditions, so its use in automatic target recognition is often limited. Recently, a number of research efforts have focused on algorithms that improve color constancy across images. Many of these approaches attempt to improve the color constancy of a particular type of surface, such as skin. In contrast, we present an approach that addresses the color constancy of many surfaces across a wide range of environmental conditions in the absence of direct knowledge of the illumination. Our approach builds on existing techniques by using evolutionary learning to synthesize features that characterize the illumination influencing the perception of color. Once the illumination of each image in a collection is estimated, it can be used to map the colors in one image to the illumination conditions of any other image. This would allow us to take an image from the collection and transform its colors to reference colors, which can then be combined with other types of features (e.g., geometrical, statistical, and textural) to create automatic target recognition systems that are relatively insensitive to their operating conditions. To demonstrate our technique, we process images of a parking area under a wide variety of seasonal weather conditions collected across timescales of hours, days, and months.
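The color-mapping step described above, transforming an image's colors from its estimated illuminant to a reference illuminant, can be sketched as a per-channel diagonal (von Kries style) correction. This is a minimal illustration of that mapping step only; the function name, the illuminant values, and the diagonal model itself are illustrative assumptions, not the paper's actual illumination estimator or transform:

```python
import numpy as np

def map_to_reference_illuminant(image, est_illum, ref_illum):
    """Map an image's colors from its estimated scene illuminant to a
    reference illuminant using a per-channel (von Kries style) diagonal
    transform.

    image:     H x W x 3 float array, values in [0, 1]
    est_illum: length-3 RGB estimate of the scene illuminant
    ref_illum: length-3 RGB of the target (reference) illuminant
    """
    est = np.asarray(est_illum, dtype=float)
    ref = np.asarray(ref_illum, dtype=float)
    # Scale each channel so colors rendered under est_illum appear as
    # they would under ref_illum.
    gains = ref / est
    return np.clip(image * gains, 0.0, 1.0)

# Example: a grey patch rendered under a warm illuminant is mapped back
# toward neutral under an equal-energy reference illuminant.
patch = np.full((2, 2, 3), [0.6, 0.5, 0.4])
corrected = map_to_reference_illuminant(patch, [0.6, 0.5, 0.4], [0.5, 0.5, 0.5])
```

In this sketch the illuminant estimates are supplied directly; in the approach described above they would instead come from the evolutionary-learned features that characterize each image's illumination.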