As new remote sensing systems are deployed, the volume of image data available at different wavelengths will continue to grow. Moreover, images from a single sensor over the same area often exhibit cloud cover, forcing analysts to switch among several images or to mosaic them by manually defining cutlines that eliminate the clouds. The ability to fuse multiple images of the same area, so that the fused product exhibits, in a single image, the important details visible in the individual bands, has become crucial for dealing with the large volume of available data. We describe an approach to image fusion using the wavelet transform. When images are merged in wavelet space, different frequency ranges can be processed differently: for example, high-frequency information from one image can be combined with lower-frequency information from another to perform edge enhancement. We have built a prototype system that supports experimentation with various wavelet-array combination and manipulation methods for image fusion, built on a set of basic operations on wavelet frequency blocks. Problems caused by image misregistration and processing artifacts are described. Examples of wavelet fusion results are shown, along with test images that clarify the behavior of the wavelet fusion methods used.
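The core idea, combining different frequency bands from different source images in wavelet space, can be sketched with a single-level 2-D Haar transform. This is a minimal stand-in for the wavelet machinery the abstract describes, not the authors' system: the `haar2`, `ihaar2`, and `fuse` helpers below are hypothetical names, and the sketch assumes two co-registered, equally sized images with even dimensions. Low-frequency (approximation) content is taken from one image and high-frequency (detail) content from the other before inverting the transform.

```python
import numpy as np

def haar2(x):
    """Single-level orthonormal 2-D Haar transform.
    Returns (LL, (LH, HL, HH)): approximation plus three detail bands."""
    s = np.sqrt(2.0)
    # 1-D Haar along rows (columns of the array), then along columns.
    lo = (x[:, ::2] + x[:, 1::2]) / s
    hi = (x[:, ::2] - x[:, 1::2]) / s
    LL = (lo[::2] + lo[1::2]) / s
    LH = (lo[::2] - lo[1::2]) / s
    HL = (hi[::2] + hi[1::2]) / s
    HH = (hi[::2] - hi[1::2]) / s
    return LL, (LH, HL, HH)

def ihaar2(LL, details):
    """Inverse of haar2: reconstruct the image from its four bands."""
    LH, HL, HH = details
    s = np.sqrt(2.0)
    lo = np.empty((LL.shape[0] * 2, LL.shape[1]))
    hi = np.empty_like(lo)
    lo[::2], lo[1::2] = (LL + LH) / s, (LL - LH) / s
    hi[::2], hi[1::2] = (HL + HH) / s, (HL - HH) / s
    x = np.empty((lo.shape[0], lo.shape[1] * 2))
    x[:, ::2], x[:, 1::2] = (lo + hi) / s, (lo - hi) / s
    return x

def fuse(img_low, img_high):
    """Fuse two registered images in wavelet space: keep the
    approximation band of img_low and the detail bands of img_high."""
    LL_low, _ = haar2(img_low)
    _, details_high = haar2(img_high)
    return ihaar2(LL_low, details_high)

# Tiny demonstration: a smooth image contributes the low frequencies,
# an image containing a vertical edge contributes the high frequencies.
smooth = np.full((8, 8), 100.0)
edged = np.zeros((8, 8)); edged[:, 3:] = 50.0
fused = fuse(smooth, edged)
```

The fused result keeps the overall brightness of `smooth` (the approximation band carries the local means) while inheriting the edge energy of `edged`. Per-band processing, e.g. attenuating one detail band or boosting another before the inverse transform, drops in naturally between the forward and inverse steps, which is the kind of frequency-selective manipulation the abstract refers to.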