With the development of sensor technologies, imaging technology has advanced rapidly, and image processing has consequently found widespread use in applications such as video surveillance, medical diagnosis, remote sensing, and object tracking. As a sub-field of image processing, image fusion is one of the most studied techniques. The aim of image fusion is to obtain an integrated image that contains more information than any single source image; such an image is more conducive for a human or a machine to understand and mine the information contained in the scene. Among the many types of image fusion, infrared (IR) and visible (VIS) image fusion is one of the most valuable multisource fusion tasks. Imaging the same scene with both IR and VIS systems yields more information, but also generates more redundancy. The IR sensor captures the thermal radiation of objects in a scene, so objects can be detected even under poor lighting conditions, whereas the image acquired by a VIS sensor has richer spectral information, clearer texture details, and higher spatial resolution. Thus, the scene can be described more completely by integrating the IR and VIS images into one image; the scene can then be readily understood by observers, and its information easily perceived.

In this paper, an effective IR and VIS image fusion method based on the non-subsampled shearlet transform (NSST) and a pulse-coupled neural network (PCNN) in the multi-scale morphological gradient (MSMG) domain is proposed. First, the low-frequency sub-image and high-frequency sub-images are obtained through the NSST. Then, the low-frequency and high-frequency sub-images are fused via an MSMG-domain PCNN (MSMG-PCNN) strategy. Finally, the fused image is reconstructed by the inverse NSST.
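The MSMG-PCNN fusion step above can be sketched in simplified form. The sketch below is an illustration under assumptions, not the paper's exact formulation: `msmg` uses a common MSMG definition (a weighted sum of morphological gradients over structuring elements of increasing size, with illustrative weights), `pcnn_fire_counts` is a simplified PCNN whose parameter values are placeholders, and `fuse_highpass` applies a common "maximum firing count" rule in which the MSMG map serves as the PCNN linking strength.

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion, uniform_filter

def msmg(image, num_scales=3):
    """Multi-scale morphological gradient: weighted sum of morphological
    gradients at increasing structuring-element sizes. The weights
    1/(2j+1) are an illustrative choice, not the paper's."""
    image = image.astype(np.float64)
    total = np.zeros_like(image)
    for j in range(1, num_scales + 1):
        size = 2 * j + 1                      # (2j+1)x(2j+1) flat structuring element
        grad = grey_dilation(image, size=size) - grey_erosion(image, size=size)
        total += grad / size                  # larger scales weighted less
    return total

def pcnn_fire_counts(stimulus, beta, iterations=30,
                     alpha_l=1.0, alpha_e=0.3, v_l=1.0, v_e=20.0):
    """Simplified PCNN: each pixel is a neuron, and the per-pixel firing
    count over the iterations serves as an activity measure. All
    parameter values here are illustrative assumptions."""
    s = stimulus / (stimulus.max() + 1e-12)   # normalized external stimulus
    L = np.zeros_like(s)                      # linking input
    E = np.ones_like(s)                       # dynamic threshold
    Y = np.zeros_like(s)                      # pulse output
    T = np.zeros_like(s)                      # accumulated firing counts
    for _ in range(iterations):
        # 3x3 neighborhood linking via a mean filter over last pulses
        L = np.exp(-alpha_l) * L + v_l * uniform_filter(Y, size=3)
        U = s * (1.0 + beta * L)              # internal activity, modulated by linking strength beta
        Y = (U > E).astype(np.float64)        # neuron fires when activity exceeds threshold
        E = np.exp(-alpha_e) * E + v_e * Y    # firing raises the threshold (refractory effect)
        T += Y
    return T

def fuse_highpass(band_a, band_b):
    """Fuse two high-frequency sub-bands: the MSMG map acts as the PCNN
    linking strength, and the coefficient whose neuron fires more often
    is selected (a common 'max firing count' rule)."""
    ta = pcnn_fire_counts(np.abs(band_a), beta=msmg(band_a))
    tb = pcnn_fire_counts(np.abs(band_b), beta=msmg(band_b))
    return np.where(ta >= tb, band_a, band_b)

# Toy usage on two synthetic high-frequency sub-bands.
rng = np.random.default_rng(0)
a = rng.standard_normal((16, 16))             # band with detail
b = np.zeros((16, 16))                        # featureless band
fused = fuse_highpass(a, b)
print(fused.shape)  # (16, 16)
```

In this toy example the featureless band never fires, so the fusion rule keeps the detailed band everywhere; in the actual method, the same selection would be applied per NSST sub-band before the inverse transform.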
Experimental results demonstrate that the proposed MSMG-PCNN-NSST algorithm performs effectively in most cases under both qualitative and quantitative evaluation.