Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is a valuable modality for evaluating breast abnormalities found on mammography and for early disease detection in high-risk patients. However, images produced by different MRI scanners (e.g., GE Healthcare and Siemens) differ in intensity and in other image characteristics such as noise distribution. This poses a challenge both for the evaluation of images by radiologists and for computational analysis using radiomics or deep learning: an algorithm trained on images acquired with one MRI scanner may perform poorly on a dataset produced by a different scanner. There is therefore an urgent need for image harmonization. Traditional image-to-image translation algorithms can address this problem, but they require paired data (i.e., the same object imaged on different scanners). In this study, we employ a deep learning algorithm that learns from unpaired data, performing a bi-directional translation between MR images. The proposed method is based on a cycle-consistent adversarial network (CycleGAN), which uses two generator-discriminator pairs. The original CycleGAN struggles to preserve structure (i.e., breast tissue characteristics and shape) during translation. To overcome this, we modified the discriminator architecture and enforced a structure-based penalty at the scale of smaller patches, allowing the network to focus on features pertaining to breast tissue. The results demonstrate that the transformed images are visually realistic, preserve structure, and harmonize intensity across images from different scanners.
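The key idea of penalizing at the scale of smaller patches can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the image is split into non-overlapping patches and the discriminator emits one realness score per patch instead of one score per image, so the adversarial loss acts on local structure. The linear `weights` here are a hypothetical stand-in for learned convolutional filters.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_patches(image, size=16):
    """Split a 2-D image into non-overlapping size x size patches."""
    h, w = image.shape
    return (image[:h - h % size, :w - w % size]
            .reshape(h // size, size, w // size, size)
            .swapaxes(1, 2)
            .reshape(-1, size, size))

def patch_discriminator(image, weights, size=16):
    """Toy patch-level critic: one sigmoid score per patch.

    The point is that the adversarial penalty is applied per patch,
    not per whole image, so local tissue structure is judged directly.
    """
    patches = extract_patches(image, size)
    logits = patches.reshape(len(patches), -1) @ weights
    return 1.0 / (1.0 + np.exp(-logits))  # realness score in (0, 1) per patch

# Example: a 64x64 image with 16x16 patches yields a 4x4 grid of scores.
image = rng.standard_normal((64, 64))
weights = rng.standard_normal(16 * 16) * 0.01  # hypothetical learned filter
scores = patch_discriminator(image, weights)
print(scores.shape)  # (16,)
```

In a full CycleGAN, each of the two discriminators would produce such a patch-score map, and the generators would be trained against the mean patch loss together with the usual cycle-consistency term.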