When humans view stereoscopic images, they may experience visual discomfort in the form of physiological symptoms such as eyestrain, a feeling of pressure in the eyes, headache, and neck pain. These sensations can arise from cortical mechanisms related to early visual processing. For example, vergence eye movements and lens accommodation can send conflicting information to the brain when the stereo images are distorted or presented on a flat display. Over the past decade, significant effort has been devoted to understanding and characterizing how discomfort arises, with the goals of designing safer and more comfortable 3D displays and of providing better guidelines for designing, aligning, and capturing 3D images and videos. Part of solving this problem is objectively predicting the visual discomfort that may arise from viewing a given, possibly distorted, pair of stereo images. Researchers have built several models, based primarily on cortical mechanisms, that yield good predictions of visual discomfort. Here we study the natural scene statistics (NSS) of the disparity maps of stereoscopic images and their relationship to 3D visual discomfort, focusing in particular on bivariate NSS models. We also build a new prediction model that combines binocular-vision features with the NSS models of disparity maps to accurately predict 3D visual discomfort, and we demonstrate that an algorithm realizing this prediction outperforms existing predictors.
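To make the idea of disparity-map NSS features concrete, the sketch below divisively normalizes a disparity map (local mean subtraction and contrast division) and fits a univariate generalized Gaussian distribution (GGD) to the normalized coefficients by moment matching. This is a minimal illustrative sketch, not the paper's exact pipeline: the function names, the 7x7 normalization window, and the choice of a univariate GGD fit are assumptions made here for illustration.

```python
import numpy as np
from scipy.special import gamma
from scipy.ndimage import uniform_filter

def ggd_features(x):
    """Fit a generalized Gaussian distribution to the samples in x by
    moment matching and return (shape alpha, scale sigma).
    alpha = 2 corresponds to a Gaussian, alpha = 1 to a Laplacian."""
    x = np.asarray(x, dtype=float).ravel()
    # Candidate shape parameters and their theoretical moment ratios
    # E[|x|]^2 / E[x^2] = Gamma(2/a)^2 / (Gamma(1/a) * Gamma(3/a))
    alphas = np.arange(0.2, 10.0, 0.001)
    ratios = gamma(2.0 / alphas) ** 2 / (gamma(1.0 / alphas) * gamma(3.0 / alphas))
    # Empirical moment ratio of the data, matched against the table
    r = np.mean(np.abs(x)) ** 2 / np.mean(x ** 2)
    alpha = alphas[np.argmin(np.abs(ratios - r))]
    sigma = np.sqrt(np.mean(x ** 2))
    return alpha, sigma

def disparity_nss_features(disparity, win=7):
    """Divisively normalize a disparity map and fit a GGD to the
    normalized coefficients; returns (alpha, sigma) as NSS features."""
    d = np.asarray(disparity, dtype=float)
    mu = uniform_filter(d, size=win)
    var = np.maximum(uniform_filter(d * d, size=win) - mu * mu, 0.0)
    mscn = (d - mu) / (np.sqrt(var) + 1.0)  # +1 stabilizes flat regions
    return ggd_features(mscn)
```

Features of this kind (GGD shape and scale, or parameters of a bivariate fit over neighboring coefficients) could then be concatenated with binocular-vision features and fed to any standard regressor trained against subjective discomfort scores.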