In this paper, we examine the problem of non-rigid, image-to-image registration for CT images of the abdomen. This problem has been previously addressed in many different contexts (e.g., visualization across imaging modalities, modelling of organ deformation after surgical resection). The particular application in which we are interested is modelling the respiratory motion of abdominal organs, so that we can more accurately represent the dose distribution in both targeted structures and non-targeted areas during radiosurgical treatment. Our goal is to register two CT images, each acquired at a different point in the respiratory cycle. We use a transformation model based on B-splines, and take advantage of B-splines' "locality" or "compact support" property to ensure computational efficiency and robust convergence to a satisfactory result. We demonstrate that, although a purely intensity-based registration metric matches the deformation of the lungs well during the respiratory cycle, the movement of other organs (e.g., liver and kidney) is poorly captured because of the low contrast within and between these organs in CT images. We introduce a registration metric that is a weighted combination of intensity difference and the distance between corresponding points manually identified in the two images being registered; we show how the influence of these points can be elegantly added to the metric so that it remains differentiable with respect to the spline control points. Visual inspection indicates that the resulting registrations are superior to the intensity-only ones in their representation of visceral organ deformation and movement.
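As a rough illustration of the combined metric described above, the following sketch forms a convex combination of an intensity-difference term and a landmark-distance term. The weight `alpha`, the choice of sum-of-squared-differences for the intensity term, and all names below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def combined_metric(fixed, moving_warped, fixed_pts, warped_pts, alpha=0.5):
    """Hypothetical combined registration metric (illustrative only).

    fixed, moving_warped : arrays of matching shape; the moving image is
        assumed already resampled through the current B-spline transform.
    fixed_pts, warped_pts : (N, d) arrays of corresponding landmark
        coordinates in the fixed image and after warping, respectively.
    alpha : weight balancing the intensity and landmark terms.
    """
    # Intensity term: mean squared intensity difference (SSD).
    intensity_term = np.mean((fixed - moving_warped) ** 2)
    # Landmark term: mean squared Euclidean distance between
    # corresponding points.
    landmark_term = np.mean(np.sum((fixed_pts - warped_pts) ** 2, axis=1))
    # Both terms are smooth in the warped intensities and point positions,
    # so the combination stays differentiable w.r.t. the control points.
    return alpha * intensity_term + (1.0 - alpha) * landmark_term
```

Because both terms are differentiable functions of the B-spline control points, a gradient-based optimizer can descend on this weighted sum directly, with `alpha` trading off intensity agreement against landmark agreement.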