This paper presents a technique for fusing radar and imaging-sensor data for target tracking. First, it is shown how angular position measurements can be computed from the two-dimensional intensity pattern provided by the imaging sensor. A model for these angular measurements is then developed, together with accurate noise statistics. The derived measurement model differs from the one usually used in data fusion applications involving imaging sensors: in particular, the noise is shown to be signal-dependent, which affects the filtering equations that form the basis of the fusion operation. It is also shown that the two angular coordinates should not be decoupled, because their noise is cross-correlated. The relationship between the imaging sensor's angular measurements and the radar's angular measurements is then explored through a simple two-point-target example. In general, the two sensors cannot be assumed to measure the same target point, so the difference between their angular measurements must be accommodated within the data fusion process. A simple technique for accommodating this difference is developed, and the equations needed to fuse such measurements within the filtering framework are presented. Finally, it is noted that the same techniques apply to data fusion when tracking targets at very long ranges. The techniques developed in this paper are evaluated in computer simulation.
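The claim that the two angular coordinates should not be decoupled can be illustrated with a minimal sketch of a linear Kalman measurement update. This is not the paper's own filter or noise model; it simply assumes a two-dimensional angular state (azimuth, elevation), a hypothetical correlation coefficient `rho`, and illustrative noise values, and compares a joint update using the full cross-correlated noise covariance against a decoupled update that discards the off-diagonal terms.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Standard linear Kalman measurement update.

    x: state estimate, P: state covariance,
    z: measurement, H: measurement matrix, R: measurement noise covariance.
    """
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Two angular coordinates (azimuth, elevation) observed directly.
x = np.zeros(2)
P = np.eye(2)
H = np.eye(2)

# Cross-correlated angular noise (illustrative values, not from the paper):
# the off-diagonal term couples the two angles, so the update must be
# performed jointly rather than as two independent scalar updates.
sigma_az, sigma_el, rho = 0.02, 0.03, 0.5
R = np.array([[sigma_az**2,               rho * sigma_az * sigma_el],
              [rho * sigma_az * sigma_el, sigma_el**2]])

z = np.array([0.10, -0.05])
x_joint, P_joint = kalman_update(x, P, z, H, R)

# Decoupled update (diagonal R only) for comparison: it yields a
# different, and in general suboptimal, estimate and covariance.
x_dec, P_dec = kalman_update(x, P, z, H, np.diag(np.diag(R)))
```

Under the signal-dependent noise model described in the paper, `R` would additionally be recomputed at each step from the current measurement intensities rather than held fixed as in this sketch.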