A number of applications require the precise tracking or position estimation of an object unresolved in the system optics. This paper evaluates several one-dimensional interpolation algorithms (odd N-point centroids, N = 3, 5, 7, 9, and three-point and five-point quadratic curve fits) designed to make such estimates with subpixel accuracy. Analytic, Monte Carlo, and experimental results are presented. The tracking sensor examined was a scanning linear array of infrared detectors assumed to be background-limited. The detector size and physical spacing were varied parametrically, within realistic fabrication constraints, to determine the relative performance and to obtain the optimum configuration. The optics blur spot was assumed Gaussian. The sources of error considered to affect algorithm performance were the systematic algorithm bias, random noise, and the postcalibration residual detector responsivity nonuniformities. Track accuracy improves with signal-to-noise ratio (SNR) until limited by algorithm inaccuracies or focal-plane nonuniformity. Among the algorithms tested, the three-point centroid performs best, provided that the systematic algorithm bias is corrected. An experimental infrared tracking focal plane, used in a tracker simulation, closely confirmed the analysis. With the three-point algorithms, an experimental accuracy better than 1/100 of a detector width (<1/250 of a blur-spot width) was obtained at high signal-to-noise ratios.
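The two three-point estimators named above can be illustrated with a minimal sketch. This is not the paper's implementation; the sample grid, blur-spot width, and true position below are assumed values chosen for illustration. It shows both the standard three-point centroid and three-point quadratic (parabolic) formulas applied to samples of a Gaussian blur spot, and it also exhibits the systematic bias the abstract notes must be corrected: neither estimator returns the true position exactly.

```python
import numpy as np

def centroid3(y, i):
    """Three-point centroid: subpixel peak offset relative to sample index i."""
    ym, y0, yp = y[i - 1], y[i], y[i + 1]
    return (yp - ym) / (ym + y0 + yp)

def quadfit3(y, i):
    """Three-point quadratic (parabolic) fit: subpixel offset of the vertex."""
    ym, y0, yp = y[i - 1], y[i], y[i + 1]
    return 0.5 * (ym - yp) / (ym - 2.0 * y0 + yp)

# Gaussian blur spot sampled on a unit-pitch detector grid (assumed parameters).
x = np.arange(16.0)
true_center = 7.3        # true subpixel position, for illustration
sigma = 1.0              # blur-spot width in detector pitches, assumed
y = np.exp(-0.5 * ((x - true_center) / sigma) ** 2)

i = int(np.argmax(y))    # brightest detector sample
est_centroid = i + centroid3(y, i)
est_quad = i + quadfit3(y, i)
```

Without bias correction, both estimates land near but not on `true_center`; the residual is the systematic algorithm bias, a deterministic function of the true subpixel offset that can be calibrated out, which is the correction the abstract says the three-point centroid needs to perform best.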