Locating a single point source in infrared imaging is typically accomplished with conventional methods such as centroiding.
More challenging problems involving multiple point sources require alternative location-finding methods capable of resolving closely spaced objects. The authors introduce an algorithm based on least-squared-error (LSE) modeling with a Gram-Schmidt orthogonalization step. Its noise performance is compared with that of two other high-resolution algorithms based on eigendecomposition of the input data. Estimates obtained through LSE modeling approach the Cramér-Rao lower bound at high signal-to-noise ratios; however, performance degrades severely in the presence of non-Gaussian noise. An outlier detection scheme that may be used in conjunction with the location and amplitude estimation procedure is described, and its effectiveness is demonstrated through Monte Carlo simulations.
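The following is a minimal sketch of the kind of LSE fit with Gram-Schmidt orthogonalization described above. The Gaussian point-spread-function model, the candidate source locations, and all function names are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def psf_column(grid, center, sigma=1.0):
    """Sampled Gaussian PSF centered at `center` (assumed PSF model)."""
    return np.exp(-0.5 * ((grid - center) / sigma) ** 2)

def gram_schmidt(columns):
    """Classical Gram-Schmidt orthonormalization of the model columns."""
    basis = []
    for c in columns:
        v = c.astype(float).copy()
        for q in basis:
            v -= np.dot(q, v) * q
        norm = np.linalg.norm(v)
        if norm > 1e-12:          # skip nearly dependent columns
            basis.append(v / norm)
    return np.array(basis)        # rows are orthonormal vectors

def lse_fit(data, grid, candidate_locations, sigma=1.0):
    """Least-squared-error fit of point-source amplitudes at the given
    candidate locations; returns amplitudes and the residual error."""
    A = np.column_stack([psf_column(grid, c, sigma) for c in candidate_locations])
    Q = gram_schmidt(A.T).T                  # orthonormal basis as columns
    coeffs = Q.T @ data                      # projection onto the basis
    fit = Q @ coeffs                         # best fit within the model span
    residual = np.sum((data - fit) ** 2)
    # Recover amplitudes in the original (non-orthogonal) PSF basis
    amplitudes, *_ = np.linalg.lstsq(A, fit, rcond=None)
    return amplitudes, residual

# Example: two closely spaced sources plus additive Gaussian noise
grid = np.arange(64, dtype=float)
truth = 3.0 * psf_column(grid, 30.2) + 2.0 * psf_column(grid, 33.7)
data = truth + 0.05 * np.random.default_rng(0).normal(size=grid.size)
amps, err = lse_fit(data, grid, candidate_locations=[30.0, 34.0])
print(amps, err)
```

In a location search, a fit of this kind would be repeated over candidate location sets, with the set minimizing the residual taken as the estimate; the outlier detection scheme mentioned above is not represented in this sketch.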