Detector response nonuniformity produces pattern noise in staring sensors, a problem that is especially severe in the infrared because of the low intrinsic contrast of IR imagery. The pattern noise can be corrected by electronic processing; however, the ability to correct for it is limited by the interaction of interscene and intrascene variability with the dynamic range of the processor (number of bits) and, depending on the algorithm used, by nonlinearities in the detector response. This paper quantifies these limitations and describes the interaction of detector gain nonuniformity and detector nonlinearities. Probabilistic models are developed to determine the maximum sensitivity that can be obtained when a two-point algorithm is used to correct a nonlinear response curve over a wide temperature range. Curves that permit prediction of the noise equivalent differential temperature (NEΔT) under varying circumstances are presented. A piecewise linear approach to dealing with severe detector response nonlinearities is presented and analyzed for its effectiveness.
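To make the two-point correction concrete, the following is a minimal sketch (not taken from the paper) for the idealized case of a purely linear per-pixel response. Each pixel is modeled as `output = gain * radiance + offset`; viewing two uniform calibration sources at known levels yields per-pixel coefficients that invert the response. All array shapes, variable names, and noise magnitudes here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed per-pixel linear response: out = gain * radiance + offset.
# (The paper's analysis also covers nonlinear responses, where a single
# two-point fit leaves residual error away from the calibration points.)
shape = (64, 64)
gain = 1.0 + 0.05 * rng.standard_normal(shape)   # gain nonuniformity
offset = 0.02 * rng.standard_normal(shape)       # offset nonuniformity

def detector(radiance):
    """Raw focal-plane output for a uniform scene at the given radiance."""
    return gain * radiance + offset

# Two-point calibration: view two uniform (blackbody) scenes at known levels.
L_lo, L_hi = 0.2, 0.8
x_lo, x_hi = detector(L_lo), detector(L_hi)

# Per-pixel correction coefficients mapping raw output back to radiance.
g_corr = (L_hi - L_lo) / (x_hi - x_lo)
o_corr = L_lo - g_corr * x_lo

def correct(raw):
    """Apply the two-point (gain/offset) nonuniformity correction."""
    return g_corr * raw + o_corr

# For a uniform scene, residual spatial spread is the pattern noise.
raw = detector(0.5)
fixed = correct(raw)
print(raw.std(), fixed.std())
```

With a truly linear response the correction is exact up to floating-point error, so the corrected frame's spatial standard deviation collapses; the paper's piecewise linear approach extends this idea by fitting separate gain/offset pairs over adjacent segments of a nonlinear response curve.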