Resistive array technology is finding increasing application in representing synthetic infrared targets and backgrounds. Pixel-to-pixel radiance nonuniformity is the dominant noise source in resistive arrays and must be compensated for or otherwise mitigated to achieve high-fidelity testing of infrared imaging sensors. Any imaging method for measuring and correcting nonuniformity noise is subject to theoretical performance limits imposed by sensor measurement noise, geometrical resolution, background offset, and optical resolution. We derive general performance bounds as functions of sensor parameters that apply equally to staring and scanning nonuniformity correction (NUC) sensors. A thorough understanding of these theoretical limits allows intelligent specification of the NUC sensor, procedures, algorithms, and processing power required for any scene projection application. We describe the NUC approach developed for the US Army's Dynamic Infrared Scene Projector (DIRSP). We also describe the features of our software package, which calculates emitter calibration curves from automatically collected laboratory data. We show how the code handles practical considerations such as detector fill factor, incorrect magnification, rotation between the emitter array and the NUC sensor, optical anisoplanatism, dead pixels, data overload, and automatic detection of emitter signals. The paper concludes with a preview of the procedures, algorithms, and sensor to be used in the DIRSP.
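As an illustration of the kind of per-emitter correction the abstract refers to, the sketch below implements a standard two-point (gain/offset) nonuniformity correction. This is a common baseline technique, not the DIRSP calibration procedure itself; all function names, array shapes, and drive levels here are hypothetical.

```python
# Illustrative two-point per-pixel nonuniformity correction sketch.
# NOT the DIRSP method: names, levels, and shapes are assumptions.
import numpy as np

def two_point_nuc(measured_low, measured_high, target_low, target_high):
    """Estimate per-emitter gain and offset from frames captured while the
    array is driven at two nominally uniform radiance levels."""
    gain = (target_high - target_low) / (measured_high - measured_low)
    offset = target_low - gain * measured_low
    return gain, offset

def correct(frame, gain, offset):
    # Apply the per-pixel linear correction to a raw frame.
    return gain * frame + offset

# Toy example: a 2x2 "array" with pixel-to-pixel gain/offset variation.
rng = np.random.default_rng(0)
true_gain = 1.0 + 0.1 * rng.standard_normal((2, 2))
true_offset = 0.05 * rng.standard_normal((2, 2))
low, high = 10.0, 100.0
meas_low = true_gain * low + true_offset
meas_high = true_gain * high + true_offset
g, o = two_point_nuc(meas_low, meas_high, low, high)

# For perfectly linear emitters, the correction removes all nonuniformity.
corrected = correct(true_gain * 55.0 + true_offset, g, o)
print(np.allclose(corrected, 55.0))
```

In practice, resistive emitters are nonlinear, which is why the paper's software fits full calibration curves per emitter rather than a single gain/offset pair; the residual error of any such fit is bounded by the sensor-noise and resolution limits the abstract mentions.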