The effect of reducing the resolution of a phase-only filter (POF) with respect to the target Fourier transform, while maintaining a constant filter bandwidth, is investigated. An existing procedure for the optimal design of filters with constrained complex-amplitude values is modified by imposing an additional constraint on the filter resolution and is used to design an optimal POF under these restrictions. A simple analysis, performed for an idealized target, shows that the correlation-peak magnitude decreases significantly when the filter resolution, measured in number of subregions (pixels), falls below the target size in the input, measured in number of smallest resolvable elements (pixels). The applicability of this analysis to real IR imagery is verified by simulation. The dependence of correlation-peak degradation with filter resolution on the nature of the imagery is also investigated; it is shown that, for a given image size, high-pass-filtered images are marginally more robust to filter-resolution reduction than normal imagery. Finally, the effectiveness of the optimization procedure is demonstrated by comparing the filter-resolution reduction it achieves with that achieved by an existing nonoptimal technique.
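The basic POF construction and the resolution constraint discussed above can be sketched as follows. This is a minimal illustration, not the paper's optimal design procedure: the block-averaging rule used to constrain the filter to constant phase over each subregion, the idealized square target, and all function names are assumptions introduced here for demonstration. It shows that a unit-modulus filter constrained to fewer phase subregions cannot exceed the full-resolution POF's correlation peak.

```python
import numpy as np

def pof(target_ft):
    # Phase-only filter: unit magnitude, conjugate phase of the target spectrum.
    return np.exp(-1j * np.angle(target_ft))

def correlate(scene, filt):
    # Frequency-domain correlation: IFT( FT(scene) * filter ).
    return np.fft.ifft2(np.fft.fft2(scene) * filt)

def reduce_resolution(filt, block):
    # Hypothetical resolution constraint: force constant phase over each
    # block x block subregion (mean of complex values, renormalized to unit
    # modulus) while keeping the overall bandwidth (array size) fixed.
    n = filt.shape[0]
    out = np.empty_like(filt)
    for i in range(0, n, block):
        for j in range(0, n, block):
            v = filt[i:i + block, j:j + block].mean()
            out[i:i + block, j:j + block] = v / abs(v) if abs(v) > 0 else 1.0
    return out

# Idealized target: a small bright rectangle in a larger dark field.
n = 64
scene = np.zeros((n, n))
scene[28:36, 28:36] = 1.0

full_pof = pof(np.fft.fft2(scene))
peak_full = np.abs(correlate(scene, full_pof)).max()
peak_low = np.abs(correlate(scene, reduce_resolution(full_pof, 8))).max()
# peak_low is bounded above by peak_full: the full-resolution POF attains the
# maximum possible peak among all unit-modulus filters for this target.
```

Reducing `block` toward 1 recovers the full-resolution POF; increasing it past the target's extent in pixels is the regime where, per the analysis above, the peak degrades significantly.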