Recently, optimal algorithms based on a maximum-likelihood approach have been designed for locating a target on a nonoverlapping background. In particular, different ways of modeling the target have been proposed. When the gray levels of the target are known, the target reference can be modeled as a deterministic function. On the other hand, when the gray levels of the target in the input image are unknown or can vary from one image to another, the target reference must be treated as a pattern with random gray levels. Moreover, both the deterministic and the random target approaches can be unified into a single model in which the target is described by a linear combination of deterministic values and random variables. Based on this model, we propose an algorithm that optimizes the likelihood ratio between the two hypotheses that a target is present and that it is absent within a small sub-window of the image. We show that this technique is more efficient than the maximum-likelihood approach when the noise statistics of the background are strongly nonhomogeneous, as is the case in many real-world images. The proposed algorithm is based on correlations and can thus be implemented in architectures using optical correlators.
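To make the sub-window likelihood-ratio idea concrete, the following is a minimal sketch, not the paper's actual estimator: it assumes the simplest special case of the unified model, a fully deterministic (known gray-level) reference embedded in additive white Gaussian noise, for which the log-likelihood ratio at each sub-window reduces, up to constants, to a correlation with the reference. The function name `lr_map`, the window-scan loop, and the noise model are illustrative assumptions; the paper's detector additionally handles random gray levels and nonhomogeneous background statistics.

```python
import numpy as np

def lr_map(image, ref):
    """Sliding sub-window statistic proportional to the log-likelihood
    ratio between 'target present' and 'target absent', assuming a known
    deterministic reference in additive white Gaussian noise (a simplified
    special case of the unified deterministic/random target model)."""
    h, w = ref.shape
    H, W = image.shape
    out = np.empty((H - h + 1, W - w + 1))
    ref_energy = np.sum(ref ** 2)  # constant term of the log-LR
    for i in range(H - h + 1):
        for j in range(W - w + 1):
            win = image[i:i + h, j:j + w]
            # log-LR (up to a positive scale): correlation minus half
            # the reference energy
            out[i, j] = np.sum(win * ref) - 0.5 * ref_energy
    return out

# Toy usage: embed a 5x5 target in a noisy scene and locate it at the
# argmax of the statistic map.
rng = np.random.default_rng(0)
ref = rng.normal(size=(5, 5))
scene = rng.normal(scale=0.1, size=(32, 32))
scene[10:15, 20:25] += ref  # target placed at (10, 20)
stat = lr_map(scene, ref)
ij = np.unravel_index(np.argmax(stat), stat.shape)
print(ij)
```

Because the reference-energy term is constant over the scan, locating the maximum of this map is equivalent to locating the correlation peak, which is why such a detector maps naturally onto correlation hardware such as optical correlators.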