Statistical target recognition techniques perform well when the `true' target-signature data are deterministic. If the data are indeed deterministic and the probability distributions governing a given problem are known, then identifiers based on Bayesian estimation theory have great potential for solving the target identification problem. How well an identifier will perform is usually assessed by Monte Carlo simulation or by implementation experiments. An alternative to these performance-analysis techniques is information theory. Information theory has long been applied to data compression, which deals with average distortion measures. Associating Bayes risk with distortion allows information-theoretic tools to be applied to the statistical target identification problem: the rate-distortion function of data compression can be extended to a Bayes rate-distortion function. Derived from designer-specified risk and identifier structure, the theoretical Bayes rate-distortion function relates mutual information to identifier performance. Mutual information is determined by the target-signature data used by the identifier and by the nature of the noise imposed on those data. Computing mutual information values therefore allows identifier performance bounds to be extracted from the Bayes rate-distortion function.
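As a minimal sketch of how mutual information can bound identifier performance, the example below (an illustration assumed here, not the paper's own derivation) models a hypothetical identification problem with M equiprobable targets observed through a symmetric noise channel, computes the mutual information between the true target and the noisy signature, and uses Fano's inequality to extract a lower bound on the error probability of any identifier. The channel, M, and eps are all assumed for illustration.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Assumed setting: M equiprobable targets; the noisy signature reports
# the wrong class with total probability eps, spread evenly.
M = 4
eps = 0.1

H_X = math.log2(M)  # prior uncertainty H(X) for equiprobable targets
# Equivocation H(X|Y) for this symmetric channel: the posterior given
# any observation is (1-eps) on one class, eps/(M-1) on each other.
H_X_given_Y = entropy([1 - eps] + [eps / (M - 1)] * (M - 1))
I = H_X - H_X_given_Y  # mutual information I(X;Y) in bits

# Fano's inequality: H(X|Y) <= h(Pe) + Pe*log2(M-1), so any identifier
# using Y has error probability Pe at least the smallest value whose
# right-hand side reaches the equivocation.
def fano_rhs(pe):
    h = entropy([pe, 1 - pe])  # binary entropy h(Pe)
    return h + pe * math.log2(M - 1)

# Grid search for the smallest Pe satisfying Fano's bound.
pe_lower = next(pe / 10000 for pe in range(10001)
                if fano_rhs(pe / 10000) >= H_X_given_Y)

print(f"I(X;Y) = {I:.3f} bits; Fano lower bound on Pe = {pe_lower:.4f}")
```

For this symmetric channel Fano's bound is tight, so the lower bound recovers the channel's own error probability; for general signature statistics the same recipe still yields a valid, if looser, performance bound from the computed mutual information.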