An uncertain system yields random outputs even for deterministic inputs. Consequently, the design and analysis of such systems require faithful models of this uncertainty. In a statistical framework, these models are typically specified by probability distributions, which are often unknown a priori and must therefore be estimated from available data or prior information. Unfortunately, the observed uncertainty may not uniquely determine a single distribution: many candidate densities may be consistent with the same data. In that case, a unique density is obtained by requiring the candidates to satisfy an additional optimality criterion. Motivated by the principles of information theory, we now detail how entropy may serve as one such criterion.
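As an illustrative sketch of this selection principle, consider the classic example of a six-sided die whose mean is known but whose face probabilities are not. Infinitely many distributions match the mean; the maximum-entropy choice has the exponential-family form p_i ∝ exp(λ·i), with λ fixed by the mean constraint. The face values, the target mean of 4.5, and the bisection solver below are illustrative assumptions, not part of the text above.

```python
import math

def maxent_die(target_mean, faces=range(1, 7), tol=1e-12):
    """Among all distributions on `faces` with mean `target_mean`,
    return the maximum-entropy one: p_i proportional to exp(lam * i).
    The Lagrange multiplier lam is found by bisection, exploiting the
    fact that the mean is monotonically increasing in lam."""
    def mean_for(lam):
        weights = [math.exp(lam * f) for f in faces]
        z = sum(weights)  # normalizing constant (partition function)
        return sum(f * w for f, w in zip(faces, weights)) / z

    lo, hi = -50.0, 50.0  # bracket for lam; ample for this example
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    weights = [math.exp(lam * f) for f in faces]
    z = sum(weights)
    return [w / z for w in weights]

# A biased mean of 4.5 yields probabilities that increase with the face value;
# an unconstrained mean of 3.5 recovers the uniform distribution.
p_biased = maxent_die(4.5)
p_fair = maxent_die(3.5)
```

With no informative constraint (mean 3.5), the solver returns the uniform density, consistent with entropy being maximized by uniformity in the absence of further information.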