Natural scenes are not band-limited, and in most contemporary sampled imaging systems the (pre-sampling) image-formation subsystem's frequency response extends well beyond the sampling passband. For these two reasons, most sampled imaging systems -- particularly staring-array systems -- produce aliasing. That is, the sampling process causes (high) spatial frequencies beyond the sampling passband to fold into (lower) spatial frequencies within it. When the aliased, sampled image data are then reconstructed, usually by image display, potentially significant image degradation can result. This is a well-established theoretical result which can be, and has been, verified experimentally. In this paper we argue that, for the purposes of system design and digital image processing, aliasing should be treated as signal-dependent, additive noise. The argument is both theoretical and experimental: we present a model-based justification for this treatment, and, using a computational simulation based on the model, we process (high-resolution images of) natural scenes in a way that enables the `aliased component' of the reconstructed image to be isolated unambiguously. We demonstrate that our model-based argument leads naturally to system-design metrics which quantify the extent of aliasing. And, by illustrating several `aliased component' images, we provide a qualitative assessment of aliasing as noise.
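The folding of out-of-band frequencies into the sampling passband can be illustrated with a short numerical sketch (the sampling rate and frequencies below are arbitrary illustrative choices, not values from this paper): a cosine at 80 cycles per unit, sampled at 100 samples per unit (Nyquist limit 50), yields exactly the same samples as a cosine at the folded frequency 100 - 80 = 20.

```python
import numpy as np

fs = 100.0               # sampling rate (arbitrary choice for illustration)
f_high = 80.0            # frequency beyond the Nyquist limit fs/2 = 50
f_alias = fs - f_high    # frequency it folds onto: 20.0

n = np.arange(32)        # sample indices
t = n / fs               # sample positions

high = np.cos(2 * np.pi * f_high * t)   # out-of-band signal, sampled
low = np.cos(2 * np.pi * f_alias * t)   # in-band alias, sampled

# The two sampled sequences are numerically identical: after sampling,
# the 80-cycle component is indistinguishable from a 20-cycle component.
print(np.max(np.abs(high - low)))       # ~0, up to floating-point error
```

This is the one-dimensional analogue of the two-dimensional spatial-frequency folding discussed above; once the samples are taken, no reconstruction step can separate the folded energy from genuine in-band content, which is why it behaves as signal-dependent additive noise.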