In this article, we present two mathematical paradigms for clutter modeling. Both paradigms pose clutter modeling as a statistical inference problem and pursue probabilistic models that characterize observed training images. The two paradigms differ in the forms (or families) of models they choose and in their philosophical assumptions about real-world clutter patterns. The first paradigm studies descriptive models, such as Markov random field (MRF) models and the minimax entropy models (Zhu, Wu, and Mumford 1997). In this modeling paradigm, image features are first extracted from images, and statistics of these features are computed. These statistics define an image ensemble, called the Julesz ensemble, which is an equivalence class in which all images share the same feature statistics. For any large image from this ensemble, a local patch given its boundary condition then follows a Gibbs (or MRF) model. We shall review recent conclusions about this ensemble equivalence studied in (Wu, Zhu, and Liu 1999). The second paradigm studies generative models, such as the random collage model (Lee and Mumford 1999). In contrast to a descriptive model, a generative model introduces hidden variables that are assumed to be the underlying causes producing the observed image, for example, trees and rocks for clutter. The learning process makes inference about these hidden variables. We shall discuss a texton model for clutter and effective Markov chain Monte Carlo (MCMC) methods for stochastic inference. We shall also reveal the deep relationship between the two modeling paradigms.
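To make the local Markov property of descriptive models concrete, the following is a minimal sketch of Gibbs sampling on a toy binary MRF (an Ising-type model). This is an illustrative example only, not the models of the cited papers: each pixel is resampled from its conditional distribution given its 4-neighborhood, which is exactly the "local patch given its boundary condition" structure of a Gibbs (MRF) model. The function name and parameters (`beta`, `sweeps`) are ours.

```python
import math
import random

def gibbs_sample_ising(size=16, beta=0.4, sweeps=50, seed=0):
    """Draw a sample from a toy binary MRF (Ising model) by Gibbs sampling.

    Each pixel x[i][j] takes values in {-1, +1}. Its conditional
    distribution depends only on its 4-neighborhood, so repeatedly
    resampling pixels from these conditionals is a valid MCMC scheme
    for the joint Gibbs distribution.
    """
    rng = random.Random(seed)
    # Random initialization of the lattice.
    x = [[rng.choice([-1, 1]) for _ in range(size)] for _ in range(size)]
    for _ in range(sweeps):
        for i in range(size):
            for j in range(size):
                # Sum of the 4 neighbors (free boundary condition).
                s = 0
                if i > 0:        s += x[i - 1][j]
                if i < size - 1: s += x[i + 1][j]
                if j > 0:        s += x[i][j - 1]
                if j < size - 1: s += x[i][j + 1]
                # Conditional P(x[i][j] = +1 | neighbors) for the Ising model.
                p = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
                x[i][j] = 1 if rng.random() < p else -1
    return x
```

Larger `beta` strengthens the coupling between neighbors and yields smoother clutter-like patches; `beta = 0` reduces to independent coin flips.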