An analytical study of environmental and clutter effects on microwave radiometers used for the detection of buried objects is presented. To simplify the analysis, it is assumed that the soil/target medium has a constant physical temperature versus depth, so that Kirchhoff's law can be applied to determine emissivities, and a simple layered medium geometry is used to model a buried target. Changes in brightness temperature caused by the presence of a buried target are illustrated for varying soil dielectric properties, radiometer frequencies, and target depths, and are contrasted with changes in brightness temperature that can occur when no target is present, due to slight variations in soil moisture or soil temperature. Brightness temperature clutter due to small surface roughness is also analytically modeled, through application of the small slope approximation for the homogeneous medium case and the small perturbation method in the presence of a subsurface layer, and it is shown that surface clutter effects can be mitigated through proper choice of sensor polarization and observation angle. Particular attention is given to the relationship between passive and active microwave sensors; results demonstrate that these two sensor types can provide complementary information. Finally, the use of wideband radiometric measurements is discussed as a means of reducing environmental clutter effects and improving detection algorithms.
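As a minimal sketch of the layered-medium model described above, the fragment below computes the nadir brightness temperature of a uniform-temperature soil half-space, with and without a perfectly conducting target buried beneath a flat soil layer, using Kirchhoff's law (emissivity = 1 - reflectivity) and the standard two-interface reflection formula. All numerical values (soil permittivity, frequency, depth, physical temperature) are illustrative assumptions, not taken from the paper, and roughness and oblique-incidence effects are ignored.

```python
import numpy as np

def tb_layered(eps_soil, depth_m, freq_hz, t_phys=290.0, target=True):
    """Nadir brightness temperature via Kirchhoff's law, e = 1 - |r|^2.

    Flat interfaces, uniform physical temperature; eps_soil uses the
    exp(-i*omega*t) convention (positive imaginary part = loss).
    A sketch of a layered-medium model, not the paper's exact formulation.
    """
    c = 2.998e8                            # speed of light, m/s
    k0 = 2 * np.pi * freq_hz / c           # free-space wavenumber
    n1 = np.sqrt(eps_soil + 0j)            # complex soil refractive index
    r01 = (1 - n1) / (1 + n1)              # air-to-soil Fresnel coefficient
    if target:
        r12 = -1.0                         # perfect conductor below the soil
        phase = np.exp(2j * k0 * n1 * depth_m)   # round-trip propagation
        r = (r01 + r12 * phase) / (1 + r01 * r12 * phase)
    else:
        r = r01                            # homogeneous half-space
    e = 1 - abs(r) ** 2                    # emissivity via Kirchhoff's law
    return e * t_phys                      # T_B for a constant T profile

# Illustrative moist-soil permittivity near L-band (assumed value)
tb_no_target = tb_layered(15 + 3j, 0.10, 1.4e9, target=False)
tb_target = tb_layered(15 + 3j, 0.10, 1.4e9, target=True)
print(f"no target: {tb_no_target:.1f} K, target at 10 cm: {tb_target:.1f} K")
```

Sweeping `depth_m` or `eps_soil` in this sketch reproduces the qualitative behavior the abstract describes: the target-induced brightness-temperature change shrinks with depth and with soil loss, and can be comparable in size to changes caused by moisture variations alone.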