This paper describes a detailed design analysis performed to identify the optimum design parameter regions for 0.18 μm NMOSFET devices for low-power applications. The objective of the analysis was to obtain the highest possible drive current while satisfying a target off-state leakage criterion and a short-channel constraint for a supply voltage of 0.9 V. In the analysis, a set of basic structural and doping profile parameters was utilized in light of the anticipated future trends for this technology generation. In order to control adverse short-channel characteristics for these very short Leff devices and achieve the largest possible drive current, channel profile engineering was required. Two different channel engineering options, a boron halo implant and a boron anti-punchthrough (APT) implant, were investigated. The peak doping (dose) and peak depth (energy) of these implants were varied in order to analyze the effects on device performance. For each selected halo or APT dose and energy, the saturation drain current, IDsat, and the change in threshold voltage, ΔVT, due to drain-induced barrier lowering (DIBL) were monitored. A design matrix was generated for devices at each of a number of different shallow S/D junction depths, showing both the IDsat and ΔVT (DIBL) values plotted against the halo or APT implant dose and energy. These design matrices provide an understanding of the acceptable regions of device operation for different profile conditions. Finally, a comparison of the use of a halo implant versus an APT implant is discussed for this low-power technology.
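The design-matrix procedure described above can be sketched in code. The following is a minimal illustrative sketch, not the paper's actual simulation flow: the response-surface model inside `simulate`, the dose/energy ranges, and the 50 mV ΔVT limit are all hypothetical stand-ins for the real device simulations, chosen only so that the toy model reproduces the qualitative trade-off (a heavier implant suppresses DIBL but degrades drive current).

```python
# Hypothetical sketch of the design-matrix analysis: sweep implant dose and
# energy, record (IDsat, delta_VT) at each point, keep only the points that
# satisfy the short-channel (DIBL) constraint, and pick the highest IDsat.
import itertools


def simulate(dose, energy):
    """Toy stand-in for a device simulation.

    Returns (IDsat in uA/um, delta_VT due to DIBL in mV). The linear model
    below is invented for illustration: increasing the implant dose or
    energy suppresses DIBL (smaller delta_VT) but reduces drive current.
    """
    idsat = 600.0 - 0.04 * dose - 0.5 * energy
    dvt = max(120.0 - 0.05 * dose - 0.8 * energy, 0.0)
    return idsat, dvt


def design_matrix(doses, energies):
    """Evaluate every (dose, energy) combination, as in the paper's matrix."""
    return {(d, e): simulate(d, e) for d, e in itertools.product(doses, energies)}


def acceptable_region(matrix, dvt_limit=50.0):
    """Keep only points whose DIBL-induced VT shift meets the constraint."""
    return {k: v for k, v in matrix.items() if v[1] <= dvt_limit}


def best_point(region):
    """Among acceptable points, return the one with the highest IDsat."""
    return max(region.items(), key=lambda kv: kv[1][0])


if __name__ == "__main__":
    doses = [1000, 1500, 2000]   # arbitrary units, hypothetical range
    energies = [20, 40, 60]      # keV, hypothetical range
    matrix = design_matrix(doses, energies)
    region = acceptable_region(matrix)
    (dose, energy), (idsat, dvt) = best_point(region)
    print(f"best: dose={dose}, energy={energy} keV, "
          f"IDsat={idsat:.1f} uA/um, dVT={dvt:.1f} mV")
```

In the paper this sweep is repeated at each shallow S/D junction depth, producing one design matrix per depth; the filtering step above corresponds to reading the acceptable operating region off those matrices.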