Small target detection and tracking are important for laser radars in many applications, such as laser weapons, directed infrared countermeasures (DIRCM), fire control, target recognition, and free-space laser communications. Detection and tracking performance depends on the detection mode, the SNR, the target signal statistics, beam jitter, and turbulence-induced irradiance variations. We present results for the root mean square (rms) tracking error versus SNR, primarily for direct-detection systems using a strong glint return. For general signal and noise probability density functions (pdfs), analytical expressions for the mean and variance of the rms tracking error are difficult to obtain. We have therefore used numerical simulations to illustrate how the pdf and the SNR affect the tracking accuracy. Gamma distributions and other pdfs can be used to characterize the signal statistics and give a first estimate of tracking performance. The results are presented as tracking error versus the angular spot size of the laser beam in the tracking-detector plane. We also investigate beam-size optimization for target detection and for "power in bucket," i.e., maximizing the laser energy delivered to the target. We find that there is an optimum beam size (w) for a given rms jitter (σ) and that the optimum ratio w/σ (minimizing the false alarm rate for a given detection probability Pd) typically falls in the range 1 to 3, depending on the required probability of detection and the pdf representative of the application.
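The beam-size optimization described above can be illustrated with a minimal Monte Carlo sketch: a Gaussian beam of radius w is pointed at a target with 2-D Gaussian jitter of rms σ per axis, and a "detection" occurs when the normalized on-target intensity exceeds a threshold. The threshold value, units, and grid of beam sizes below are assumptions chosen for illustration, not values from the paper; the point is only that Pd peaks at an intermediate w/σ rather than at the smallest (highest-peak-intensity) or largest (most-jitter-tolerant) beam.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0        # rms pointing jitter per axis (arbitrary angular units)
n = 200_000        # number of Monte Carlo pointing realizations
threshold = 0.2    # required normalized intensity for detection (assumed value)

# 2-D Gaussian pointing error; r2 is the squared radial miss distance
x = rng.normal(0.0, sigma, n)
y = rng.normal(0.0, sigma, n)
r2 = x**2 + y**2

best_w, best_pd = None, -1.0
for w in np.linspace(0.5, 6.0, 56):
    # On-target intensity of a Gaussian beam of 1/e^2 radius w,
    # normalized so the on-axis peak scales as 1/w^2 (fixed total power)
    inten = (1.0 / w**2) * np.exp(-2.0 * r2 / w**2)
    pd = np.mean(inten > threshold)   # empirical detection probability
    if pd > best_pd:
        best_w, best_pd = w, pd

print(f"optimum w/sigma ~ {best_w / sigma:.2f}, Pd ~ {best_pd:.2f}")
```

With these assumed parameters the optimum falls near w/σ ≈ 1.4, consistent with the range 1 to 3 quoted in the abstract: a narrower beam has a higher peak intensity but misses the jittering target too often, while a wider beam spreads the energy below the detection threshold.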