The effect of focus anisoplanatism on the performance of an astronomical laser guide star (LGS) adaptive optics (AO) system can in principle be reduced if the lowest-order wavefront aberrations are sensed and corrected using a natural guide star (NGS). For this approach to be useful, the noise performance of the wavefront sensor (WFS) used for the NGS measurements must be optimized to enable operation with the dimmest possible source. Two candidate sensors for this application are the Shack-Hartmann sensor and "phase-diverse phase retrieval," a comparatively novel approach in which the phase distortion is estimated from two or more well-sampled, full-aperture images of the NGS measured with known adjustments applied to the phase profile. We present analysis and simulation results on the noise-limited performance of these two methods for a sample LGS AO observing scenario. The common parameters for this comparison are the NGS signal level, the sensing wavelength, the second-order statistics of the phase distortion, and the RMS detector read noise. Free parameters for the two approaches are the Shack-Hartmann subaperture geometry, the focus biases used for the phase-diversity measurements, and the algorithms used to estimate the wavefront. We find that phase-diverse phase retrieval provides consistently superior wavefront estimation accuracy when the NGS signal level is high. For lower NGS signal levels on the order of 10^3 photodetection events, the Shack-Hartmann approach is preferred at an RMS detector read noise level of 5 electrons/pixel, while the phase-diversity approach is preferred in the limit of zero read noise.