A new quality metric for evaluating edges detected by digital image processing algorithms is presented. The metric is a weighted sum of measures of edge continuity, smoothness, thinness, localization, detection, and noisiness. Through a training process, we can design weights that optimize the metric for different users and applications. We have used the metric to compare the results of ten edge detectors applied to edges degraded by varying degrees of blur and varying degrees and types of noise. As expected, the more nearly optimal Difference-of-Gaussians (DOG) and Haralick methods outperform the simpler gradient detectors. At high signal-to-noise ratios (SNRs), Haralick's method is the best choice, although it exhibits a sudden drop in performance at lower SNRs. The DOG filter's performance degrades almost linearly with SNR and remains reasonably high at lower SNRs. The same relative performances are observed as blur is varied. For most of the detectors tested, performance drops with increasing noise correlation. Noise correlated in the same direction as the edge is the most destructive of the noise types tested.
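The weighted-sum structure of the metric can be sketched as follows. This is a minimal illustration only: the six component names come from the abstract, but their score values, the weight values, and the function name are hypothetical assumptions, not the paper's actual definitions or training procedure.

```python
# Hypothetical sketch of a weighted-sum edge quality metric.
# Component scores and weights below are illustrative, not from the paper.

def edge_quality(measures, weights):
    """Combine per-criterion edge scores (each assumed in [0, 1])
    into a single quality value via a weighted sum."""
    assert measures.keys() == weights.keys()
    return sum(weights[k] * measures[k] for k in measures)

# Example per-criterion scores for a detected edge (made-up values).
measures = {
    "continuity": 0.9, "smoothness": 0.8, "thinness": 0.95,
    "localization": 0.7, "detection": 0.85, "noisiness": 0.6,
}
# The paper trains weights per user/application; uniform weights here.
weights = {k: 1 / len(measures) for k in measures}
score = edge_quality(measures, weights)
```

With uniform weights the metric reduces to the mean of the component scores; a training process would instead assign larger weights to the criteria a given user or application values most.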