Edge detection is the process of detecting and representing the presence and locations of image signal discontinuities; it serves as the basic transformation of signals into symbols and influences the performance of all subsequent processing. In general, edge detection has two main steps: filtering, and detection and localization. In the first step, finding an optimal filter scale is an ill-posed problem, especially when a single, global scale is used over the entire image. A multi-resolution description of the image, which can fully represent image features occurring over a range of scales, is therefore used: a combination of Gaussian filters at different scales ameliorates the single-scale issue. In the second step, edge detectors have often been designed to capture ideal step functions in the image data, but real image signal discontinuities deviate from this ideal form. Three types of deviation from the step function, corresponding to distortions that occur in natural images, are examined: the impulse, ramp, and sigmoid functions, which respectively represent narrow line signals, a simplified blur effect, and a more accurate blur model. From this analysis, general rules for edge detection are developed based on the classification of edge types into four categories: step, impulse, ramp, and sigmoid. Experimental performance analysis supports that the proposed multi-resolution edge detection algorithm with edge pattern analysis leads to more effective edge detection and more accurate localization.
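To make the two steps concrete, the sketch below models the four edge types in one dimension and applies Gaussian smoothing at several scales before taking the gradient. This is a minimal illustration, not the paper's implementation: the function names, the parameter values, and the choice of gradient magnitude as the detection response are all assumptions made here for clarity.

```python
# Illustrative sketch only: 1-D models of the four edge types discussed
# above, plus a simple multi-scale Gaussian filtering step. Names and
# parameters are hypothetical, not taken from the paper.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def step_edge(x):
    """Ideal step: an abrupt intensity jump at x = 0."""
    return (x >= 0).astype(float)

def impulse_edge(x, width=1.0):
    """Impulse: a narrow line signal of the given width."""
    return (np.abs(x) < width / 2).astype(float)

def ramp_edge(x, width=4.0):
    """Ramp: a step blurred into a linear transition (simplified blur)."""
    return np.clip(x / width + 0.5, 0.0, 1.0)

def sigmoid_edge(x, sigma=2.0):
    """Sigmoid: a smooth transition, modeling blur more accurately."""
    return 1.0 / (1.0 + np.exp(-x / sigma))

def multiscale_gradient(signal, scales=(1.0, 2.0, 4.0)):
    """Smooth the signal with Gaussians at several scales and return the
    gradient magnitude at each scale, so that no single global scale is
    relied on for detection and localization."""
    return [np.abs(np.gradient(gaussian_filter1d(signal, s))) for s in scales]

x = np.linspace(-10, 10, 201)
scales = (1.0, 2.0, 4.0)
for s, r in zip(scales, multiscale_gradient(sigmoid_edge(x), scales)):
    print(f"scale {s}: peak response {r.max():.3f} at x = {x[r.argmax()]:.2f}")
```

Under this sketch, a detector could compare how the per-scale responses vary to decide which of the four patterns best explains a local discontinuity; the actual classification rules are developed in the body of the paper.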