Accurate calculation of edge locations and shapes is often important in industrial machine vision applications. The accuracy of these calculations is affected by the lighting, by the camera and other vision-system hardware, and by the choice of vision algorithms. In this paper we examine the first step in the vision analysis: the formation of an edge's image on the camera's sensor. In particular, we describe how the lighting and the camera's lens affect the location, shape, contrast, and intensity of the "edge" that the camera sees. An edge is assumed to be an abrupt step change in an object's reflectivity, opacity, color, or other visually measurable property. The image of the edge on the camera's sensor can be modelled as a perfect image that has been modified by the lighting and the camera optics. We examine the lighting and lens effects that can cause the image on the sensor to differ from an ideal edge; those that produce measurable differences in typical industrial vision applications are described and quantified. The result is an analytical model of the image formed on the camera's sensor. We use the predictions of this model to estimate the corresponding error contributions in experimental images.
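The kind of model described above can be illustrated with a minimal sketch: an ideal step edge (the abrupt change in reflectivity) convolved with a Gaussian approximation of the lens point-spread function, which smears the step over several pixels. This is a common simplification, not the paper's specific model; the grid size, edge position, and blur width below are assumed values chosen only for illustration.

```python
import math

def gaussian_psf(sigma, half_width):
    """Normalized 1-D Gaussian kernel of length 2*half_width + 1,
    used here as a stand-in for the lens point-spread function."""
    g = [math.exp(-(x * x) / (2.0 * sigma * sigma))
         for x in range(-half_width, half_width + 1)]
    s = sum(g)
    return [v / s for v in g]

def blurred_step_edge(n=101, edge_pos=50, sigma=2.0):
    """Ideal step edge (0 -> 1 at edge_pos) blurred by a Gaussian PSF."""
    # The "perfect image": an abrupt step change in reflectivity.
    edge = [0.0] * edge_pos + [1.0] * (n - edge_pos)
    hw = 3 * math.ceil(sigma)
    psf = gaussian_psf(sigma, hw)
    # Discrete convolution, zero-padded outside the image.
    out = []
    for i in range(n):
        acc = 0.0
        for k, w in enumerate(psf):
            j = i + k - hw
            if 0 <= j < n:
                acc += w * edge[j]
        out.append(acc)
    return out

profile = blurred_step_edge()
# Far from the step the profile sits at its asymptotic levels (0 and 1);
# near edge_pos the ideal step is smeared over a few pixels.
```

Under this simplified model, locating the edge amounts to finding the transition point of the smeared profile, and the blur width controls how strongly neighboring intensity values are mixed.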