So, an optimized security camera has been designed. Now, can it be built? If the as-built performance is to match the nominal design performance, the lens must be made to the optimized prescription given by CODE V. But how precisely must the lens be made? When you hover over a cell in the LDM, you will see that the lens parameter is expressed to 13 places beyond the decimal point. Not all of this precision is needed to specify the lens parameter, but each parameter must be made (see the Third Hiatus) to an accuracy, or tolerance, good enough to meet the performance specifications. This provokes a second question: how do you determine which tolerances are "good enough"? The answer is provided by a process called tolerancing.
It is relatively easy to simulate the effect of one specific lens error by changing that parameter in the nominal design and examining the effect on lens performance. It is much more difficult, however, to think about how multiple lens errors, all occurring at the same time (with random values within their tolerance ranges), will affect the final lens performance. For example, consider a lens system with both a radius error and a thickness error. The performance loss from the radius error might simply add to the loss from the thickness error, or the two errors could even cancel each other out (resulting in a lens with the nominal design performance). Usually, the impact of two tolerance errors falls somewhere between these extremes. How, then, can we have any confidence that we can make a lens assembly (with a large number of tolerances) with acceptable performance? For example, how carefully must we make a lens to meet a specified MTF value across all fields? This is where CODE V and its statistical tolerancing routines can be used to model the manufacturing processes and, based upon the capability of the optical shop, predict the probability of producing a set of lenses with the desired performance (at a desired cost).
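The idea behind statistical tolerancing can be sketched in a few lines of Python. This is only an illustration of the Monte Carlo principle, not CODE V's actual algorithm: the `performance_loss` function and its sensitivity coefficients are invented for the example, standing in for the real (and far more expensive) ray-traced performance evaluation. Each trial draws every error at random within its tolerance range, simulating one as-built lens, and the "yield" is the fraction of simulated builds that still meet the performance budget.

```python
import random

def performance_loss(radius_err, thickness_err):
    """Hypothetical loss model with made-up sensitivity coefficients.

    In a real tolerancing run this would be a ray-traced performance
    metric (e.g., MTF drop); here each error simply contributes a loss
    proportional to the square of its magnitude.
    """
    return 400.0 * radius_err**2 + 900.0 * thickness_err**2

def monte_carlo_yield(radius_tol, thickness_tol, max_loss, trials=100_000):
    """Estimate the fraction of as-built lenses meeting the loss budget.

    Every trial perturbs all parameters simultaneously, so the combined
    effect of the errors -- adding, cancelling, or in between -- emerges
    from the statistics rather than from a worst-case sum.
    """
    passed = 0
    for _ in range(trials):
        r_err = random.uniform(-radius_tol, radius_tol)
        t_err = random.uniform(-thickness_tol, thickness_tol)
        if performance_loss(r_err, t_err) <= max_loss:
            passed += 1
    return passed / trials

# Example: estimated yield for a +/-0.05 radius tolerance and a
# +/-0.02 thickness tolerance against a loss budget of 0.5.
yield_estimate = monte_carlo_yield(0.05, 0.02, max_loss=0.5)
```

Tightening the tolerances raises the predicted yield (and the fabrication cost), which is exactly the trade-off the optical shop's capability data lets CODE V quantify.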