There are a number of parameters used to characterize a measurement result for the purpose of specifying its fitness for the intended use. Precision (variability) and accuracy (correctness) are two of the most often used parameters and, like many other characterizing parameters, they often do not have the same meaning to the supplier, buyer, and ultimate user of dimensional metrology instruments used in semiconductor processing. These differences in meaning often arise because human "common sense" about measurements of ordinary-sized objects is frequently misleading when applied to submicrometer-sized objects. For example, there is no universal "ruler" that can be used to measure the size of all submicrometer objects, because dissimilarities between the ruler and the object cause them to be "viewed" differently by light, electron beams, and mechanical probes. However, the basic concepts behind the characterization of measurement results can be carried over from ordinary-sized objects to submicrometer-sized objects if the differences in the metrology applicable to these dimensional regimes are taken into account. This paper summarizes the generally accepted generic metrological meaning(s) and significance of the parameters most commonly used to characterize measurement results, with the aim of clarifying misunderstandings that might otherwise occur between the metrologist and the user of metrological data in the regime of submicrometer-sized features.
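As a minimal illustration of the distinction, precision can be estimated as the spread of repeated measurements and accuracy (bias) as the offset of their mean from a reference value; the readings below are invented for illustration only:

```python
import statistics

def characterize(measurements, reference):
    """Precision as the standard deviation of repeated measurements;
    accuracy (bias) as the offset of their mean from a reference value."""
    mean = statistics.fmean(measurements)
    precision = statistics.stdev(measurements)  # repeatability, 1 sigma
    bias = mean - reference
    return precision, bias

# Assumed: five repeated linewidth readings (nm) vs. a 250 nm reference
readings = [252.1, 251.8, 252.4, 251.9, 252.3]
p, b = characterize(readings, reference=250.0)
print(f"precision = {p:.2f} nm, bias = {b:.2f} nm")  # 0.26 nm, 2.10 nm
```

Note that a measurement can be precise (small spread) while still being inaccurate (large bias), which is exactly the confusion between supplier and user that the paper addresses.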
The mathematical background underlying the science of critical dimension metrology is presented at a level suitable for the semiconductor process engineer with little direct experience in the field. Concomitant with this purpose, we make few assumptions and derive many of the concepts from first principles. Understanding the fundamentals provides a basis for further learning and a chart for navigating the sometimes murky waters of the literature. The process engineer can use this work profitably on its own or as a companion document to industrial1 and international standards2.
During the manufacturing of present-day integrated circuits, certain measurements must be made of the submicrometer structures composing the device with a high degree of precision. Optical microscopy, scanning electron microscopy, and the various forms of scanning probe microscopy are the major microscopical techniques used for submicrometer metrology. New techniques applied to scanning electron microscopy have mitigated some of the limitations of this technique, and time will permit even further improvements. This presentation will review the current state of scanning electron microscope (SEM) metrology in light of many of these recent improvements.
This paper presents a critical review of electrical test methods for determining feature placement with total measurement uncertainties below 10 nm and electrical linewidth for sub-half-micrometer design linewidths with measurement precision below 1 nm. Control of feature placement and control of linewidth have been, and are expected to continue to be, two of the most important challenges in the manufacturing of advanced microelectronic devices. Traditional methods of measuring these parameters suffer from both low measurement speed and high equipment expense. Microelectronic test structures are electrical devices that are used to determine selected tool, process, device, material, or circuit parameters by means of electrical tests. They are supported by a variety of commercial test equipment often found in semiconductor manufacturing facilities. They provide low-cost, post-patterning metrology for determining both feature placement and electrical linewidth. Properly characterized test structures and measurement methods provide an economic means of determining the critical parameters needed to develop, control, and operate the next generation of patterning tools.
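As one concrete illustration of how such structures work, the sketch below shows the arithmetic behind a cross-bridge resistor, a common electrical linewidth structure: sheet resistance from a van der Pauw cross, then linewidth from a bridge resistor. The structure choice and all measured values are assumptions for illustration, not taken from the paper:

```python
import math

def sheet_resistance_vdp(v_cross, i_force):
    """Sheet resistance of a symmetric van der Pauw cross:
    R_s = (pi / ln 2) * V / I."""
    return (math.pi / math.log(2)) * v_cross / i_force

def electrical_linewidth(r_sheet, length, r_bridge):
    """Bridge resistor: R = R_s * L / W  =>  W = R_s * L / R."""
    return r_sheet * length / r_bridge

# Hypothetical measurements (assumed values, for illustration only)
r_s = sheet_resistance_vdp(v_cross=2.27e-3, i_force=1.0e-3)   # ~10.3 ohm/sq
w = electrical_linewidth(r_s, length=100e-6, r_bridge=2.06e3)  # metres
print(f"R_s = {r_s:.2f} ohm/sq, W = {w * 1e6:.3f} um")  # ~0.499 um
```

The attraction noted in the abstract is visible here: both quantities come from simple current-voltage measurements on standard parametric test equipment, with no imaging at all.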
The measurement of critical dimensions of features on integrated circuits and photomasks is modeled as the comparison of the images of the test object and of a standard object in a measuring device. A length measuring instrument is then a comparator. The calibration of the standard and the conditions necessary for a valid comparison are discussed. The principles discussed here apply to many other types of measurement as well.
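In comparator terms, the unknown length is the calibrated length of the standard plus the scaled difference between the two image readings. A minimal sketch, with generic symbols and invented numbers rather than the paper's notation:

```python
def comparator_length(l_standard, reading_test, reading_standard, scale):
    """Length by comparison: L = L_std + (r_test - r_std) / M,
    where M is the instrument scale factor (e.g. pixels per nm)."""
    return l_standard + (reading_test - reading_standard) / scale

# Assumed: a 500 nm certified standard, image readings in pixels,
# and an instrument scale of 0.1 pixel/nm
l = comparator_length(l_standard=500.0, reading_test=58.0,
                      reading_standard=50.0, scale=0.1)
print(l)  # 580.0 nm
```

The formula makes the two conditions for a valid comparison explicit: the standard's calibrated value must be traceable, and the scale factor must be the same for both images, which is precisely what fails when the standard and the test object interact differently with the probe.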
Equipment for pattern generation in the semiconductor industry has long been a critical part of the manufacturing process. As the industry matured and finer features were required on larger exposure fields, the complexity of these machines grew. The requirements of calibration soon outstripped the ability of simple statistics to capture the complexity of, and the control required by, these tools. In response, the industry adopted an analytical method of data analysis: mathematical models describing the operation, design, and behavior of these tools were developed. These models are called machine models, and herein is contained a critical review of the technology to date. The development of machine models is shown from a historical standpoint. First, the concepts of metrology and a clear explanation of the analytical method are presented. Then the development of modeling is traced, including how it was influenced by the automation of metrology in the mid-1980s. The later sections describe models and their application to machine characterization and matching, followed by recent concepts in model building and evaluation. In doing so, it is shown how the concept of machine models is critical to the continued advancement of lithography and equipment development. A final section presents some previously unpublished work on the determination of machine precision. The work is supported by an appendix of derivations for the basic model elements.
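As a minimal sketch of the analytical method, the example below fits the two simplest machine-model terms (translation and magnification) to overlay data by least squares; the model terms and the synthetic data are illustrative assumptions, not any particular tool's model:

```python
def fit_linear_model(xs, dxs):
    """Least-squares fit of dx = T + M*x (translation + magnification),
    the two simplest terms of a machine model, via the normal equations."""
    n = len(xs)
    sx, sd = sum(xs), sum(dxs)
    sxx = sum(x * x for x in xs)
    sxd = sum(x * d for x, d in zip(xs, dxs))
    m = (n * sxd - sx * sd) / (n * sxx - sx * sx)  # magnification error
    t = (sd - m * sx) / n                          # translation offset
    return t, m

# Synthetic overlay data: a 20 nm offset plus a 2 ppm scale error (assumed)
xs = [-100000.0, -50000.0, 0.0, 50000.0, 100000.0]  # field positions, nm
dxs = [20.0 + 2e-6 * x for x in xs]                 # measured dx, nm
t, m = fit_linear_model(xs, dxs)
residuals = [d - (t + m * x) for x, d in zip(xs, dxs)]
print(t, m * 1e6, max(abs(r) for r in residuals))  # 20 nm, 2 ppm, ~0
```

Real machine models add rotation, orthogonality, and higher-order terms per axis, but the principle is the same: attribute systematic structure to named physical causes and leave only random error in the residuals.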
Advanced semiconductor manufacturing processes require tight overlay registration tolerances. These strict overlay performance specifications dictate the wafer level overlay metrology performance required. Achieving a high level of performance from overlay metrology equipment requires attention to all aspects of the measurement process. A typical measurement system configuration is reviewed and elements of optical overlay measurement, as they relate to measurement uncertainty, are discussed in detail. Data analysis techniques, used to quantify tool induced measurement uncertainty, are demonstrated with supporting examples. Process and measurement target induced uncertainties are reviewed. Current developments in both target design and measurement algorithms are proposed to address these uncertainties. Measurement optimization and its role in process control applications and future developments in overlay processing are also discussed.
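One standard data-analysis technique for quantifying tool-induced measurement uncertainty is tool-induced shift (TIS), computed from measurements of the same target at 0° and 180° wafer rotations. A sketch with assumed readings (the paper's own analysis may differ in detail):

```python
def tis_and_corrected(o_0deg, o_180deg):
    """Tool-induced shift (TIS) from overlay readings of the same target
    at 0 and 180 degree wafer rotations. A perfectly symmetric tool
    would give o_180 = -o_0, so any mean offset belongs to the tool."""
    tis = (o_0deg + o_180deg) / 2.0        # tool asymmetry contribution
    corrected = (o_0deg - o_180deg) / 2.0  # overlay with TIS removed
    return tis, corrected

# Assumed overlay readings in nm for one target
tis, ov = tis_and_corrected(o_0deg=12.0, o_180deg=-8.0)
print(tis, ov)  # 2.0 nm TIS, 10.0 nm corrected overlay
```

Rotating the wafer flips the true overlay but not the tool's optical asymmetries, which is why the mean isolates the tool contribution and the half-difference recovers the wafer-level overlay.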
Particles, defects, and microcontamination: the bane of the IC process engineer! Controlling defects during every processing step of semiconductor devices is vital to successful manufacturing of modern chips. The requirements for tight defect control become increasingly severe with each new generation of semiconductors. For a typical 16 MB DRAM process, the total number of defects must be less than one for each 4 cm2 of wafer surface area in order to achieve 70% yield on the wafer.1 Not only must the total number of defects on wafers decrease with each generation, the defect concentration per mask level must be reduced at an ever faster rate due to higher circuit complexity and an increased number of mask levels. These defect reduction requirements are noted here for DRAMs, used as the technology driver, but must also be achieved in other device families such as ASICs and microprocessors.
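The quoted figure can be checked against the simple Poisson yield model Y = exp(-D*A); the choice of model is an assumption here (the cited work may use another), but it reproduces the stated numbers:

```python
import math

def poisson_yield(defect_density, area_cm2):
    """Poisson yield model: Y = exp(-D * A)."""
    return math.exp(-defect_density * area_cm2)

def max_defect_density(target_yield, area_cm2):
    """Largest defect density D (defects/cm^2) meeting a yield target."""
    return -math.log(target_yield) / area_cm2

d = max_defect_density(0.70, 4.0)
print(d * 4.0)                 # ~0.36 defects per 4 cm^2: "less than one"
print(poisson_yield(d, 4.0))   # recovers the 70% yield target
```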
Control of critical dimensions, overlay, and defects is required for lithography processes to be effective. Each area of concern requires appropriate metrology methods for measuring critical parameters. For measuring linewidths, there are optical, electrical, and scanning electron methods. The optimum operating point for the process must be determined for dimensional control, and individual process parameters need to be controlled as well. Similarly for overlay and defects, appropriate metrology methods must be established. Lithography is a situation in which conventional statistical process control techniques cannot be applied naively, and suitable statistical methods must be used.
The photolithography process consists of transferring a pattern optically from a chrome-on-quartz reticle onto a partially processed wafer that has been coated with photoresist. After the resist is developed, the image is transferred into an underlying blanket of material by some chemical and/or thermal step (etch, sinter, implant, growth, dope, etc.). This is repeated about 15 to 20 times, with each iteration using different pattern transfer steps and different equipment. These steps may distort the wafer and its patterns, causing differences between wafers and lots. Despite this variation, each new step must precisely align to the previous layers. To accomplish this, each step of the process and every piece of equipment must be characterized. In this paper, the primary parameters and data analysis techniques that are most commonly tracked and used by photolithography engineers are reviewed.
The primary tools for metrology in semiconductor manufacturing are optical and electron microscopes. In recent years the market share of optical linewidth measurement tools has declined relative to scanning electron microscopes. However, optics remains the method of choice for measuring overlay registration, linewidths on photomasks, linewidths for thin film magnetic heads, and linewidths for larger structures. Scanning electron microscopes are dominant in the submicron, and especially the sub-0.7-micron, regime for linewidth measurement.
This paper will review the field of mathematical modeling for metrology. It will consider light optical and electron optical microscopes; the physics and mathematics for these two types are very different. For light optics the two primary mathematical modeling techniques are eigenmode expansion methods, which apply only to structures with a linear symmetry such as lines, and finite element methods, which may be applied to general structures. For electron microscopes the principal technique is Monte Carlo trajectory analysis.
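To give a flavor of the Monte Carlo approach, the toy skeleton below follows electrons through exponentially distributed free paths with random direction changes and reports the mean final depth. Real SEM models use physical scattering cross sections and energy-loss laws, so every numerical choice here is a schematic assumption:

```python
import math
import random

def mc_mean_depth(n_electrons, mean_free_path, n_steps, seed=1):
    """Toy Monte Carlo trajectory skeleton: each electron takes free-path
    steps of exponentially distributed length, with a random change of
    direction after each scattering event (2-D geometry). Returns the
    mean final depth below the surface. Schematic only."""
    rng = random.Random(seed)
    depths = []
    for _ in range(n_electrons):
        z, theta = 0.0, 0.0  # depth and angle from the surface normal
        for _ in range(n_steps):
            step = rng.expovariate(1.0 / mean_free_path)
            z += step * math.cos(theta)
            theta += rng.uniform(-math.pi / 2, math.pi / 2)  # scatter
        depths.append(z)
    return sum(depths) / len(depths)

print(mc_mean_depth(n_electrons=1000, mean_free_path=5.0, n_steps=20))
```

A production simulator replaces the uniform scattering angle with a screened Rutherford (or Mott) cross section and tracks energy loss per step, but the trajectory-sampling structure is the same.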
For light optical microscopes, models have been applied to reflective and transmitting systems: classical brightfield, confocal, and coherence probe instruments. For scanning electron microscopes, most of the focus has been on secondary electron detector systems and, more recently, on backscattered electron systems.
This paper will also review applications of these models to the problems of accuracy and linearity, and will touch on the complex issue of the inverse scattering problem.
Data collection is implicit in experimentation on and control of semiconductor manufacturing processes, but its effect on the outcome of testing is often ignored. Improper sampling can increase the cost of testing, either by testing too much or by increasing the risk of reaching a wrong conclusion; the costs can be significantly more than the cost of measurement, in some cases approaching several million dollars. This paper reviews methods and patterns of sampling. Issues associated with the nested process characteristics of semiconductor manufacturing are discussed, with specific attention to typical distributions. Sample size effects and recommendations are reviewed. The cost of uncertainty associated with sampling is examined. The paper also includes definitions of commonly used statistics and tests.
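As a concrete instance of the sample-size question, the standard normal-approximation formula n = (z*sigma/E)^2 gives the number of measurements needed to estimate a mean to within a margin of +/-E; the CD numbers below are assumed for illustration:

```python
import math

def sample_size(z, sigma, margin):
    """n = (z * sigma / E)^2, rounded up: number of measurements needed
    to estimate a mean to within +/-E at the confidence implied by z."""
    return math.ceil((z * sigma / margin) ** 2)

# Assumed numbers: CD variability sigma = 10 nm, target margin +/-5 nm,
# ~95% confidence (z = 1.96)
print(sample_size(z=1.96, sigma=10.0, margin=5.0))  # 16 measurements
```

Note the quadratic cost: halving the margin to +/-2.5 nm quadruples the required sample, which is the trade-off between measurement cost and the risk of a wrong conclusion discussed above.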
Pareto analysis has become one of the most important and widely used tools in solving quality problems. Its usefulness extends well beyond the application to defect data: it is applicable to nearly every department in a company, including marketing, production planning, purchasing, sales, maintenance, personnel, and accounting. A proper implementation of this tool requires an understanding of the techniques that have been developed and a knowledge of its limitations. In the microelectronics industry, processing and analysis of defect data relies heavily on Pareto analysis. With inline defect scanning tools becoming faster and more sensitive, a program to monitor inline defectivity on a regular basis is justified. Correlation of this data with end-of-line electrical failure analysis provides the most accurate Pareto of the most important "killer" defect types.
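The mechanics of a Pareto analysis reduce to ranking categories by count and accumulating percentages; a minimal sketch with hypothetical defect data:

```python
def pareto(defect_counts):
    """Rank defect categories by count and attach cumulative percentages,
    the core of a Pareto chart."""
    total = sum(defect_counts.values())
    ranked = sorted(defect_counts.items(), key=lambda kv: kv[1], reverse=True)
    cum, table = 0, []
    for category, count in ranked:
        cum += count
        table.append((category, count, 100.0 * cum / total))
    return table

# Hypothetical inline-inspection defect counts (assumed data)
counts = {"particle": 120, "scratch": 15, "pattern bridge": 45,
          "residue": 12, "missing pattern": 8}
for category, count, cum_pct in pareto(counts):
    print(f"{category:16s} {count:4d} {cum_pct:6.1f}%")
```

In this invented example the top two categories account for over 80% of the defects, the "vital few" that Pareto analysis is designed to expose; the limitation noted above is that raw counts say nothing about which categories are actually yield killers, hence the correlation with end-of-line electrical failure analysis.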