Numerical simulation of overlay metrology targets has become a de-facto standard at advanced technology nodes. While appropriate simulation software is widely available in the industry, along with metrics that allow selection of the best-performing targets, model validation tools are less developed. We present an approach to numerical model validation based on the comparison between target simulation results and on-product overlay measurements. A "simulation-to-measurement" software tool is used in this work to compare the performance metrics and accuracy flags of scatterometry-based overlay targets designed using the KLA-Tencor AcuRate™ simulator for the critical layers of a 12nm FD-SOI FEOL stack and a 22nm FD-SOI BEOL stack. We demonstrate how simulation-to-measurement matching enabled us to verify the model, identify discrepancies between the model and the product stack, and build an improved model that correctly describes the target. The refined target stack was also used for image-based overlay target simulations, which yielded better-performing optical overlay targets as well.
After critical lithography steps, overlay and CD are measured to determine whether the wafers need to be re-worked. Traditionally, overlay metrics are applied per X/Y direction and a CD metric is computed independently. From a design standpoint, however, electrical failure is based on a complex interaction between CD deviations and overlay errors. We propose a method that includes design constraints, where the results of different measurement steps are not judged individually but in a combined way. We illustrate this with a critical design feature consisting of a contact requiring a minimum distance to a neighboring metal line, resulting in much better correlation to yield than traditional methods.
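The combined-disposition idea can be sketched numerically: instead of checking CD and overlay against separate limits, one checks the design constraint they jointly determine. The following Python sketch is illustrative only; the geometry, function names, and numbers are hypothetical and not taken from the study.

```python
# Hypothetical combined CD/overlay disposition: a contact must keep a
# minimum edge-to-edge distance to a neighboring metal line. All names
# and numbers are illustrative, not from the paper.

def edge_to_edge_distance(pitch_nm, contact_cd_nm, line_cd_nm, overlay_nm):
    """Distance from the contact edge to the near edge of the adjacent line.

    The nominal gap shrinks when either CD grows or overlay pulls the
    contact toward the line.
    """
    nominal_gap = pitch_nm - contact_cd_nm / 2 - line_cd_nm / 2
    return nominal_gap - abs(overlay_nm)

def combined_disposition(pitch_nm, contact_cd_nm, line_cd_nm, overlay_nm,
                         min_distance_nm):
    """Pass/fail on the joint design constraint rather than on CD and
    overlay limits judged separately."""
    gap = edge_to_edge_distance(pitch_nm, contact_cd_nm, line_cd_nm,
                                overlay_nm)
    return gap >= min_distance_nm
```

With this criterion, a lot whose CD and overlay each sit inside their individual limits can still be flagged when their combination violates the design rule, which is the behavior correlated with yield in the abstract.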
In the current paper we address three questions relevant to accuracy: 1. Which target design has the best performance and depicts the behavior of the actual device? 2. Which metrology signal characteristics could help distinguish between a target-asymmetry-related overlay shift and a real process-related shift? 3. How does uncompensated asymmetry of the reference-layer target, generated during post-litho processes, affect the propagation of overlay error through different layers? For these accuracy aspects we present the correlation between simulation data, based on the optical properties of the measured stack, and KLA-Tencor Archer overlay measurements on a 28nm product through several critical layers.
According to the ITRS roadmap, the overlay requirement for the 28nm node is 8nm. If we compare this number with the performance quoted by tool vendors for their most advanced immersion systems (< 3nm), there seems to remain a large margin. Does that mean that today's leading-edge fab has an easy life? Unfortunately not, as other contributors affecting overlay are emerging. Mask contributions and so-called non-linear wafer distortions are known effects that can impact overlay quite significantly. Furthermore, it is often forgotten that downstream (post-litho) processes can impact overlay as well. Thus, it can be necessary to compensate for the effects of subsequent processes already at the lithography operation. Within our paper, we will briefly touch on the wafer distortion topic and discuss the limitations of lithographic compensation techniques such as higher-order corrections versus solving the root cause of the distortions. The primary focus will be on the impact of etch processes on pattern placement error. We will show, via typical wafer signatures, how individual layers can be affected differently. However, in contrast to the above-mentioned wafer distortion topic, lithographic compensation techniques can be highly effective here, reducing the placement error significantly towards acceptable levels (see Figure 1). Finally, we will discuss the overall overlay budget for a 28nm contact-to-gate case by taking the impact of the individual process contributors into account.
Overlay specifications are tightening with each lithography technology node. As a result, there is a need to improve
overlay control methodologies to make them more robust and less time- or effort-consuming, but without any
compromise in quality. Two concepts aimed at improving the creation of scanner grid recipes in order to meet ever-tightening
overlay specifications are proposed in this article. Simulations will prove that these concepts can achieve both
goals, namely improving overlay control performance and reducing the time and effort required to do so. While more
studies are needed to fine-tune the parameters to employ, the trends presented in this paper clearly show the benefits.
The Critical Dimension Scanning Electron Microscope (CDSEM) is the traditional workhorse solution for inline process control. Measurements are extracted from top-down images based on secondary-electron collection while scanning the specimen; secondary electrons account for the majority of the detection yield. These images chiefly convey structural information about the specimen surface and carry little material contrast. In some cases there is too much structural information in the image, which can disturb the measurement; in other cases small but important differences between material compounds cannot be detected, as the images are limited by contrast and by the resolution of the primary scanning beam. Furthermore, accuracy in secondary-electron-based metrology is limited by charging. To gather exactly the required information for a given material compound, a technique known from materials-analysis SEMs has been introduced for inline CDSEM analysis and process control: Low-Loss Back-Scattered Electron imaging (LL-BSE). The key to LL-BSE imaging is the collection of only those back-scattered electrons (BSE) from the outermost specimen surface that undergo the least possible energy loss during image generation following impact of the primary beam on the material. With LL-BSE, very good and measurable material distinction and sensitivity can be achieved, even for very low-density material compounds. This paper presents new methods for a faster process-development cycle at reduced cost, based on LL-BSE mass data mining instead of sending wafers for destructive material analysis.
Ever shrinking measurement uncertainty requirements are difficult to achieve for a typical metrology
toolset, especially over the entire expected life of the fleet. Many times, acceptable performance can be
demonstrated during brief evaluation periods on a tool or two in the fleet. Over time and across the rest of
the fleet, the most demanding processes often have measurement uncertainty concerns that prevent optimal
process control, thereby limiting premium part yield, especially on the most aggressive technology nodes.
Current metrology statistical process control (SPC) monitoring techniques focus on maintaining the
performance of the fleet where toolset control chart limits are derived from a stable time period. These
tools are prevented from measuring product when a statistical deviation is detected. Lastly, these charts
are primarily concerned with daily fluctuations and do not consider the overall measurement uncertainty. It
is possible that the control charts implemented for a given toolset suggest a healthy fleet while many of
these demanding processes continue to suffer measurement uncertainty issues. This is especially true when
extendibility is expected in a given generation of toolset. With this said, there is a need to continually
improve the measurement uncertainty of the fleet until it can no longer meet the requirements, at
which point new technology must be considered. This paper explores new methods of analyzing
existing SPC monitor data to assess the measurement performance of the fleet and look for opportunities to
drive improvements. Long term monitor data from a fleet of overlay and scatterometry tools will be
analyzed. The paper also discusses using other methods besides SPC monitors to ensure the fleet stays
matched; a set of SPC monitors provides a good baseline of fleet stability but it cannot represent all
measurement scenarios happening in product recipes. The analyses presented deal with measurement
uncertainty on non-measurement altering metrology toolsets such as scatterometry, overlay, atomic force
microscopy (AFM) or thin film tools. The challenges associated with monitoring toolsets that damage the
sample such as the CD-SEMs will also be discussed. This paper also explores improving the monitoring
strategy through better sampling and monitor selection. The industry also needs to converge on the metrics used to describe the matching component of measurement uncertainty so that a unified approach is
reached on how best to drive the much-needed improvements. In conclusion, there will be a
discussion of automating these new methods [3,4] so that they can complement the existing methods,
providing a better method and system for controlling and driving matching improvements across the fleet.
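One way to mine long-term SPC monitor data for fleet matching, as discussed above, is to compare each tool's long-term mean and spread against the fleet aggregate. The sketch below is a generic illustration, not the authors' method; it assumes a simple stream of (tool_id, value) monitor records, and the record layout is hypothetical.

```python
import statistics
from collections import defaultdict

def fleet_matching(records):
    """Summarize long-term SPC monitor data per tool.

    records: iterable of (tool_id, measured_value) pairs, e.g. one entry
    per monitor measurement. Returns, per tool, its offset from the
    fleet mean (a matching metric) and its long-term sigma (a
    precision/stability metric).
    """
    by_tool = defaultdict(list)
    for tool, value in records:
        by_tool[tool].append(value)
    all_values = [v for vals in by_tool.values() for v in vals]
    fleet_mean = statistics.mean(all_values)
    return {tool: {"offset": statistics.mean(vals) - fleet_mean,
                   "sigma": statistics.stdev(vals)}
            for tool, vals in by_tool.items()}
```

Tools whose offset or sigma drifts relative to the rest of the fleet can then be flagged for matching work even while their individual control charts remain inside limits, which is the gap in conventional SPC monitoring the abstract points out.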
The volume of measurements and the complexity of metrology recipes in state-of-the-art semiconductor manufacturing
have made the conventional manual process of creating the recipes increasingly problematic. To address these
challenges, we implemented a system for automatically creating production metrology recipes. We present results from
the use of this system for CD-SEM and overlay tools in a high-volume manufacturing environment and show that, in
addition to the benefits of reduced engineering time and improved tool utilization, recipes produced by the automated
system are in many respects more robust than the equivalent manually created recipes.
Overlay process control up to and including the 45nm node has been implemented using a small number of large
measurement targets placed in the scribe lines surrounding each field. There is increasing concern that this scheme does
not provide sufficiently accurate information about the variation of overlay within the product area of the device.
These concerns have led to the development of new, smaller targets designed for inclusion within the device area of real
products [1,2]. The targets can be as small as 1-3μm on a side, which is small enough to permit their inclusion inside the
device pattern of many products. They are measured using a standard optical overlay tool, and then calibrated. However,
there is a tradeoff between total measurement uncertainty (TMU) and target size reduction. The calibration
scheme applied also impacts TMU.
We report results from measurements of 3μm targets on 45nm production wafers at both develop and etch stages. An
advantage of these small targets is that at the etch stage they can readily be measured using a SEM, which provides a
method for verifying the accuracy of the measurements.
We show how the 3μm in-chip targets can be used to obtain detailed information for in-device overlay variability and to
maintain overlay control in successive process generations.
Today's semiconductors consist of up to forty structured layers which make up the electric circuit. Since the
market demands more powerful chips at minimal cost, the structure size is decreased with every technology
node. The smaller the features become, the more sensitive the functional efficiency of the chip is with respect to
placement errors. One crucial component for placement errors is the mask which can be viewed as a blueprint of
the layer's structures. Hence, placement accuracy requirements for masks are also tightening rapidly. These days,
mask shops strive to improve their positioning performance. However, more and more effort is required, which
will increase the costs for masks. Therefore, the transfer of mask placement errors onto the wafer is analyzed in
order to check the guidelines which are used for deriving placement error specifications.
In the first section of this paper the basic concepts for measuring placement errors are provided. Then, a method
is proposed which is able to characterize the transfer of placement errors from mask to wafer. This is followed
by two sections giving a thorough statistical analysis of this method. In the fifth section, the connection to
placement accuracy specifications on mask and wafer is established. Finally, the method is applied to a set of
test masks provided by AMTC and printed by AMD.
As a consequence of the shrinking sizes of the integrated circuit structures, the overlay budget shrinks as well. Overlay is
traditionally measured with relatively large test structures which are located in the scribe line of the exposure field, in the
four corners. Although the performance of the overlay metrology tools has improved significantly over time it is
questionable if this traditional method of overlay control will be sufficient for future technology nodes. For advanced
lithography techniques like double exposure or double patterning, in-die overlay is critical and it is important to know
how much of the total overlay budget is consumed by in-die components.
We reported earlier that small overlay targets were included directly inside die areas and good performance was
achieved. This new methodology enables a wide range of investigations and provides insight into processes which
were previously less important or not accessible to metrology. The present work provides actual data from production
designs, instead of estimates, illustrating the differences between scribe-line and in-die registration and overlay.
The influence of the pellicle on pattern placement on the mask and on wafer overlay is studied. Furthermore, the
registration error of the reticles is correlated to wafer overlay residuals.
The influence of scanner-induced distortions (tool to tool differences) on in-die overlay is shown.
Finally, the individual contributors to in-die-overlay are discussed in the context of other overlay contributors. It is
proposed to use in-die overlay and registration results to derive guidelines for future overlay and registration
specifications. It will be shown that new overlay correction schemes which take advantage of the additional in-die
overlay information need to be considered for production.
Design Based Metrology (DBM) implements a novel automation flow, which allows for a direct
and traceable correspondence to be established between selected locations in product designs and
matching metrology locations on silicon wafers. Thus DBM constitutes the fundamental enabler of
Design For Manufacturability (DFM), because of its intrinsic ability to characterize and quantify the
discrepancy between design layout intent and actual patterns on silicon. The evolution of the CDSEM
into a DFM tool, capable of measuring thousands of unique sites, includes three essential
functionalities: (1) seamless integration with the design layout and its coordinate system; (2) new
design-based pattern recognition; and (3) fully automated recipe generation. Additionally, advanced
SEM metrology algorithms are required for complex 2-dimensional features, Line-Edge-Roughness
(LER), etc. In this paper, we consider the overall DBM flow, its integration with traditional CDSEM
metrology and the state-of-the-art in recipe automation success. We also investigate advanced
DFM applications, specifically enabled by DBM, particularly for OPC model calibration and
verification, design-driven RET development and parametric Design Rule evaluation and selection.
In this publication, the contributors to in-field overlay metrology uncertainty have been parsed and quantified on a back
end process and compared with results from a previous front end study<sup>1</sup>. Particular focus is placed on the unmodeled
systematics, i.e. the components which contribute to residuals in a linear model after removal of random errors. These
are the contributors which are often the most challenging to quantify and are suspected to be significant in the model
residuals. The results show that in both back and front end processes, the unmodeled systematics are the dominant
residual contributor, accounting for 60 to 70% of the variance, even when subsequent exposures are on the same
scanner. A higher order overlay model analysis demonstrates that this element of the residuals can be further dissected
into correctible and non-correctible high order systematics. A preliminary sampling analysis demonstrates a major
opportunity to improve the accuracy of lot dispositioning parameters by transitioning to denser sample plans compared
with standard practices. Field stability is defined as a metric to quantify the field-to-field variability of the intrafield signature.
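The decomposition described above, fitting a linear model and asking how much of the overlay variance the residuals retain, can be sketched with ordinary least squares. This is a generic illustration under assumed model terms (translation plus linear x/y terms per direction), not the authors' actual model or data.

```python
import numpy as np

def linear_residual_fraction(x, y, dx, dy):
    """Fit a simple linear overlay model per direction,
    dx = tx + a*x + b*y and dy = ty + c*x + d*y,
    and return the residuals plus the fraction of the raw overlay
    variance left in them (unmodeled systematics plus random error).
    """
    A = np.column_stack([np.ones_like(x), x, y])   # design matrix
    px, *_ = np.linalg.lstsq(A, dx, rcond=None)    # fit x-direction
    py, *_ = np.linalg.lstsq(A, dy, rcond=None)    # fit y-direction
    rx, ry = dx - A @ px, dy - A @ py              # residual vectors
    raw_var = np.var(dx) + np.var(dy)
    resid_var = np.var(rx) + np.var(ry)
    return rx, ry, resid_var / raw_var
```

Applied to dense measurement data, the returned fraction is the quantity the study reports as 60 to 70%; splitting the residuals further into correctable and non-correctable high-order terms would repeat the same fit with a higher-order design matrix.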
In this publication, the contributors to in-field overlay metrology uncertainty have been parsed and quantified in a specific case study. Particular focus is placed on the unmodeled systematics, i.e. the components which contribute to residuals in a linear model after removal of random errors. These are the contributors which are often the most challenging to quantify and are suspected to be significant in the model residuals. The results show that even in a relatively "clean" front end process, the unmodeled systematics are the dominant residual contributor, accounting for 60 to 70% of the variance. Given the above results, new sampling and modeling methods are proposed which have the potential to improve the accuracy of modeled correctibles and lot dispositioning parameters.
In this paper we investigate the impact of bake plate temperature variability throughout the entire bake trajectory on resulting critical dimension. For a poorly-controlled bake plate, it is found that the correlation between the temperature profile and CD distribution is high throughout the entire bake cycle, including the steady state sector. However, for a well-controlled, multiple-zone bake plate, the correlation is only significant during the transient heating sector, since in those cases the steady state plate behavior has already been optimized for CDU performance. An estimate of the potential improvement yet to be gained by improvement of transient heating uniformity is calculated.
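A minimal version of the correlation analysis described above would, at each time sample of the bake trajectory, correlate the per-site plate temperatures with the final per-site CDs. The data layout below is assumed for illustration, not taken from the paper.

```python
import math

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    den = math.sqrt(sum((u - ma) ** 2 for u in a) *
                    sum((v - mb) ** 2 for v in b))
    return num / den

def cd_temperature_correlation(temps_by_time, final_cds):
    """Correlation trace r(t) over the bake trajectory.

    temps_by_time: list of per-site temperature lists, one per time sample.
    final_cds: final CD per site, in the same site order.
    """
    return [pearson(site_temps, final_cds) for site_temps in temps_by_time]
```

On a well-controlled plate, the trace would stay near zero during steady state and become significant only in the transient heating sector, matching the behavior the abstract reports.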
Overlay specifications are getting tighter and lithographic processes are coming close to their limits. Minimal process changes can occasionally lead to overlay excursions. We explore the use of advanced query and multivariate analysis techniques to address overlay issues in an advanced production environment, demonstrating them in four case studies: identifying problem overlay recipes, comparing sources of variation in backend processing, identifying lithography tool issues, and overlay tool monitoring. Because of the large number of possible filter combinations, several simple queries were used as starting points to explore the existing overlay database systematically. The goal of this systematic evaluation of the available information was to find the most efficient methods to analyze and identify specific overlay problems. During this screening process, device-, layer-, and exposure-tool-specific metrics were found. For the most important findings, the data filtering was refined in a second stage, and additional sources of information were incorporated for verification and to draw correct conclusions. Standardized sets of queries can be used to monitor the lithographic process or to quickly pinpoint root causes. We show that one can efficiently identify process, tool, and metrology sources of variation.
As overlay budgets shrink with design rules, the importance of overlay metrology accuracy increases. We have investigated the overlay accuracy of a 0.18μm design rule copper dual-damascene process by comparing the overlay metrology results at the After Develop (DI) and After Etch (FI) stages. The comparisons were done on five process layers on production wafers, while ensuring that the DI and FI measurements were always done on the same wafer. In addition, we measured the in-die overlay on one of the process layers (Poly Gate) using a CD-SEM and compared the results to the optical overlay metrology in the scribe line. We found that a serious limitation to in-die overlay calibration was the lack of suitable structures measurable by CD-SEM. We present quantitative results from our comparisons, as well as a recommendation to incorporate CD-SEM-measurable structures in the chip area in future reticle designs.
According to the SIA roadmap, an overlay of 65nm is necessary for state-of-the-art 0.18μm processes. To meet such tight requirements it is necessary to know the magnitude of all contributions, to understand possible interactions, and to drive every individual overlay component to its ultimate lower limit. In this experimental study we evaluate the impact of different contributions on the overall overlay performance in a fab equipped exclusively with ASML step-and-scan systems. First we discuss the overlay performance of advanced step-and-scan systems in a mix-and-match scenario, focusing on single-machine overlay, long-term stability, and multiple-machine matching. We show that both distortion and stage differences between different tools are typically less than 22nm, justifying a multiple-machine scenario without significant loss of overlay performance. Next, we discuss the impact of layer deposition and CMP. We include shallow trench isolation, tungsten CMP, conventional aluminum wiring, and copper dual-damascene technology in our examinations. In particular, we discuss the pros and cons of using a zero-layer-mark approach, compared to alignment on marks formed in certain layers for direct layer-to-layer alignment. Furthermore, we examine the performance of ASML's 'through-the-lens' (TTL) alignment system, with which overlay becomes as small as 6nm. For marks directly affected by CMP processes, the technology impact can be controlled to within 13nm. We show that, even in a scenario with multiple tools matched to each other, where alignment marks are directly affected by a CMP process step and the standard TTL alignment system is used, the overall overlay can be controlled to within 60nm. Using the ATHENA alignment system, further improvement is possible.
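When individual overlay contributors such as those discussed above (tool-to-tool distortion and stage differences, CMP-affected mark impact, alignment performance) can be treated as independent, they are conventionally combined in quadrature toward an overall budget. The sketch below shows that root-sum-square combination; the example numbers are illustrative, not the study's actual budget breakdown.

```python
import math

def rss_budget(contributors_nm):
    """Root-sum-square combination of independent overlay contributors,
    each given in nm. Assumes the contributors are uncorrelated."""
    return math.sqrt(sum(c * c for c in contributors_nm))
```

Because contributions add quadratically, the largest term dominates: halving a small contributor barely moves the total, which is why budget work concentrates on the biggest components first.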
Currently, most production fabs use critical dimension (CD) measurements as their primary means of process control when printing lines, spaces, and contacts. Historically, this has been adequate to control the lithography and etch processes and produce reasonable yields. However, as the industry moves from 0.25 micrometer manufacturing to 0.18 micrometer and beyond, it is becoming increasingly obvious that CD measurements alone do not provide enough information about the printed structures. As geometries shrink, slight changes in shape and profile can significantly affect the electrical characteristics of the circuit while maintaining the same CD value. In this paper, we describe a method which, in conjunction with CD measurements, better characterizes the circuit structures and therefore provides valuable feedback about the process. This method compares stored image and linescan information from a 'golden' (correctly processed) structure to that of the structure being measured. Based on the collected data, it is possible to distinguish between different profiles and determine whether a process shift has occurred, even when the measured CD remains within specification. The correlation score therefore provides an additional constraint that better defines the true process window and an additional flag for process problems. Without this information, the process may not be truly optimized, or a shift may occur that is not detected in a timely manner, resulting in lost yield and revenue. This data collection has been implemented in production on a local interconnect lithography process. Before the correlation information was available, it was very difficult to detect scumming within the LI trench, and identifying problem lots was a time-consuming and labor-intensive procedure.
The correlation scores, collected automatically and concurrently with the CD measurement, allowed tracking through the SPC chart and the automatic flagging of problems while the lot was still in the photolithography module. The result has been faster feedback control and thus less scrap material.
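The correlation-score idea can be illustrated with a normalized correlation between a stored 'golden' linescan and a measured one. This sketch assumes simple one-dimensional linescan arrays and is an illustration of the concept, not the tool vendor's actual algorithm.

```python
import math

def correlation_score(golden, measured):
    """Normalized correlation of a measured linescan against a 'golden'
    reference: 1.0 for an identical profile shape (invariant to offset
    and gain), lower values flag profile changes such as scumming even
    when the extracted CD is unchanged."""
    n = len(golden)
    mg, mm = sum(golden) / n, sum(measured) / n
    num = sum((g - mg) * (m - mm) for g, m in zip(golden, measured))
    den = math.sqrt(sum((g - mg) ** 2 for g in golden) *
                    sum((m - mm) ** 2 for m in measured))
    return num / den
```

In production use, such a score would be collected alongside each CD measurement and trended on an SPC chart with its own control limit, so a profile shift flags the lot while it is still in the photolithography module.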