Shrinking on-product overlay (OPO) budgets in advanced technology nodes require more accurate overlay measurement and better measurement robustness to process variability. Pupil-based accuracy flags have been introduced to the scatterometry-based overlay (SCOL) system to evaluate the performance of a SCOL measurement setup. Wavelength Homing is a new robustness feature enabled by the continuous tunability of advanced SCOL systems, which use a supercontinuum laser light source in combination with a flexible bandpass filter. Inline process monitoring using accuracy flags allows for detection, quantification and correction of shifts in the optimal measurement wavelength. This work demonstrates the benefit of Wavelength Homing in overcoming overlay inaccuracy caused by process changes and in restoring the OPO and residual levels of the original recipe.
As semiconductor technology nodes keep shrinking, ever-tightening on-product overlay (OPO) budgets coupled with continuous process development and improvement make it critical to have a robust and accurate metrology setup. Process monitoring and control is becoming increasingly important to achieve high-yield production. In recently introduced advanced overlay (OVL) systems, a supercontinuum laser source is applied to facilitate the collection of overlay spectra and increase measurement stability. In this paper, an analysis methodology is proposed that couples the measured overlay spectra with overlay simulation to extract exact process information from the spectra. This paper demonstrates the ability to use overlay spectra to capture and quantify process variation, which in turn can be used to calibrate the simulation stacks used to create the SCOL (scatterometry-based overlay) and AIM overlay metrology targets, and can be fed into the fab for process monitoring and improvement.
We demonstrate the high-volume manufacturing feasibility of meeting the 7 nm technology node overlay correction requirements. This state-of-the-art overlay control is achieved by (i) overlay sampling optimization and advanced modeling, (ii) alignment and advanced process control optimization, (iii) multiple-target overlay optimization, and (iv) heating control. We also discuss further improvements in overlay control for the 7 nm technology node and beyond, including computational metrology, overlay matching control between extreme ultraviolet and optical tools, high-order alignment correction, tool stability improvement, and advanced heating control.
Overlay metrology setup today faces a continuously changing landscape of process steps. During Diffraction Based Overlay (DBO) metrology setup, many different metrology target designs are evaluated in order to cover the full process window. The standard method for overlay metrology setup consists of single-wafer optimization in which the performance of all available metrology targets is evaluated. Without the availability of external reference data or multiwafer measurements it is hard to predict the metrology accuracy and robustness against process variations which naturally occur from wafer-to-wafer and lot-to-lot. In this paper, the capabilities of the Holistic Metrology Qualification (HMQ) setup flow are outlined, in particular with respect to overlay metrology accuracy and process robustness. The significance of robustness and its impact on overlay measurements is discussed using multiple examples. Measurement differences caused by slight stack variations across the target area, called grating imbalance, are shown to cause significant errors in the overlay calculation in case the recipe and target have not been selected properly. To this point, an overlay sensitivity check on perturbations of the measurement stack is presented for improvement of the overlay metrology setup flow. An extensive analysis on Key Performance Indicators (KPIs) from HMQ recipe optimization is performed on µDBO measurements of product wafers. The key parameters describing the sensitivity to perturbations of the measurement stack are based on an intra-target analysis. Using advanced image analysis, which is only possible for image plane detection of μDBO instead of pupil plane detection of DBO, the process robustness performance of a recipe can be determined. Intra-target analysis can be applied for a wide range of applications, independent of layers and devices.
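The grating-imbalance discussion above rests on the standard two-bias DBO overlay calculation, which can be sketched as follows. The linear asymmetry model A = K·(OV + bias) and the bias value are illustrative assumptions, not the tool's actual algorithm; stack perturbations that change K differently between pads are exactly the kind of error source the intra-target analysis is meant to flag.

```python
# Minimal sketch of two-bias diffraction-based overlay extraction.
# Assumes a linear signal model A = K * (OV + bias) for each pad,
# with pads intentionally biased by +d and -d.
def dbo_overlay(a_plus, a_minus, d):
    """Overlay from the +1/-1 order asymmetries of two pads biased +d and -d.

    a_plus:  asymmetry of the pad with bias +d
    a_minus: asymmetry of the pad with bias -d
    d:       programmed bias (same units as the returned overlay)
    """
    # (a_plus + a_minus) = 2*K*OV and (a_plus - a_minus) = 2*K*d,
    # so the unknown sensitivity K cancels in the ratio.
    return d * (a_plus + a_minus) / (a_plus - a_minus)
```

Because the sensitivity K cancels only when it is identical for both pads, a local stack variation across the target area (grating imbalance) breaks the cancellation and biases the computed overlay.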
We demonstrate a novel method to establish the root cause of an overlay excursion using optical scatterometry metrology. A scatterometry overlay target consists of four cells (two per direction) of grating-on-grating structures that are illuminated with a laser; the diffracted orders are measured in the pupil plane within a certain aperture range. State-of-the-art algorithms, using symmetry considerations over the targets, extract the overlay between the two gratings. We exploit the optical properties of the target to extract further information from the measured pupil images, in particular information that may be related to a process change leading to an overlay excursion. Root Cause Analysis (RCA) is being developed to identify different kinds of process variations (either within a wafer or between wafers) that may indicate overlay excursions. In this manuscript, we demonstrate a collaboration between Globalfoundries and KLA-Tencor to identify a symmetric process variation using scatterometry overlay metrology and the RCA technique.
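The symmetry considerations mentioned above can be illustrated with a simple decomposition of a pupil image about the pupil center. This is a minimal sketch for intuition only, not the tool's actual algorithm: the antisymmetric part carries overlay-like information, while changes in the symmetric part can flag symmetric process variation.

```python
import numpy as np

def pupil_symmetry_split(pupil):
    """Split a pupil image into symmetric and antisymmetric parts about
    the pupil center (illustrative decomposition, not the tool algorithm).

    Returns (sym, antisym) with sym + antisym == pupil.
    """
    flipped = pupil[::-1, ::-1]          # 180-degree rotation about center
    sym = 0.5 * (pupil + flipped)        # point-symmetric component
    antisym = 0.5 * (pupil - flipped)    # point-antisymmetric component
    return sym, antisym
```

Tracking the symmetric component across the wafer, or between wafers, gives a signature that responds to symmetric stack changes without being confounded by the overlay signal itself.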
In recent years, overlay (OVL) control schemes have become more complicated in order to meet the ever-shrinking margins of advanced technology nodes. As a result, new challenges must be addressed for effective run-to-run OVL control. This work addresses two of these challenges with new advanced analysis techniques: (1) sampling optimization for run-to-run control and (2) the bias-variance trade-off in modeling. The first challenge in a high-order OVL control strategy is to optimize the number of measurements and their locations on the wafer, so that the "sample plan" of measurements provides high-quality information about the OVL signature on the wafer with acceptable metrology throughput. We solve this trade-off between accuracy and throughput by using a smart sampling scheme which utilizes various design-based and data-based metrics to increase model accuracy and reduce model uncertainty while avoiding wafer-to-wafer and within-wafer measurement noise caused by metrology, scanner or process. Such a sampling scheme, combined with an advanced field-by-field extrapolated modeling algorithm, helps to maximize model stability and minimize on-product overlay (OPO). Second, the use of higher-order overlay models means more degrees of freedom, which enables increased capability to correct complicated overlay signatures but also increases sensitivity to process- or metrology-induced noise. This is known as the bias-variance trade-off: a high-order model that minimizes the bias between the modeled and raw overlay signature on a single wafer will also show higher variation from wafer to wafer or lot to lot, unless an advanced modeling approach is used. In this paper, we characterize the bias-variance trade-off to find the optimal scheme.
The sampling and modeling solutions proposed in this study are validated by advanced process control (APC) simulations to estimate run-to-run performance, by lot-to-lot and wafer-to-wafer model term monitoring to estimate stability, and ultimately by high-volume manufacturing tests that monitor OPO using densely measured OVL data.
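The bias-variance trade-off described above can be made concrete with a small sketch: fitting a polynomial wafer-level overlay model with a ridge penalty, where the regularization strength is the knob that trades bias (underfitting the signature) against variance (amplifying measurement noise). The polynomial basis and ridge formulation are illustrative assumptions, not the paper's proprietary modeling algorithm.

```python
import numpy as np

def fit_overlay_model(xy, ovl, order=3, ridge=1e-3):
    """Fit a 2-D polynomial overlay model at measured wafer sites.

    xy:    (n, 2) array of site coordinates (normalized to ~[-1, 1])
    ovl:   (n,) measured overlay at each site
    order: total polynomial order of the model
    ridge: regularization strength; larger values lower wafer-to-wafer
           variance of the fit at the cost of more bias

    Returns the modeled overlay evaluated at the measured sites.
    """
    # Design matrix with all monomials x^i * y^j of total order <= order.
    cols = [xy[:, 0] ** i * xy[:, 1] ** j
            for i in range(order + 1) for j in range(order + 1 - i)]
    A = np.column_stack(cols)
    # Ridge-regularized normal equations: (A^T A + ridge*I) c = A^T y.
    coef = np.linalg.solve(A.T @ A + ridge * np.eye(A.shape[1]), A.T @ ovl)
    return A @ coef
```

Sweeping `ridge` (or `order`) while monitoring single-wafer residuals against wafer-to-wafer model-term variation is one simple way to locate the optimum the abstract refers to.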
With the introduction of N2x and N1x process nodes, leading-edge factories are facing the challenging demands of shrinking design margins. Previously uncorrected high-order signatures, and uncompensated temporal changes of high-order signatures, carry an important potential for improvement of on-product overlay (OPO). Until recently, static corrections per exposure (CPE), applied separately from the main APC correction, have been the industry standard for critical layers. This static correction is set up once per device and layer and then updated periodically or when a machine change point generates a new overlay signature. This setup is non-ideal for two reasons. First, any drift or sudden shift in tool signature between two CPE updates can cause worse OPO and a higher rework rate or, even worse, lead to yield loss at end of line. Second, these corrections are made from full-map measurements that can exceed 1,000 measurements per wafer.
Advanced overlay control algorithms utilizing run-to-run (R2R) CPE can be used to reduce on-product overlay signatures in high-volume manufacturing (HVM) environments. In this paper, we demonstrate the results of an R2R CPE control scheme in HVM. The authors show an improvement of up to 20% in OPO mean + 3 sigma values on several critical immersion layers at the 28 nm and 14 nm technology nodes, and a reduction of out-of-spec residual points per wafer (validated on full maps). These results are attained by closely tracking process tool signature changes by means of APC, with an affordable metrology load that is significantly smaller than full-wafer measurements.
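A common way to track slowly drifting tool signatures with a reduced metrology load, as described above, is an exponentially weighted moving average (EWMA) update of the per-field correction state. The sketch below is a generic EWMA run-to-run filter under assumed scalar per-field residuals; the actual APC filter, field keys and smoothing factor are assumptions for illustration.

```python
def update_cpe(state, measured, lam=0.3):
    """One EWMA run-to-run update of per-field overlay corrections.

    state:    dict mapping field id -> current correction estimate
    measured: dict mapping field id -> newly measured residual overlay
    lam:      EWMA smoothing factor in (0, 1]; higher values react
              faster to signature shifts but pass through more noise

    Returns the updated state; fields not measured this lot keep
    their previous estimate.
    """
    updated = dict(state)
    for field, residual in measured.items():
        prev = updated.get(field, 0.0)
        updated[field] = (1.0 - lam) * prev + lam * residual
    return updated
```

Because each lot only needs to refresh the fields it sampled, this kind of filter converges to the full-map signature over several lots while measuring far fewer points per wafer.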
As photolithography continues with 193 nm immersion multiple-patterning technologies for leading-edge HVM process nodes, the production overlay requirement for critical layers in logic devices has almost reached the scanner hardware performance limit. To meet these extreme overlay requirements in an HVM production environment, this study investigates a new integrated overlay control concept for leading-edge technology nodes that combines the run-to-run (R2R) linear or high-order control loop, the periodic field-by-field or correction-per-exposure (CPE) wafer process signature control loop, and the scanner baseline control loop into a single integrated overlay control path through the fab host APC system. The goal is to meet the fab requirements for overlay performance, lower the cost of ownership, and provide freedom of control methodology. In this paper, a detailed implementation of this concept is discussed, along with some preliminary results.
As leading-edge lithography moves to advanced nodes in a high-mix, high-volume manufacturing environment, automated control of critical dimension (CD) within the wafer has become a requirement. Current control methods to improve CD uniformity (CDU) generally rely on field-by-field exposure corrections applied via factory automation or through a scanner sub-recipe. Such CDU control methods are limited to the lithography step and cannot be extended to the etch step. In this paper, a new method to improve CDU at the post-etch step by optimizing exposure at the lithography step is introduced. This solution utilizes GLOBALFOUNDRIES' factory automation system and KLA-Tencor's K-T Analyzer as the infrastructure to calculate and feed the necessary field-by-field exposure corrections back to the scanner, so as to achieve optimal CDU at the post-etch step. CDs at the post-lithography and post-etch steps are measured by scatterometry metrology tools and used by K-T Analyzer as the input for correction calculations. This paper explains in detail the philosophy as well as the methodology behind this novel CDU control solution. In addition, applications and use cases are reviewed to demonstrate the capability and potential of the solution. The feasibility of adopting this solution in a high-mix, high-volume manufacturing environment is discussed as well.
As leading-edge lithography moves to advanced nodes, better within-wafer critical dimension (CD) control is required. Current methods generally apply per-field exposure corrections via factory automation or via sub-recipe to improve CD uniformity. KLA-Tencor has developed a method to provide CD uniformity (CDU) control using a Focus/Exposure (F/E) model generated from a representative process. Per-field exposure corrections can be applied back to the scanner through factory automation so as to improve CD uniformity. The CDU improvement can be observed at either the post-lithography or the post-etch metrology step. In addition to corrections, the graphical K-T Analyzer interface also facilitates focus/exposure monitoring at the extreme wafer edge. This paper explains the KT CDFE method and its application in a production environment. Run-to-run focus/exposure monitoring is carried out on both monitor and production wafers to control the wafer process and/or the scanner fleet. CDU improvement opportunities are considered as well.
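The core arithmetic behind the per-field exposure corrections in the two abstracts above can be sketched as follows: given a target CD and a locally linear CD-versus-dose sensitivity, the dose offset per field follows from the measured CD error. The function name, field keys and linear-sensitivity assumption are illustrative; the actual F/E model in K-T Analyzer is more elaborate.

```python
def field_dose_corrections(cd_by_field, cd_target, dcd_ddose):
    """Per-field exposure-dose offsets from measured CD errors.

    cd_by_field: dict mapping field id -> measured CD (nm), e.g. post-etch
    cd_target:   target CD (nm)
    dcd_ddose:   assumed linear sensitivity dCD/dDose in nm per mJ/cm^2
                 (typically negative for positive-tone resist lines)

    Returns dict mapping field id -> dose offset (mJ/cm^2) to feed back
    to the scanner.
    """
    # Linearized inversion: delta_dose = (CD_target - CD_measured) / (dCD/dDose)
    return {field: (cd_target - cd) / dcd_ddose
            for field, cd in cd_by_field.items()}
```

Feeding post-etch CDs into this inversion, rather than post-lithography CDs, is what lets the lithography-step correction compensate etch-induced nonuniformity as well.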
As overlay performance and accuracy requirements become tighter, the impact of process parameters on the target signal becomes more significant. Traditionally, to choose the optimum overlay target, several candidates are placed in the kerf area. The candidate targets are tested under different process conditions before the target to be used in mass production is selected. These target variants are left on the mass-production mask and, although they will not be used for overlay measurements, they still consume kerf real estate. To improve the efficiency of this process we propose the KTD (KLA-Tencor Target Designer), an easy-to-use system that enables the user to select the optimum target based on advanced signal simulation. Implementing the KTD in production is expected to save 30% of kerf real estate thanks to a more efficient target design process, as well as reduced engineering time.
In this work we demonstrate the capability of the KTD to simulate the Archer signal in the context of advanced DRAM processes. For several stacks we compare simulated target signals with Archer100 signals. We demonstrate the robustness feature of the KTD application, which enables the user to test target sensitivity to process changes. The results indicate the benefit of using the KTD in the target optimization process.
With decreasing pattern sizes, the absolute size of acceptable pattern deviations decreases. For mask-makers, a new technology requires a review of which mask design variations print on the wafer under production illumination conditions and whether these variations can be found reliably (100%) with the current inspection tools. As defect dispositioning is performed with an AIMS tool, the critical AIMS values above which a defect prints as lithographically significant on the wafer need to be determined. In this paper we present a detailed sensitivity analysis for programmed defects on two different KLA 5xx tools employing the P90 pixel at various sensitivity settings in die-to-die transmitted mode. Comparing the inspection results with the wafer prints of the mask under production illumination, it could be shown that all critical design variations are reliably detected using a state-of-the-art tool setup. Furthermore, AIMS measurements on defects of increasing defect area were taken for various defect categories under the same illumination conditions as for the wafer prints. The measurements were evaluated in terms of AIMS intensity variation (AIV). It could be shown that the AIMS results exhibit a linear behavior when plotted against the square-root area (SRA) of the defects on the mask, as obtained from mask SEM images. A consistent lower AIV value was derived for all defect categories.
This paper presents first results of a defect printability study for the 70 nm and 90 nm technology nodes. Two 6% halftone test masks with dense line/space (l/s) and contact hole (CH) structures containing programmed defects were exposed under different production illumination conditions. The resulting data were compared with respect to mask defect sizes, Aerial Image Measurement System (AIMS) values, and mask defect inspection sensitivity. As expected, over- and under-sized features exhibit the highest printability and AIMS intensity deviation. No difference was found in the lithographic behavior of dark and clear extensions.
In addition to the determination of the print-critical AIMS values, the programmed-defect masks were used for the evaluation of a KLA 52x inspection system. The performance of two detection pixels, P125 and P90, in combination with two inspection modes, die-to-die transmitted (d2dT) and die-to-die reflected (d2dR), was investigated on 90 nm and 70 nm dense l/s and contact hole areas with respect to the print results. Over- and under-sized small dense structures, as well as dark and clear defects centered in a clear or dark structure, are challenging for the new inspection tool. For dense contact hole arrays, d2dR shows better performance than d2dT.