In recent years, the lithographic printability of overlay metrology targets for memory applications has emerged as a significant issue. Lithographic illumination conditions such as extreme dipole, required to achieve the tightest possible pitches in DRAM, pose a significant process window challenge to metrology target design. Furthermore, the design is also required to track scanner-aberration-induced pattern placement errors of the device structure. Previous work<sup>iii</sup> has shown that the above requirements have driven a design optimization methodology which needs to be tailored for every lithographic and integration scheme, in particular self-aligned double and quadruple patterning methods. In this publication we report the results of a new target design technique and show example target structures which, while achieving the requirements specified above, address a further critical design criterion: process resilience.
In this publication the authors have investigated, both theoretically and experimentally, the link between line edge roughness (LER), target noise, and overlay mark fidelity. Based on previous work<sup>i</sup>, a model is presented to explain how any given edge of a printed feature can have a mean position that varies stochastically (i.e., randomly, following a normal distribution) due to lithographic stochastic variation. The amount of variation is a function of the magnitude of the LER (more accurately, of all the statistical properties of the LER) and of the length of the feature edge. These quantities have been analytically linked to provide an estimate of the minimum line length for both optical and e-beam based overlay metrology. The model results have been compared with experimental results from wafers manufactured at IMEC on both EUV and ArF lithographic processes developed for the 10 nm node, with extrapolation to the 5 nm node.
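The analytic link between LER statistics and mark fidelity can be illustrated with a toy calculation. Assuming an exponential LER autocorrelation with correlation length ξ (a common model choice, not necessarily the one used in this work), the variance of the mean edge position of a line of length L ≫ ξ scales as σ²_LER · 2ξ/L, from which a minimum line length for a given edge-placement budget follows directly. All numbers below are illustrative placeholders, not values from the paper.

```python
import math

def mean_edge_sigma(sigma_ler_nm, corr_len_nm, line_len_nm):
    """Std. dev. of the mean edge position of a line of length L,
    assuming an exponential LER autocorrelation with correlation
    length xi and L >> xi: var = sigma_LER^2 * 2*xi / L."""
    return sigma_ler_nm * math.sqrt(2.0 * corr_len_nm / line_len_nm)

def min_line_length(sigma_ler_nm, corr_len_nm, budget_nm):
    """Smallest line length whose mean-edge-position sigma
    stays within the given placement budget."""
    return 2.0 * corr_len_nm * (sigma_ler_nm / budget_nm) ** 2

# Illustrative numbers (not from the paper): 1.5 nm LER sigma,
# 20 nm correlation length, 0.1 nm edge-position budget.
L_min = min_line_length(1.5, 20.0, 0.1)
print(f"minimum line length: {L_min / 1000:.1f} um")
```

The quadratic dependence on the budget is the practical point: halving the tolerable edge-position uncertainty quadruples the required line length, which is what makes shrinking targets for e-beam or in-die optical metrology difficult.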
We demonstrate a novel method to establish the root cause of an overlay excursion using optical scatterometry metrology. A scatterometry overlay target consists of four cells (two per direction) of grating-over-grating structures that are illuminated with a laser, with the diffracted orders measured in the pupil plane within a certain aperture range. State-of-the-art algorithms, exploiting symmetry considerations over the targets, extract the overlay between the two gratings. We exploit the optical properties of the target to extract further information from the measured pupil images, particularly information that may be related to any process change that could lead to an overlay excursion. Root cause analysis (RCA) is being developed to identify different kinds of process variations (either within a wafer or between wafers) that may indicate overlay excursions. In this manuscript, we present a collaboration between Globalfoundries and KLA-Tencor to identify a symmetric process variation using scatterometry overlay metrology and the RCA technique.
We present a metrology target design (MTD) framework based on co-optimizing lithography and metrology performance. Overlay metrology performance is strongly related to the target design, and optimizing the target under the process variations of a high-NA optical lithography tool and the measurement conditions of a metrology tool becomes critical for sub-20nm nodes. The lithography performance can be quantified by device matching and printability metrics, while accuracy and precision metrics are used to quantify the metrology performance. Using these metrics, we demonstrate how the optimized target can improve target printability while maintaining good metrology performance for the rotated dipole illumination used to print a sub-100nm diagonal feature in a memory active layer. The remaining challenges and the existing tradeoff between metrology and lithography performance are explored from the metrology target designer's perspective. The proposed target design framework is completely general and can be used to optimize targets for different lithography conditions. The results from our analysis are both physically sensible and in good agreement with experimental results.
Overlay metrology performance depends strongly on the detailed design of the measured target; hence simulation is an essential tool for optimizing target design. We demonstrate, for scatterometry overlay (SCOL), three key factors which enable consistency in ranking between simulated and measured metrology performance for target design. The first factor, which enables high fidelity simulations for the purpose of target design, is stack and topography verification of the model inputs. We report in detail the best known film metrology methods required to achieve model integrity. The second factor is the method of calculating metrology performance metrics from target cell reflectivities obtained by electromagnetic (EM) simulations. These metrics enable ranking of different designs and the subsequent choice of the best performing designs among all simulated design options; the ranking methodology is the third factor. We apply the above steps to a specific stack, for which five different designs have been considered. Simulated and measured values are compared, and a good agreement between simulation and measurement is achieved.
Overlay metrology target design is an essential step prior to performing overlay measurements. It is done by optimizing target parameters for a given process stack, and a simulation tool is therefore used to improve measurement performance. This work shows how our Metrology Target Design (MTD) simulator helps significantly in the target design process. We show the role of film and optical CD measurements in significantly improving the fidelity of the simulations. We demonstrate that, for various target design parameters, we are capable of predicting measured performance metrics by simulation and of correctly ranking the performance of the various designs.
We present a novel metrology target design framework using scanner exit pupil wavefront analysis together with Zernike sensitivity analysis (ZSA) based on the Monte Carlo technique. The proposed method enables the design of robust metrology targets that maximize the target process window (PW) while minimizing placement error discrepancies with device features in the presence of spatial and temporal variation of the aberration characteristics of an exposure tool. Knowing the limitations of the lithography system, the design constraints, and detailed lithography information including illumination, mask type, etc., we can successfully design an optimal metrology target. We have validated our new metrology target design (MTD) method on a challenging DRAM active layer consisting of diagonal line and space patterns illuminated by a rotated extreme dipole source. We find that an optimal MTD target gives the maximum PW and strong device correlation, resulting in a dramatic improvement of overall overlay performance. The proposed target design framework is completely general and can be used to optimize targets for different lithography conditions. The results from our analysis are both physically sensible and in good agreement with experimental results.
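The Monte Carlo flavor of the ZSA step can be sketched as follows: given per-Zernike pattern placement sensitivities for the device and for a candidate target (obtained in practice from lithography simulation), sample aberration sets from their expected variation and accumulate the distribution of target-to-device placement mismatch. The sensitivity values and the 1-sigma coefficient variation below are made-up placeholders for illustration only.

```python
import random

# Hypothetical placement sensitivities (nm of pattern shift per nm of
# Zernike coefficient) for the device and a candidate target; in
# practice these come from lithography simulation.
DEVICE_SENS = {"Z7": 0.80, "Z10": 0.35, "Z14": 0.20}
TARGET_SENS = {"Z7": 0.72, "Z10": 0.30, "Z14": 0.25}
ZERNIKE_SIGMA_NM = 2.0  # assumed 1-sigma coefficient variation

def placement_mismatch(zernikes):
    """Target-minus-device pattern shift for one aberration set."""
    return sum((TARGET_SENS[z] - DEVICE_SENS[z]) * c
               for z, c in zernikes.items())

def monte_carlo_mismatch_sigma(n_trials=20000, seed=0):
    """Sample aberration sets and return the std. dev. of the
    target-to-device placement mismatch."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_trials):
        zern = {z: rng.gauss(0.0, ZERNIKE_SIGMA_NM) for z in DEVICE_SENS}
        samples.append(placement_mismatch(zern))
    mean = sum(samples) / n_trials
    var = sum((s - mean) ** 2 for s in samples) / (n_trials - 1)
    return var ** 0.5

print(f"mismatch sigma: {monte_carlo_mismatch_sigma():.3f} nm")
```

A design that minimizes the sensitivity *difference* rather than the absolute sensitivity is the one whose mismatch distribution collapses, which is the intuition behind matching target and device responses.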
Computational metrology target design requires both an accurate metrology simulation engine and an accurate geometric model. This paper deals with the latter. Optical critical dimension metrology and cross-section SEM are demonstrated as two useful methods of geometric model verification with differing capabilities. Specifically, a methodology is proposed which allows the metrology engineer to quantify the level of model accuracy required as a function of the tolerable uncertainty in the prediction of metrology performance metrics. The methodology identifies the subset of model parameters which need to be verified, enabling the metrology engineer to invest the minimum stack and topography verification effort that will lead to successful target designs on the first design round.
The double patterning (DPT) process is foreseen by the industry to be the main solution for the 32 nm technology node and even beyond. Meanwhile, process compatibility has to be maintained and the performance of overlay metrology has to improve. To achieve this for Image Based Overlay (IBO), usually the optics of overlay tools are improved. It was also demonstrated that these requirements are achievable with a Diffraction Based Overlay (DBO) technique named SCOL<sup>TM</sup>. In addition, we believe that overlay measurements with respect to a reference grid are required to achieve the required overlay control. This induces at least a three-fold increase in the number of measurements (two for the double patterned layers to the reference grid and one between the double patterned layers). The requirements of process compatibility, enhanced performance, and a large number of measurements make the choice of overlay metrology for DPT challenging.
In this work we use different flavors of the standard overlay metrology technique (IBO) as well as the new technique (SCOL) to address these three requirements. The compatibility of the corresponding overlay targets with double patterning processes (Litho-Etch-Litho-Etch (LELE), Litho-Freeze-Litho-Etch (LFLE), spacer defined) is tested. The process impact on different target types is discussed (CD bias for LELE, contrast for LFLE). We compare standard imaging overlay metrology with non-standard imaging techniques dedicated to double patterning processes (multilayer imaging targets allowing one overlay target instead of three, and very small imaging targets). In addition to standard designs already discussed, we investigate SCOL target designs specific to double patterning processes. The feedback to the scanner is determined using the different techniques, and the final overlay results obtained are compared accordingly. We conclude with the pros and cons of each technique and suggest the optimal metrology strategy for overlay control in double patterning processes.
It could be argued that the biggest challenge of the 32 nm half pitch node is the production implementation of double patterning lithography. Within this broad domain, a specific challenge which has been highlighted is overlay control, because the overlay control allocation of a single patterning step must now be shared between two exposures. The models used in the literature to support this assertion are reviewed and compared with recent results. An analysis of the implications for overlay metrology performance and cost of ownership is presented and compared with the actual capabilities currently available with both imaging and scatterometry sensor technology. Technology matching between imaging and scatterometry emerges as a requirement to enable combined imaging-scatterometry overlay control use cases.
The overlay metrology budget is typically 1/10 of the overlay control budget, resulting in overlay metrology total measurement uncertainty requirements of 0.57 nm for the most challenging use cases of the 32nm technology generation. Theoretical considerations show that overlay technology based on differential signal scatterometry (SCOL<sup>TM</sup>) has inherent advantages which will allow it to achieve the 32nm technology generation requirements and go beyond them. In this work we present the results of an experimental and theoretical study of SCOL. We present experimental results comparing this technology with standard imaging overlay metrology. In particular, we present performance results, such as precision and tool induced shift, for different target designs. The response to a large range of induced misalignment is also shown, and SCOL performance on these targets for a real stack is reported. We also show results of simulations of the expected accuracy and performance associated with a variety of scatterometry overlay target designs. The simulations were carried out on several stacks including FEOL and BEOL materials. The inherent limitations and possible improvements of the SCOL technology are discussed. We show that with the appropriate target design and algorithms, scatterometry overlay achieves the accuracy required for future technology generations.
Resolution enhancement in advanced optical lithography will reach a new plateau of complexity at the 32 nm design rule manufacturing node. In order to circumvent the fundamental optical resolution limitations, ultra low k<sub>1</sub> printing processes are being adopted, which typically involve multiple exposure steps. Since alignment performance is not fundamentally limited by resolution, it is expected to shoulder a greater share of the effort to tighten lithographic error budgets. In the worst case, the positioning budget usually allocated to a single patterning step is divided between two. A concurrent emerging reality is that of high order overlay modeling and control. In tandem with multiple exposures, this trend creates great pressure to reduce scribeline target real estate per exposure. As the industry migrates away from metrology targets formed from large isolated features, the adoption of dense periodic array proxies brings improved process compatibility and information density, as epitomized by the AIM target<sup>1</sup>. These periodic structures enable a whole range of new metrology sensor architectures, both imaging and scatterometry based, that rely on the principle of diffraction order control and are no longer aberration limited. Advanced imaging techniques remain compatible with side-by-side targets, while scatterometry methods require grating-over-grating targets. In this paper, a number of different imaging and scatterometry architectures are presented and compared in terms of random errors, systematic errors, and scribespace requirements. It is asserted that an optimal solution must combine the peak TMU performance of scatterometry with the cost of ownership advantages of imaging, namely its small target size and multi-layer capability.
In the lithography section of the ITRS 2006 update, at the top of the list of difficult challenges appears the text "overlay of multiple exposures including mask image placement". This reflects the fact that overlay is today becoming a major yield risk factor in semiconductor manufacturing. Historically, lithographers have achieved sufficient alignment accuracy, and hence layer to layer overlay control, by relying on models which define overlay as a linear function of the field and wafer coordinates. These linear terms were easily translated into correctibles in the available exposure tool degrees of freedom on the wafer and reticle stages. However, as the 45 nm half pitch node reaches production, exposure tool vendors have begun to make available, and lithographers have begun to utilize, so-called high order wafer and field control, in which either look-up-table or high order polynomial models are modified on a product by product basis. In this paper, the major challenges of this transition are described, including characterization of the sources of variation which need to be controlled by these new models and the overlay and alignment sampling optimization problem which needs to be addressed, all while maintaining the ever tightening demands on productivity and cost of ownership.
The overlay control budget for the 32nm technology node will be 5.7nm according to the ITRS. The overlay metrology budget is typically 1/10 of the overlay control budget, resulting in overlay metrology total measurement uncertainty (TMU) requirements of 0.57nm for the most challenging use cases of the 32nm node. The current state-of-the-art imaging overlay metrology technology does not meet this strict requirement, and further technology development is required to bring it to this level. In this work we present the results of a study of an alternative technology for overlay metrology: differential signal scatterometry overlay (SCOL). Theoretical considerations show that overlay technology based on differential signal scatterometry has inherent advantages which will allow it to achieve the 32nm technology node requirements and go beyond them. We present results of simulations of the expected accuracy associated with a variety of scatterometry overlay target designs. We also present our first experimental results of scatterometry overlay measurements, comparing this technology with the standard imaging overlay metrology technology. In particular, we present performance results (precision and tool induced shift) and address the issue of accuracy of scatterometry overlay. We show that with the appropriate target design and algorithms, scatterometry overlay achieves the accuracy required for future technology nodes.
Layer to layer alignment in optical lithography is controlled by feedback of scanner correctibles provided by analysis of in-line overlay metrology data from product wafers. There is mounting evidence that the "high order" field dependence, i.e. the components which contribute to residuals in a linear model of the overlay across the scanner field, will likely need to be measured in production scenarios at the 45 and 32 nm half pitch nodes. This is particularly true in immersion lithography, where thermal issues are likely to impact intrafield overlay, and in double pitch patterning scenarios, where the high order reticle feature placement error contribution to the in-die overlay is doubled. Production monitoring of in-field overlay must be achieved without compromising metrology performance in order to enable sample plans with viable cost of ownership models. In this publication we show new results of in-die metrology which indicate that metrology performance comparable with the standard scribeline metrology required for the 45 nm node is achievable with significantly reduced target size. Results comparing dry and immersion lithography on poly-to-active 45 nm design rule process layers indicate that a significant reduction in model residuals can be achieved when high order (HO) intrafield overlay models are enabled.
As Moore's Law drives CD smaller and smaller, the overlay budget is shrinking rapidly. Furthermore, the cost of advanced lithography tools prohibits usage of the latest and greatest scanners on non-critical layers, resulting in different layers being exposed with different tools, a practice commonly known as 'mix and match.' Since each tool has its unique signature, mix and match becomes a source of high order overlay errors. Scanner alignment performance can be degraded by a factor of 2 in mix and match compared to single tool overlay operation. In a production environment where scanners from different vendors are mixed, the errors will be even more significant. Mix and match may also apply to a single scanner when multiple illumination modes are used to expose critical levels, because different illuminations have different impacts on the scanner aberration fingerprint. The semiconductor technology roadmap has reached a point where such errors are no longer negligible.
Mix and match overlay errors consist of a scanner stage grid component, a scanner field distortion component, and process induced wafer distortion. The scanner components are somewhat systematic, so they can be characterized on non-product wafers using a dedicated reticle. Since these components are known to drift over time, it becomes necessary to monitor them periodically, per scanner and per illumination.
In this paper, we outline a methodology for automating the characterization of mix and match errors, and a control system for real-time correction.
In this publication, the contributors to in-field overlay metrology uncertainty have been parsed and quantified on a back end process and compared with results from a previous front end study<sup>1</sup>. Particular focus is placed on the unmodeled systematics, i.e. the components which contribute to residuals in a linear model after removal of random errors. These are the contributors which are often the most challenging to quantify and are suspected to be significant in the model residuals. The results show that in both back and front end processes, the unmodeled systematics are the dominant residual contributor, accounting for 60 to 70% of the variance, even when subsequent exposures are on the same scanner. A higher order overlay model analysis demonstrates that this element of the residuals can be further dissected into correctible and non-correctible high order systematics. A preliminary sampling analysis demonstrates a major opportunity to improve the accuracy of lot dispositioning parameters by transitioning to denser sample plans compared with standard practices. Field stability is defined as a metric to quantify the field to field variability of the intrafield overlay.
Bright field imaging based metrology performance enhancement is essential in the quest to meet lithography process control requirements below 65 nm half pitch. Recent work has shown that, in parallel with the lithographic processes themselves, the metrology tools are able to continue to perform despite the fact that the features under test are often smaller than the classical Rayleigh resolution limit of the optical system. Full electromagnetic simulation is a mandatory tool in the investigation and optimization of advanced metrology tool and metrology target architectures. In this paper we report on imaging simulations of overlay marks. We benchmark different simulation platforms and methods, focusing in particular on the challenges associated with bright-field imaging overlay metrology of marks with feature sizes below the resolution limit. In particular, we study the dependence of overlay mark contrast and information content on overlay mark pitch and feature size.
In this publication, the contributors to in-field overlay metrology uncertainty have been parsed and quantified in a specific case study. Particular focus is placed on the unmodeled systematics, i.e. the components which contribute to residuals in a linear model after removal of random errors. These are the contributors which are often the most challenging to quantify and are suspected to be significant in the model residuals. The results show that even in a relatively "clean" front end process, the unmodeled systematics are the dominant residual contributor, accounting for 60 to 70% of the variance. Given the above results, new sampling and modeling methods are proposed which have the potential to improve the accuracy of modeled correctibles and lot dispositioning parameters.
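The variance parsing described above can be mimicked with a toy decomposition: fit a linear model to synthetic intrafield overlay data, then compare the residual variance against a known random-noise floor; the excess is the unmodeled-systematics share. The grid, coefficients, and noise level below are invented for illustration.

```python
import random

def linear_residuals(xs, ys, v):
    """Least-squares residuals of v = a + b*x + c*y, assuming the
    sample grid is centered so the normal equations decouple."""
    n = len(v)
    a = sum(v) / n
    b = sum(vi * xi for vi, xi in zip(v, xs)) / sum(x * x for x in xs)
    c = sum(vi * yi for vi, yi in zip(v, ys)) / sum(y * y for y in ys)
    return [vi - (a + b * xi + c * yi) for vi, xi, yi in zip(v, xs, ys)]

def systematic_share(xs, ys, v, noise_sigma):
    """Fraction of residual variance not explained by random noise."""
    res = linear_residuals(xs, ys, v)
    var = sum(r * r for r in res) / (len(res) - 3)  # 3 fitted parameters
    return max(0.0, 1.0 - noise_sigma ** 2 / var)

# Synthetic field: linear terms + a quadratic "unmodeled systematic"
# + random noise (all values invented for illustration).
rng = random.Random(1)
grid = [(x, y) for x in (-10, -5, 0, 5, 10) for y in (-10, -5, 0, 5, 10)]
xs = [g[0] for g in grid]
ys = [g[1] for g in grid]
noise = 0.3
v = [1.0 + 0.02 * x + 0.01 * y + 0.01 * (x * x - 50) + rng.gauss(0, noise)
     for x, y in grid]
print(f"unmodeled-systematic share: {systematic_share(xs, ys, v, noise):.0%}")
```

In a real study the noise floor would come from repeated measurements (precision) rather than being known a priori, but the arithmetic of attributing residual variance is the same.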
An improved overlay mark design was applied in high end semiconductor manufacturing to increase the total overlay measurement accuracy with respect to the standard box-in-box target. A comprehensive study has been conducted on the basis of selected front-end and back-end DRAM layers (short loop) to characterize contributors to overlay error. This analysis is necessary to keep within shrinking overlay budget requirements.
Isolated and dense patterns were formed at process layers from gate through to back-end on wafers using a 90 nm logic device process utilizing ArF lithography under various lithography conditions. Pattern placement errors (PPE) between AIM grating and BiB marks were characterized for line widths varying from 1000nm to 140nm. As the pattern size was reduced, the overlay discrepancies became larger, a tendency which was confirmed by optical simulation with simple coma aberration. Furthermore, incorporating such small patterns into conventional marks resulted in significant degradation of metrology performance, while performance on small pattern segmented grating marks was excellent. Finally, the data also show good correlation between the grating mark and specialized design rule feature SEM marks, with poorer correlation between the conventional mark and the SEM mark, confirming that the new grating mark significantly improves overlay metrology correlation with device patterns.
Overlay metrology for production line-monitor and advanced process control (APC) has been dominated by 4-corner box-in-box (BiB) methods for many years. As we proceed along the ITRS roadmap with the development of 65 nm technologies and beyond, it becomes apparent that current overlay methodologies are becoming inadequate for the stringent requirements that lie ahead. It is already apparent that kerf metrology of large scale BiB structures does not correlate well with in-chip design-rule features. The recent introduction of the Advanced Imaging Metrology (AIM) target, utilizing increased information content and advanced design and process compatibility, has demonstrated significant improvements in precision and overlay mark fidelity (OMF) in advanced processes. This paper compares methodologies and strategies for addressing cross-field variation of overlay and pattern placement issues. We compare the trade-offs of run-time intra-field sampling plans against off-line lithography characterization and advanced modeling analysis, and propose new methodologies to address advanced overlay metrology and control.
We have developed a method for calculating the statistical effects of spatial noise on the overlay measurement extracted from a given overlay target. The method has been applied to two kinds of overlay targets on three process layers, and the new metric, Target Noise, has been shown to correlate well to the random component of Overlay Mark Fidelity. A significant difference in terms of robustness has been observed between AIM targets and conventional Frame-in-Frame targets. The results fit well into the spatial noise hierarchy presented in this paper.
We explore the implementation of improved overlay mark designs to increase mark fidelity and device correlation for advanced wafer processing, and study the effect of design rule segmentation on overlay mark performance. Short loop wafers with 193 nm lithography for front-end (poly to STI active) as well as back-end (via to metal) layers were processed and evaluated. Six different box-in-box (BiB) overlay marks, including non-segmented, multi-bar, and design-rule segmented variants, were compared to several types of AIM (Advanced Imaging Metrology) grating targets, which were non-segmented or design rule segmented in various ways. The key outcomes of the performance study are as follows. The total measurement uncertainty (TMU) was estimated by the RMS of the precision, TIS 3-sigma, and overlay mark fidelity (OMF); the TMU calculated in this way shows a 40% reduction for the grating marks compared to BiB. The major contributors to this performance improvement were OMF and precision, which were both improved by nearly a factor of 2 on the front-end layer. TIS 3-sigma was observed to improve when design rule segmentation was implemented, while OMF was marginally degraded. Similar results were found for the back-end wafers. Several different pitches and segmentation schemes were reviewed, which has allowed the development of a methodology for target design optimization. Resulting improvements in modeled residuals were also achieved.
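The TMU estimate used in this study, the RMS combination of precision, TIS 3-sigma, and OMF, is a root-sum-square of independent 3-sigma error contributors. A minimal sketch (the input numbers are illustrative, not the study's data):

```python
import math

def tmu_rss(precision_3s, tis_3s, omf_3s):
    """Total measurement uncertainty as the root-sum-square of
    independent 3-sigma contributors (all values in nm)."""
    return math.sqrt(precision_3s ** 2 + tis_3s ** 2 + omf_3s ** 2)

# Illustrative numbers for a BiB mark and a grating (AIM-style) mark:
bib = tmu_rss(precision_3s=1.0, tis_3s=0.8, omf_3s=1.2)
aim = tmu_rss(precision_3s=0.5, tis_3s=0.7, omf_3s=0.6)
print(f"BiB TMU = {bib:.2f} nm")
print(f"AIM TMU = {aim:.2f} nm ({1 - aim / bib:.0%} lower)")
```

Because the contributors add in quadrature, the largest term dominates the total, which is why improving OMF and precision by a factor of 2 moves the overall TMU so strongly.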
In this publication we introduce a new metric for the process robustness of overlay metrology in microelectronic manufacturing. By straightforward statistical analysis of overlay metrology measurements on an array of adjacent, nominally identical overlay targets, the Overlay Mark Fidelity (OMF) can be estimated. We present the results of such measurements and analysis on various marks, which were patterned using a DUV scanner. The same reticle set was used to pattern wafers on different process layers and under different process conditions. Appropriate statistical analysis facilitated the breakdown of the total OMF into a reticle-induced OMF component and a process-induced OMF component. We compare the OMF of traditional box-in-box overlay marks with that of new grating-based overlay marks and show that in all cases the grating marks are superior. The reticle-related OMF showed an improvement of 30% when using the new grating-based overlay mark. Furthermore, in a series of wafers run through an STI process with different chemical mechanical polish (CMP) times, the random component of the OMF of the new grating-based overlay mark was observed to be 40% lower and 50% less sensitive to process variation compared with box-in-box marks. These two observations are interpreted as improved process robustness of the grating mark over box-in-box, specifically in terms of reduced site by site variation and reduced wafer to wafer variation as process conditions change over time. Overlay Mark Fidelity, as defined in this publication, is a source of overlay metrology uncertainty which is statistically independent of the standard error contributors, i.e. precision, TIS variability, and tool to tool matching. Current overlay metrology budgeting practices do not take this into consideration when calculating total measurement uncertainty (TMU). It is proposed that this be reconsidered, given the tightness of overlay and overlay metrology budgets at the 70 nm design rule node and below.
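The reticle/process breakdown can be sketched as a one-way variance decomposition over an array of nominally identical targets measured on several wafers printed from the same reticle: the spread of per-site means across wafers carries the reticle-induced component (after subtracting its noise contamination), while the pooled within-site scatter is the process-induced component. The data below are synthetic and the numbers illustrative, not the paper's.

```python
import random

def omf_breakdown(data):
    """data[w][s]: overlay at site s on wafer w (same reticle for all).
    Returns (reticle_omf, process_omf) as 1-sigma estimates via a
    one-way variance decomposition across the target array."""
    n_w, n_s = len(data), len(data[0])
    site_means = [sum(data[w][s] for w in range(n_w)) / n_w
                  for s in range(n_s)]
    grand = sum(site_means) / n_s
    # Pooled within-site variance: the process-induced component.
    process_var = sum((data[w][s] - site_means[s]) ** 2
                      for w in range(n_w)
                      for s in range(n_s)) / (n_s * (n_w - 1))
    # Between-site variance minus its noise contamination: reticle part.
    between_var = sum((m - grand) ** 2 for m in site_means) / (n_s - 1)
    reticle_var = max(0.0, between_var - process_var / n_w)
    return reticle_var ** 0.5, process_var ** 0.5

# Synthetic array: 8 wafers x 20 sites, reticle signature sigma 0.4 nm,
# process noise sigma 0.8 nm (both invented for illustration).
rng = random.Random(7)
reticle_sig = [rng.gauss(0.0, 0.4) for _ in range(20)]
data = [[reticle_sig[s] + rng.gauss(0.0, 0.8) for s in range(20)]
        for _ in range(8)]
r, p = omf_breakdown(data)
print(f"reticle OMF ~ {r:.2f} nm, process OMF ~ {p:.2f} nm")
```

The key design choice is that the reticle signature repeats at the same site on every wafer while the process contribution does not, which is what lets a plain between/within decomposition separate the two.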
We have previously reported on an overlay metrology simulation platform, used for modeling both the effects of overlay metrology tool behavior and the impact of target design on the ultimate metrology performance. Since our last report, the simulation platform has been further enhanced, consisting now of eleven PCs and running commercial software both for lithography (PROLITH) and rigorous Maxwell calculations (EM-Suite). In this paper we report on the validation of the metrology simulations by comparing them to both analytical calculations and to experimental results. The analytical validation is based on the classical calculation of the diffraction of a polarized plane wave from a perfectly conducting half plane. For the experimental validation, we chose an etched silicon wafer manufactured by International SEMATECH (ISMT) and characterized at National Institute of Science and Technology (NIST). The advantages of this wafer are its well known topography and its suite of different metrology targets. A good fit to both analytical and experimental results is demonstrated, attesting to the capabilities of our enhanced simulation platform. The results for both the analytical and experimental validations are presented.
A common path interferometric element introduced in the optical path of an imaging device is a well documented method to perform multidimensional spectroscopy. Recent design modifications however have provided significant improvements including enhanced spectral resolution and optical throughput, reduced acquisition time, as well as reduced instrument weight and volume. The new design will be reviewed in addition to its impact on three applications: spectral karyotyping, spectral imaging of the human ocular fundus and remote sensing of water reservoirs.
Aluminum metallization is an important process for planarization and interconnect applications. Wafer temperature during deposition is one of the key parameters determining film properties such as reflectivity and resistivity. Results of experiments carried out to characterize the thermal behavior of product wafers during physical vapor deposition, primarily aluminum deposition and wafer degas, are presented. The effects of back and front side depositions, backside gas pressure, and plasma power level on deposition temperature are all investigated. The utility of real time in-situ temperature monitoring of every product wafer in all deposition chambers within a cluster tool, and the advantages provided in terms of process monitoring, are discussed.
In this paper an analysis technique is presented which allows the achievable performance specifications of a single wavelength pyrometer to be calculated. The effects of pyrometer wavelength, wafer emissivity, background radiation, and detector noise limitations are all taken into account in the modelling. It is demonstrated that, in order to maintain a given precision, the wavelength of the pyrometer must be progressively reduced so as to maintain radiance contrast as the wafer temperature rises. The analysis technique is also shown to be an effective design tool for determining the electronic and optical performance specifications required of the pyrometer in order to obtain a given temperature measurement precision.
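The radiance-contrast effect at the heart of this analysis can be made concrete with the Wien approximation to Planck's law, under which the fractional sensitivity of spectral radiance to temperature is (1/L)(dL/dT) ≈ c2/(λT²), with c2 ≈ 1.4388 × 10⁻² m·K. Holding the contrast fixed therefore forces the wavelength down as the temperature rises. A sketch (the 1%/K contrast target is an arbitrary illustration, not the paper's specification):

```python
C2 = 1.4388e-2  # second radiation constant, m*K

def fractional_sensitivity(wavelength_m, temp_K):
    """(1/L) dL/dT under the Wien approximation: c2 / (lambda * T^2)."""
    return C2 / (wavelength_m * temp_K ** 2)

def wavelength_for_contrast(temp_K, contrast_per_K):
    """Wavelength needed to hold a given fractional radiance
    contrast (per kelvin) at temperature T."""
    return C2 / (contrast_per_K * temp_K ** 2)

# Hold 1%/K contrast: the required wavelength shrinks as T rises.
for T in (500.0, 800.0, 1100.0):
    lam = wavelength_for_contrast(T, 0.01)
    print(f"T = {T:6.0f} K -> lambda = {lam * 1e6:.2f} um")
```

The same relation, read the other way, shows why a fixed-wavelength pyrometer loses temperature resolution as the wafer heats up.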
Remote, noncontact temperature monitoring of semiconductors may be achieved by near infrared reflection spectroscopy of a wafer during processing. A technique is described which relies on the temperature dependence of the optical absorption edge characteristic of most semiconductors, in conjunction with internal reflection at the interface between the wafer bulk and the vacuum/dielectric/device. Results are presented which demonstrate application of the technique to silicon wafers with a broad range of back surface properties, such as single and double layer dielectrics. The measurements were carried out in situ during processing in both a PVD metallization chamber and a plasma etch chamber, over the temperature range from 20 to 570 °C.
Weapons delivery systems frequently use laser designators which require boresighting with the visual line of sight. For systems based on the Nd:YAG laser (λ = 1.06 µm), this may be achieved by a boresight collimator with a second harmonic generating crystal in its focal plane which reradiates collimated, visible light (λ = 0.53 µm). The conception, design and testing of such a device will be described, along with a comparison with alternative technologies which demonstrates its superiority in terms of conversion efficiency, damage threshold and design versatility.
We have conducted experiments to prove the feasibility of a boresighting method between a laser and a Forward Looking Infra-Red (FLIR) system, which has the advantage of working with or without synchronization between the laser pulses and the FLIR scanning. The method is based on the Thermal Target (TT) concept: the laser energy is focused on a special substrate which is locally heated and produces a point image on the FLIR screen with respect to the FLIR line of sight, which is thereby boresighted. Resistance to laser damage at the required pulse energy densities was established by target lifetime measurements.
The TT method can also be used for real-time boresighting of the laser with the FLIR, meaning that the boresighting is done while looking at the scenery.