Improved overlay capability and sampling to control advanced lithography have accelerated the need for compact
overlay metrology spanning multiple layers, masks, fields, and marks. The Blossom approach minimizes the size of the overlay marks associated
with each layer while maximizing the density of marks within the overlay metrology tool's field of view (FOV). Here
we describe our progress implementing this approach in 45 nm manufacturing.
In a previous publication, we introduced Blossom, a multi-layer overlay mark (Ausschnitt et al., 2006).
Through further testing carried out since that publication, Blossom has been shown to meet the requirements
of current design rules (Ausschnitt et al., 2007), while offering some unique benefits. However, as future
design rules shrink, efforts must be made now to ensure the extensibility of the Blossom technology.
Previous work has shown that the precision component of Total Measurement Uncertainty (TMU) can be
reduced by building extra redundancy into the target design, achieving performance beyond that of a conventional
box-in-box measurement. However, improving that single contributor to TMU would not be sufficient for
future design rules; we must therefore also consider the Tool Induced Shift (TIS) variability and tool-to-tool
matching contributions to TMU.
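Under the common convention that TMU combines its independent contributors in quadrature, the trade-off among precision, TIS variability, and matching can be sketched as follows. The contributor values and function name are illustrative assumptions, not figures from this work:

```python
import math

def tmu(precision, tis_variability, matching):
    """Combine independent TMU contributors in quadrature (root-sum-square)."""
    return math.sqrt(precision**2 + tis_variability**2 + matching**2)

# Hypothetical 3-sigma contributor values in nanometres: even if precision is
# improved further, the TIS-variability and matching terms floor the total.
total = tmu(0.30, 0.25, 0.20)
```

Because the terms add in quadrature, shrinking only the precision term yields diminishing returns once the other two contributors dominate.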
In this paper, we introduce a calibration artifact based on the Blossom technology. The calibration artifact is
both compact and produced by a standard lithography process, so it can be placed in a production scribe line if
required, reducing the need for special sets of calibration wafers compared to other possible calibration
methodologies. Calibration is currently performed with respect to the exposure tool / process / mask, which is arguably
more pertinent to good yield, and less expensive, than calibration to an external standard; externally
calibrated artifacts would be straightforward to manufacture if needed.
By using this artifact, we can map out remaining optical distortions within an overlay tool, to a precision
significantly better than the operational tool precision, in a way that directly relates to overlay performance.
The effect of process-induced mark uncertainties on calibration can be reduced by performing measurements
on a large number of targets; by taking multiple measurements of each target, we can also use the artifact to
evaluate the current levels of process-induced mark uncertainty. The former result leads to a method for
improving TIS and matching capability. We describe the artifact and its usage, and present results from a
group of operational overlay tools.
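The averaging argument can be made concrete with textbook statistics: the random, process-induced component of a calibration shrinks as 1/sqrt(N) with the number of independently measured targets, while systematic tool distortion does not average away. A minimal sketch (the function name is ours, not the paper's):

```python
import math

def averaged_sigma(per_target_sigma, n_targets):
    """Random (process-induced) mark error averages down as 1/sqrt(N) over N
    independently measured targets, which is why the artifact can calibrate to
    a precision below the single-measurement operational tool precision."""
    return per_target_sigma / math.sqrt(n_targets)
```

For example, averaging over 100 targets reduces a random per-target error by a factor of ten, leaving the systematic distortion map as the dominant recoverable signal.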
We show how the use of this information also provides further insight into the layout optimizations discussed
previously (Binns et al., 2006). It establishes the current limits of measurement precision and mark fidelity
with respect to target redundancy, enabling us to use a predictive cost-benefit term in the optimization.
Finally, studying the bulk behaviour of a fleet of overlay tools allows us to examine how future mark
layouts can contribute to minimizing TMU as a whole, rather than just its precision component.
The conventional premise that metrology is a "non-value-added necessary evil" is a misleading and dangerous assertion,
which must be viewed as obsolete thinking. Many metrology applications are key enablers to traditionally labeled
"value-added" processing steps in lithography and etch, such that they can be considered integral parts of the processes.
Various key trends in modern, state-of-the-art processing such as optical proximity correction (OPC), design for
manufacturability (DFM), and advanced process control (APC) are based, at their hearts, on the assumption of fine-tuned
metrology, in terms of uncertainty and accuracy. These trends are vehicles where metrology thus has large opportunities
to create value through the engineering of tight and targetable process distributions. Such distributions make possible
predictability in speed-sorts and in other parameters, which results in high-end products. Additionally, significant reliance
has also been placed on defect metrology to predict and improve yield and to reduce yield variability. The necessary
metrology quality is strongly influenced not only by the choice of equipment, but also by how well these tools are applied in a
production environment. The ultimate value added by metrology is a result of quality tools run by a quality metrology
team using quality practices.
This paper will explore the relationships among present and future trends and challenges in metrology, including
equipment, key applications, and metrology deployment in the manufacturing flow. Of key importance are metrology
personnel, with their expertise, practices, and metrics in achieving and maintaining the required level of metrology
performance, including where precision, matching, and accuracy fit into these considerations. The value of metrology
will be demonstrated to have shifted to "key enabler of large revenues," debunking the out-of-date premise that
metrology is "non-value-added." Examples used will be from critical dimension (CD) metrology, overlay, films, and
Overlay tool matching and accuracy issues are quickly reaching a complexity comparable to that of critical
dimension metrology. While both issues warrant serious investigation, this paper deals with the matching
issues associated with overlay tools. Overlay tools need to run and measure as if they are a single tool -
they need to act as one. In this paper, a matching methodology is used to assess a set of overlay tools in
multiple overlay applications. The methodology proposed in a prior SPIE paper [2] is applied here to a
fleet of two generations of overlay tools to detect measurement problems not seen with conventional
Statistical Process Control techniques. Four studies were used to examine the benefits of this matching
methodology for this fleet of overlay tools. The first study was a matching assessment study. The second
methodology for this fleet of overlay tools. The first study was a matching assessment study. The second
study was a hardware comparison between generations of tools. The third study was a measurement
strategy comparison. The final study was a long term matching exercise where one example of a traditional
long term monitoring strategy was compared to a new long term monitoring strategy. It is shown that this
new tool matching method can be effectively applied to overlay metrology.
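As a rough illustration of the "act as one tool" criterion, the sketch below scores a fleet by each tool's mean offset from the fleet mean over shared measurement sites. The metric and names are our own illustrative choices, not the matching methodology of the paper:

```python
import statistics

def fleet_matching(measurements):
    """measurements: {tool_name: [overlay readings at shared sites, nm]}.
    Returns each tool's mean offset from the fleet mean and the worst-case
    tool-to-tool spread - one simple way to quantify whether a fleet of
    overlay tools behaves as a single tool."""
    fleet_mean = statistics.mean(v for vals in measurements.values() for v in vals)
    offsets = {tool: statistics.mean(vals) - fleet_mean
               for tool, vals in measurements.items()}
    spread = max(offsets.values()) - min(offsets.values())
    return offsets, spread
```

A conventional SPC chart monitors each tool against its own history; a matching metric like the spread above instead flags a tool that drifts relative to its peers even while remaining inside its individual control limits.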
A novel approach to overlay metrology, called Blossom, maximizes the number of layers measurable within a single optical field of view (FOV). As chip processing proceeds, each layer contributes a set of at least four marks, arranged symmetrically on concentric circles, to create a 90° rotationally invariant array of marks that "blossoms" to fill the FOV. Radial symmetry about the target center is maintained at each layer to minimize susceptibility to metrology lens aberrations. Overlay combinations among detectable marks within the target can be measured simultaneously. In the described embodiment, 28 distinct layers are represented within a 50μm square FOV. Thus, all the layers of a functional chip can be represented in a single target. Blossom achieves several benefits relative to overlay methods currently in practice:
* Compression (>30X) of area required for overlay targets.
* Nullification of within-target proximity effects.
* Suppression of optical mark fidelity (OMF) errors.
* Reduction of sensitivity to across-target detection noise.
* Elimination of overlay error random walk among layers.
* Reference mark redundancy for detection flexibility and robustness.
* Integration of multi-layer and within-layer overlay control schema.
* Simplification of overlay recipe creation and management.
* Capture and visualization of overlay performance through the entire chip fabrication.
Blossom results from 65-nm products in manufacturing are described.
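The 90° rotationally invariant geometry described above, with four marks per layer on concentric rings, can be sketched as follows. The ring radii, angular stagger, and mark size used here are illustrative assumptions, not the actual target design:

```python
import math

def blossom_layout(n_layers, fov=50.0, mark_size=4.0):
    """Place four marks per layer at 90-degree rotations about the FOV center,
    one ring per layer, so the whole array is 90-degree rotationally invariant.
    Radius and stagger schedules are hypothetical, for illustration only."""
    layout = {}
    half = fov / 2.0 - mark_size / 2.0  # keep whole marks inside the FOV
    for layer in range(n_layers):
        radius = half * (layer + 1) / n_layers
        base = math.radians(15.0 * layer)  # stagger rings to avoid overlap
        layout[layer] = [
            (radius * math.cos(base + k * math.pi / 2),
             radius * math.sin(base + k * math.pi / 2))
            for k in range(4)
        ]
    return layout

marks = blossom_layout(28)  # 28 layers, 112 marks, one 50 micron FOV
```

Because every layer's four marks are centered on the same point, rotating the target by 90° maps the mark set onto itself, which is the symmetry the text credits with reducing sensitivity to metrology lens aberrations.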
A novel overlay target developed by IBM and Accent Optical Technologies, Blossom, allows simultaneous overlay measurements of multiple layers (currently, up to 28) with a single target. This is achieved by a rotationally symmetric arrangement of small (4 micron) targets in a 50 micron square area, described more fully in a separate paper. In this paper, we examine the lessons learned in developing and testing the Blossom design. We start by examining proximity effects; the close spacing of adjacent targets means that both the precision-like Total Measurement Uncertainty (TMU) and the accuracy of a measurement can be affected by the proximity of features. We use a mixture of real and modelled data to illustrate this problem, and find that the layout of Blossom reduces the proximity-induced bias. However, we do find that in certain cases proximity effects can increase the TMU of a particular measurement. The solution is to ensure that parts of the target that interact detrimentally are maximally separated. We present a solution to this, viewing the problem as a constrained Travelling Salesman Problem. We have imposed some global constraints, for example printing front-end and back-end layers on separate targets, and consistency with the overlay measurement strategy. Initially, we assume that pairwise measurements are either critical or non-critical, and optimize the layout so that neither layer of a critical pair is placed adjacent to any prior- or intermediate-layer features. We then build upon this structure to consider the effect of low-energy implants (that cannot be seen once processed) and site re-use possibilities. Beyond this, we also investigate the impact of more strategic optimizations, for example, tuning the number of features on each layer. In each case, we present on-product performance data achieved, and modelled data on some additional target variants / extreme cases.
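The constrained-layout idea can be illustrated with a toy greedy ordering: place layers one site at a time, refusing to put the two layers of a detrimentally interacting ("critical") pair on adjacent sites. The paper formulates this as a constrained Travelling Salesman Problem; this greedy sketch, and all names in it, are illustrative only:

```python
def arrange_sites(layers, critical_pairs, n_sites):
    """Greedy sketch: order layers so that no two layers forming a critical
    measurement pair occupy adjacent sites. A stand-in for the constrained
    Travelling-Salesman-style optimization, not the paper's algorithm."""
    order, remaining = [], list(layers)
    while remaining:
        for cand in remaining:
            prev = order[-1] if order else None
            if prev is None or ((prev, cand) not in critical_pairs
                                and (cand, prev) not in critical_pairs):
                order.append(cand)
                remaining.remove(cand)
                break
        else:
            # No safe candidate left: accept one violation so we terminate.
            order.append(remaining.pop(0))
    return order[:n_sites]
```

A real formulation would also treat the ring of sites as cyclic (first and last sites are neighbours), weight separations rather than using a binary adjacency constraint, and respect the global constraints mentioned above, such as splitting front-end and back-end layers across targets.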