Assuming that all exposure tools on which a certain production reticle is used are of the same type and configuration, the performance of the reticle can be expected to be independent of the exposing machine. When planning process transfers between production sites or capacity expansions within one site, proximity matching between different exposure tools is a common activity. One objective of a robust optical proximity correction (OPC) model is to simulate the process variation. Normally, the wafer critical dimension (CD) calibration of an OPC model is performed for one specific scanner first. To enhance the tolerance of the OPC model, the so-called fingerprints of the different scanners should be matched as closely as possible. Examples of features used in fingerprint test patterns are “critical dimension through pitch” (CDTP), “inverse CDTP”, “tip-to-tip” and “linearity” patterns, and the CD difference of disposition structures. All of them should be matched as tightly as possible in order to reduce the process variation and to strengthen the tolerance of the OPC model. However, the focus difference between nested and isolated features, which is directly influenced by the exposure tool and the reticle layer, affects the proximity matching of some patterns, such as inverse CDTP and uniformly distributed disposition structures. In this manuscript, the effects of focus differences between nested and isolated features on scanner proximity matching are demonstrated. Moreover, results for several scanners and different mask layers using advanced binary mask blank material are investigated. Even if some of the proximity features match each other closely, proximity patterns of different parity will still be affected by the focus difference between dense and isolated features.
Because the focus difference between isolated and dense features depends on the illumination conditions, different mask layers used for proximity correction will lead to different results. The effects of source variations, which cause isolated-dense focus differences between scanners, are illustrated for the 28 nm poly, 1X metal and contact layers.
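The iso-dense best-focus difference discussed above can be estimated from through-focus CD measurements. The sketch below uses purely synthetic Bossung data (the feature sizes, curvatures and focus offsets are illustrative assumptions, not measured values); it fits a quadratic to each through-focus CD curve and reports the best-focus shift between dense and isolated features:

```python
import numpy as np

def best_focus(focus_um, cd_nm):
    """Fit a quadratic Bossung curve CD(f) = a*f^2 + b*f + c and
    return the focus value at its extremum, f* = -b / (2a)."""
    a, b, _ = np.polyfit(focus_um, cd_nm, 2)
    return -b / (2.0 * a)

# Synthetic through-focus CD data (illustrative numbers only).
focus = np.linspace(-0.10, 0.10, 11)             # defocus in micrometers
cd_dense = 45.0 - 800.0 * (focus - 0.010) ** 2   # dense feature, best focus at +10 nm
cd_iso   = 45.0 - 600.0 * (focus + 0.020) ** 2   # isolated feature, best focus at -20 nm

shift_nm = (best_focus(focus, cd_dense) - best_focus(focus, cd_iso)) * 1000.0
print(f"iso-dense best-focus difference: {shift_nm:.1f} nm")  # 30.0 nm for this data
```

In practice the two curves would come from scatterometry or CD-SEM focus-exposure matrices per scanner and per mask layer; the fitted shift is the quantity that perturbs the matching of inverse-CDTP-type patterns.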
Chip manufacturing with multilayer reticles offers the possibility to reduce reticle cost at the expense of scanner
throughput, and is therefore an attractive option for small-volume production and test chips. Since 2010,
GLOBALFOUNDRIES Fab 1 has used this option for the 28nm IP shuttles and test chips offered to its customers for the
development and advanced testing of their products. This paper discusses the advantages and challenges of this approach
and the practical experience gained during implementation. One issue that must be considered is the influence of the
small image field and the asymmetric reticle illumination on key lithographic parameters, in particular layer-to-layer
overlay. Theoretical considerations and experimental data concerning the effects of lens distortion, lens heating, and
reticle heating on overlay performance are presented, and concepts to address the specific challenges of multilayer
reticles for high-end chip production are discussed.
As optical lithography pushes towards the 32nm node and as the k<sub>1</sub> factor moves toward 0.25, scanner performance and
operational stability are the key enablers to meet device scaling requirements. Achieving these requirements in
production requires stable lithography tools and processes. Stable performance is tracked with respect to pattern-to-pattern
overlay, nominal focus and critical dimension uniformity (CDU). Within our paper we will characterize the
intrinsic lithographic performance of the scanner and will discuss a new method of machine control to improve the
stability and thus the overall performance of the lithographic solution. This is achieved by measuring specific monitor
wafers, modeling the results by a new software algorithm and constantly feeding back corrective terms to the scanner.
Diffraction-based optical dimensional scatterometry was selected because of its precision, its ability to measure overlay
and focus with a single metrology recipe and its capability to generate greater amounts of measurement data in a shorter
time period than other metrology techniques and platforms.
While monitor wafer performance can be indicative, we will discuss the impact of the new control loop on product. We
will take a closer look at possible interactions with the existing process control loops and work through the configuration
of both internal and fab control loops. We will show improvements in the focus performance on product wafers by using
scatterometry as well. Most importantly we will demonstrate that the newly implemented control loop resulted in a
significant improvement of the CD and overlay performance of critical product layers. This had a very positive impact
on overall process variation and the rework rate at lithography.
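The control scheme described above, monitor-wafer measurements feeding corrective terms back to the scanner, can be illustrated with a minimal run-to-run controller. This is a generic EWMA-style sketch with assumed numbers (a constant hypothetical 20 nm tool offset and a 0.3 smoothing weight), not the actual algorithm of the paper:

```python
def run_to_run(tool_offset, lots=15, weight=0.3):
    """EWMA-style run-to-run control: each monitor-wafer measurement
    (the tool offset minus the correction currently applied) updates
    the corrective term fed back to the scanner for the next lot."""
    correction = 0.0
    trace = []
    for _ in range(lots):
        measured = tool_offset - correction   # residual seen on the monitor wafer
        correction += weight * measured       # exponentially weighted update
        trace.append(correction)
    return trace

trace = run_to_run(tool_offset=20.0)   # hypothetical 20 nm focus offset
print(f"correction after 15 lots: {trace[-1]:.2f} nm")  # approaches 20 nm
```

The weight trades responsiveness against noise amplification: a small weight filters metrology noise on the monitor wafers, a large weight tracks tool drift faster, which is the same trade-off any fab-level control loop configuration has to make.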
Within our paper we discuss the variation within the patterning process in the context of the overall electrical parameter variation in an advanced logic fab. The evaluation is based both on the variation of ring oscillators distributed across the chip and on the local variation of matched transistor pairs. Starting with a look back at the 130nm technology, we show how requirements have changed over time. In particular, we focus on the gate layer, for which we perform a detailed ACLV comparison from the 130nm technology node down to today's 45nm node. Within the patterning variation we pay special attention to mask performance, including a detailed wafer-to-mask correlation analysis. In addition to the low-MEEF gate layer, we show the importance of mask CD performance for a typical high-MEEF layer. Finally, we discuss the mask contribution to the overall overlay error for the most critical overlay, contact to gate. In all cases we show that the mask performance is not the limiter in today's most advanced technology, as long as access to a world-class mask shop is available.
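The distinction between low- and high-MEEF layers follows directly from how the mask error enhancement factor scales a mask CD error onto the wafer: ΔCD_wafer = MEEF · ΔCD_mask / M, with M the reticle magnification (typically 4X). A small illustration with assumed numbers (the 4 nm mask error and the MEEF values are hypothetical, not taken from the paper):

```python
def wafer_cd_error(mask_cd_error_nm, meef, magnification=4.0):
    """Wafer-level CD error implied by a mask CD error (at mask scale):
    delta_CD_wafer = MEEF * delta_CD_mask / magnification."""
    return meef * mask_cd_error_nm / magnification

# Hypothetical 4 nm CD error on the mask (at 4X reticle scale):
low_meef  = wafer_cd_error(4.0, meef=1.2)   # gate-like layer  -> 1.2 nm on wafer
high_meef = wafer_cd_error(4.0, meef=4.0)   # contact-like layer -> 4.0 nm on wafer
print(f"low-MEEF: {low_meef:.1f} nm, high-MEEF: {high_meef:.1f} nm")
```

The same mask error budget therefore consumes several times more of the wafer CD budget on a high-MEEF layer, which is why mask CD performance matters most there.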
As a consequence of the shrinking sizes of the integrated circuit structures, the overlay budget shrinks as well. Overlay is
traditionally measured with relatively large test structures which are located in the scribe line of the exposure field, in the
four corners. Although the performance of the overlay metrology tools has improved significantly over time it is
questionable if this traditional method of overlay control will be sufficient for future technology nodes. For advanced
lithography techniques like double exposure or double patterning, in-die overlay is critical and it is important to know
how much of the total overlay budget is consumed by in-die components.
We reported earlier that small overlay targets were placed directly inside die areas and that good performance was
achieved. This new methodology enables a wide range of investigations and provides insight into processes which
were less important in the past or were not accessible to metrology. The present work provides actual data from production
designs, instead of estimates, illustrating the differences between scribe-line and in-die registration and overlay.
The influence of the pellicle on pattern placement on the mask and on wafer overlay is studied. Furthermore, the
registration error of the reticles is correlated with wafer overlay residuals.
The influence of scanner-induced distortions (tool to tool differences) on in-die overlay is shown.
Finally, the individual contributors to in-die overlay are discussed in the context of other overlay contributors. It is
proposed to use in-die overlay and registration results to derive guidelines for future overlay and registration
specifications. It will be shown that new overlay correction schemes which take advantage of the additional in-die
overlay information need to be considered for production.
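The separation between correctable terms and residuals that such correction schemes exploit can be sketched as a least-squares fit of a simplified linear overlay model (translation, scale and a shared rotation term; the model form, field size and noise level are illustrative assumptions) on synthetic measurements:

```python
import numpy as np

def fit_linear_overlay(x, y, dx, dy):
    """Least-squares fit of a simplified linear overlay model
    (translation Tx/Ty, scale Sx/Sy, rotation theta):
        dx = Tx + Sx*x - theta*y
        dy = Ty + Sy*y + theta*x
    Returns the fitted parameters and the unmodeled residuals."""
    n = len(x)
    A = np.zeros((2 * n, 5))
    A[:n, 0] = 1.0;  A[:n, 2] = x;  A[:n, 4] = -y   # dx equations
    A[n:, 1] = 1.0;  A[n:, 3] = y;  A[n:, 4] = x    # dy equations
    b = np.concatenate([dx, dy])
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params, b - A @ params

# Synthetic overlay data: translation + rotation + random noise (illustrative).
rng = np.random.default_rng(0)
x = rng.uniform(-150.0, 150.0, 200)              # measurement positions in mm
y = rng.uniform(-150.0, 150.0, 200)
dx = 2.0 - 1e-2 * y + rng.normal(0.0, 0.5, 200)  # overlay errors in nm
dy = -1.0 + 1e-2 * x + rng.normal(0.0, 0.5, 200)

params, res = fit_linear_overlay(x, y, dx, dy)
print(f"Tx={params[0]:.2f} nm, theta={params[4]:.4f} nm/mm, "
      f"residual 3-sigma={3 * res.std():.2f} nm")
```

In-die targets extend exactly this picture: with measurements inside the die, the residuals left after removing the correctable terms can be split further into intrafield systematics and truly local components.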
In this publication, the contributors to in-field overlay metrology uncertainty have been parsed and quantified on a back-end
process and compared with results from a previous front-end study<sup>1</sup>. Particular focus is placed on the unmodeled
systematics, i.e. the components which contribute to residuals in a linear model after removal of random errors. These
are the contributors which are often the most challenging to quantify and are suspected to be significant in the model
residuals. The results show that in both back and front end processes, the unmodeled systematics are the dominant
residual contributor, accounting for 60 to 70% of the variance, even when subsequent exposures are on the same
scanner. A higher-order overlay model analysis demonstrates that this element of the residuals can be further dissected
into correctable and non-correctable higher-order systematics. A preliminary sampling analysis demonstrates a major
opportunity to improve the accuracy of lot dispositioning parameters by transitioning to denser sample plans compared
with standard practices. Field stability is defined as a metric to quantify the field to field variability of the intrafield