For the 14nm node and beyond there are many integration strategy decisions that need to be made. All of these can have a significant impact on both alignment and overlay capability and need to be carefully considered from this perspective. One example is whether a Litho Etch Litho Etch (LELE) or a Self Aligned Double Patterning (SADP) process is chosen, the latter significantly impacting alignment and overlay mark design. In this work we look at overlay performance for a Back End of Line (BEOL) SADP Dual Damascene (DD) process for the 14nm node. We discuss alignment mark design, particularly focusing on the added complexity and issues involved in using such a process, for example the design of the marks in the Metal Core and Keep layers, and recommend an alignment scheme for such an integration strategy.
Reducing lithography pattern variability has become a critical enabler of ArF immersion scaling and is required to ensure consistent lithography process yield for sub-30nm device technologies. As DUV multi-patterning requirements continue to shrink, it is imperative that all sources of lithography variability are controlled throughout the product life-cycle, from technology development to high volume manufacturing. New ArF light-source metrology and monitoring capabilities have recently been introduced in order to improve lithography patterning control. These technologies enable performance monitoring of new light-source properties relating to illumination stability, and enable new reporting and analysis of in-line performance.
We studied the potential of optical scatterometry to measure the full 3D profile of features representative of
real circuit design topology. The features were selected and printed under conditions chosen to improve the
measurability of the features by scatterometry without any loss of information content for litho monitoring
and control applications. The impact of the scatterometry recipe and settings was evaluated and optimal
settings were determined.
We have applied this strategy on a variety of structures and gathered results using the YieldStar angular
reflection based scatterometer. The reported results show that we obtained effective decoupling of the
measurement of the three dimensions of the features. The results match the predictions of calibrated models.
As a verification we have successfully performed a scanner matching experiment using computational
Pattern Matcher (cPM) in combination with YieldStar as a metrology tool to characterize the difference
between the scanners and verify the matching. The results thus obtained were better than using CD-SEM
for matching and verification.
EUV lithography is a candidate for device manufacturing for the 16nm node and beyond. To prepare for insertion into
manufacturing, the challenges of this new technology need to be addressed. Therefore, the ASML NXE:3100 preproduction
tool was installed at imec, replacing the ASML EUV Alpha Demo Tool (ADT). Now that the technology has
moved to a pre-production phase, EUV technology has to mature and meet the stringent requirements of
sub-16nm devices. We discuss the CD uniformity and overlay performance of the NXE:3100. We focus on EUV-specific
contributions to CD and overlay control that were identified in earlier work on the ADT. The contributions to overlay
originate from the use of vacuum technology and reflective optics inside the scanner, which are needed for EUV light
transmission and throughput. Because the optical column is in vacuum, both wafer and reticle are held by electrostatic
chucks instead of vacuum chucks and this can affect overlay. Because the reticle is reflective, any reticle (clamp)
unflatness directly translates into a distortion error on wafer (non-telecentricity). For overlay, the wafer clamping
performance is not only determined by the exposure chuck, but also by the wafer type that is used. We will show wafer
clamping repeatability with different wafer types and discuss the thermal stability of the wafer during exposure.
In state-of-the-art production, in order to obtain the best possible overlay performance between critical layers, wafers are
often dedicated to one scanner with all layers processed on that scanner; for scanners with dual stages, this
often extends to stage dedication as well. Meeting the overlay performance requirements becomes even more complex
with the introduction of EUV lithography into production. It will not be possible to expose all critical layers on an EUV
scanner: EUV will be used only for some of the most critical layers, while the other critical layers remain on 193nm
immersion scanners. It therefore needs to be demonstrated that the same overlay performance is achievable when tool
types are mixed and matched as when we run with tool dedication. To do this it is critical that we understand the overlay
matching characteristics of 193nm immersion and EUV scanners and from this learn how to control them, so that the
optimum strategy can be developed and overlay errors between these tool types minimized.
In this work we look at the matching performance between two generations of 193nm immersion scanner and an EUV
pre-production tool. We evaluate the matching in both directions, first layer on immersion, second layer on EUV and
vice-versa, and demonstrate how optimum matching can be achieved, so that insertion of an EUV scanner into
production for the required imaging does not result in a degraded overlay capability. We discuss the difference in grid
and intrafield signatures between the tool types, how this knowledge can be used to minimize the overlay errors
between them, and whether there are any new concerns which impact the chosen strategy when the two tool types are mixed.
The tightening of overlay budgets forces us to revisit the characterization and control of exposure tools to eliminate
remaining systematic errors. Even though field-to-field overlay has been a known characterization and control technique
for quite some time, there is still room to further explore and exploit the technique. In particular, it can be used to
characterize systematic errors in a scanner's dynamic exposure behavior. In this paper we investigate the modeling of
field-to-field overlay error starting from a scanner point of view. From a set of general equations we show how
systematic dynamic differences between up and down scanned fields can be extracted from field-to-field overlay
measurements in addition to apparent constant effects. We apply our model to characterize scan speed dependent
dynamic behavior and to verify scanner setup.
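The up/down-scan decomposition described in this abstract can be sketched as a simple least-squares separation; the numbers below are illustrative, not measured overlay data:

```python
import numpy as np

# Hypothetical overlay residuals (nm) at one position in a column of
# fields exposed with alternating scan direction (+1 up, -1 down).
dx = np.array([1.2, 0.4, 1.1, 0.5, 1.3, 0.3])
scan_dir = np.array([+1, -1, +1, -1, +1, -1])

# Model: dx_i = c + s * d_i. A least-squares fit separates the apparent
# constant offset c from the systematic up-vs-down scan difference s.
A = np.column_stack([np.ones_like(dx), scan_dir])
(c, s), *_ = np.linalg.lstsq(A, dx, rcond=None)

print(round(c, 3), round(s, 3))  # constant part and scan-direction part
```

With a balanced up/down design the two basis columns are orthogonal, so c is simply the overall mean and s is half the difference between the up-scanned and down-scanned field means.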
Metrology on 3D features like the line-end gap in an SRAM structure is more challenging than on lines and spaces (L/S)
structures. Scatterometry has been widely used on L/S structures and has enabled characterization of lithographic
features, providing critical dimensions (CD) as well as feature height and side wall angle. In this paper, we will
present the application of scatterometry to these challenging structures using an angle resolved polarized scatterometer:
ASML YieldStar S-100. Measurements of 3D features (line ends, brick walls, ...) will be presented. Measurement capability
will be discussed in terms of the sensitivity of the parameters of interest and the correlation between them, leading to a proper model choice.
Metrology on 3D features like contact holes (CH) is more challenging than on lines and spaces (L/S) structures,
especially if one wants to have profile information. Scatterometry has been widely used on L/S structures and has
enabled characterization of lithographic features, providing critical dimensions (CD) as well as feature height and
side wall angle. In this paper, we will present the application of scatterometry to the measurement of 3D structures using
an angle resolved polarized scatterometer: ASML YieldStar S-100. Contact hole measurements will be presented and
correlation to standard metrology tools will be shown. Measurement capability will be discussed in terms of
reproducibility, calculation time, sensitivity of the parameters of interest and correlation between them leading to a
proper model choice. Finally, initial results on more complex 3D features (line ends, brick walls, ...) will be presented.
In recent years, numerous authors have reported the advantages of Diffraction Based Overlay (DBO) over Image
Based Overlay (IBO), mainly by comparison of metrology figures of merit such as TIS and TMU. Some have even gone
as far as to say that DBO is the only viable overlay metrology technique for advanced technology nodes: 22nm and
beyond. Typically, the only reported drawback of DBO is the size of the required targets. This severely limits its effective
use when all critical layers of a product, including double-patterned layers, need to be measured and in-die overlay
measurements are required.
In this paper we ask whether target size is the only limitation to the adoption of DBO for overlay characterization and
control, or whether there are other metrics that need to be considered, for example overlay accuracy with respect to the
scanner baseline or on-product process overlay control. In this work, we critically re-assess the strengths and weaknesses of
DBO for the applications of scanner baseline and on-product process layer overlay control. A comprehensive comparison
is made to IBO. For on-product process layer control we compare the performance on critical process layers: Gate,
Contact and Metal. In particular, we focus on the response of the scanner to the corrections determined by each
metrology technique for each process layer, as a measure of the accuracy. Our results show that to characterize an
overlay metrology technique that is suitable for use in advanced technology nodes requires much more than just
evaluating the conventional metrology metrics of TIS and TMU.
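For reference, the standard two-biased-pad DBO extraction behind this comparison can be sketched as follows; the sensitivity K and all numbers are illustrative, and K is used only to synthesize the asymmetry signals:

```python
# Minimal sketch of diffraction based overlay (DBO) extraction, assuming
# the intensity asymmetry A between +1/-1 orders is linear in overlay:
# A = K * (OV + bias). All values below are illustrative, not measured.
d = 20.0           # programmed bias per pad (nm)
K = 0.05           # unknown sensitivity (used here only to make data)
true_ov = 3.0      # overlay to recover (nm)

A_plus = K * (true_ov + d)    # asymmetry of the +d biased pad
A_minus = K * (true_ov - d)   # asymmetry of the -d biased pad

# K cancels out of the ratio, so the overlay needs no calibration of K:
ov = d * (A_plus + A_minus) / (A_plus - A_minus)
print(ov)  # recovers approximately 3.0
```

The cancellation of K is what makes the technique attractive in terms of TIS and TMU; the accuracy questions raised in the abstract concern whether the linearity assumption itself holds on real process layers.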
Once a process is set-up in an integrated circuit (IC) manufacturer's fabrication environment, any drift in the proximity
fingerprint of the cluster will negatively impact the yield. In addition to the dose, focus and overlay control of the
cluster, it is therefore of ever-growing importance to monitor and maintain the proximity stability (or CD through
pitch behavior) of each cluster.
In this paper, we report on an experimental proximity stability study of an ASML XT:1900i cluster for a 32 nm poly
process from four different angles. First, we demonstrate the proximity stability over time by weekly wafer exposure and
CD through pitch measurements. Second, we investigate proximity stability from tool-to-tool. In a third approach, the
stability over the exposure field (intra-field through-pitch CD uniformity) is investigated. Finally, we verify that
proximity is maintained through the lot when applying lens heating correction.
Monitoring and maintaining the scanner's optical proximity through time, through the lot, over the field, and from
tool-to-tool involves extensive CD metrology through pitch. In this work, we demonstrate that fast and precise CD through
pitch data acquisition can be obtained by scatterometry (ASML YieldStar™ S-100), which significantly reduces the required metrology time.
The results of this study not only demonstrate the excellent optical proximity stability on a XT:1900i exposure cluster for
a 32 nm poly process, but also show how scatterometry enables thorough optical proximity control in a fabrication environment.
IC manufacturers have a strong demand for transferring a working process from one scanner to another. Recently, a
programmable illuminator (FlexRay™) became available on ASML ArF immersion scanners that, besides all the
parameterized source shapes of the earlier Aerial™ illuminator (based on diffractive optical elements), can also produce
any desired freeform source shape. As a consequence, a fabrication environment may have scanners with each of the
illuminator types so both FlexRay-to-Aerial and FlexRay-to-FlexRay matching is of interest. Moreover, the FlexRay
illuminator itself is interesting from a matching point-of-view, as numerous degrees of freedom are added to the
matching tuning space.
This paper demonstrates that upgrading an exposure tool from the Aerial to the FlexRay illuminator yields identical
proximity behavior without any need for scanner tuning. Also, an assessment of the imaging correspondence between
exposure tools each equipped with a FlexRay illuminator is made. Finally, for a series of use-cases where proximity
differences do exist, the application of FlexRay source tuning is demonstrated. It shows an enhancement of the scanner
matching capabilities, because FlexRay source tuning enables matching where traditional NA and sigma tuning fall
short. Moreover, it enables tuning of freeform sources where sigma tuning is not relevant. Pattern Matcher™
software from ASML Brion is demonstrated for the calculation of the optimized FlexRay tuned sources.
The scatterometry or OCD (Optical CD) metrology technique has in recent years moved from being a general purpose
CD metrology technique to one that addresses the metrology needs of process monitoring and control, where its
strengths can be fully utilized. With the significant advancements that have been made in both hardware and software
design, the setup time required to build complex models and solutions has been significantly reduced. Whilst the
application of scatterometry to process control has clearly shown its merits, the question still arises as to how accurately
the process corrections for feed-forward or feedback control can be extracted.
In this work we critically examine the accuracy of scatterometry with respect to process control by comparing three
hardware platforms, on a simple litho stack. The impact of hardware design is discussed as well as the 'setup' of the
modeled parameters on the final measurement result. It will be shown that information extracted from scatterometry
measurements must be true to process variation and independent of the hardware design. Our results will show that the
ability to use scatterometry effectively for process control ultimately lies in the ability to accurately determine the
changes that have occurred in the process and to be able to extract appropriate process corrections for feedback or feed
forward control; allowing these changes to be accurately corrected. To do this the metrology validation extends beyond
the typical metrology metrics such as precision and TMU; metrology validation with respect to process control must
encompass accurate determination of process corrections to ensure a process tool and/or process stays at the set point.
As critical dimension (CD) control requirements tighten and process windows shrink, it is now even more
important to be able to determine and separate the sources of CD error in an immersion cluster, in order to correct for
them. It has already been reported that the CD error contributors can be attributed to two primary lithographic
parameters: effective dose and focus. In this paper, we demonstrate a method to extract effective dose and focus, based
on diffraction based optical metrology (scatterometry). A physical model is used to describe the CD variations of a
target with controlled focus and dose offsets. This calibrated model enables the extraction of effective dose and focus
fingerprints across wafer and across scanner exposure field. We will show how to optimize the target design and the
process conditions, in order to achieve an accurate and precise de-convolution over a larger range of focus and dose than
the expected variation of the cluster.
This technique is implemented on an ASML XT:1900Gi scanner interfaced with a Sokudo RF3S track. The systematic
focus and dose fingerprints obtained by this de-convolution technique enable identification of the specific contributions
of the track, scanner and reticle. Finally, specific corrections are applied to compensate for these systematic CD variations and a significant improvement in CD uniformity is demonstrated.
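A minimal sketch of such a calibrated-model inversion, assuming (purely for illustration) that sidewall angle responds linearly to focus only, while CD responds linearly to dose and quadratically to focus; the coefficients stand in for a fitted model:

```python
# Illustrative calibration coefficients (assumed, not from the paper):
a0, a1, a2 = 45.0, -0.8, -50.0   # CD  = a0 + a1*E + a2*F**2
b0, b1 = 88.0, 30.0              # SWA = b0 + b1*F

def extract_dose_focus(cd, swa):
    """Invert the calibrated model for effective dose E and focus F."""
    f = (swa - b0) / b1                # focus from SWA (linear response)
    e = (cd - a0 - a2 * f**2) / a1     # dose from CD with focus removed
    return e, f

# Synthesize a 'measurement' at E = 1.5, F = -0.05 and recover it:
E, F = 1.5, -0.05
cd = a0 + a1 * E + a2 * F**2
swa = b0 + b1 * F
print(extract_dose_focus(cd, swa))  # recovers approximately (1.5, -0.05)
```

The linear SWA response resolves the sign ambiguity that the quadratic (Bossung-like) CD-versus-focus response would otherwise leave.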
Numerous metrology tools, techniques and methods are used by the industry to setup and qualify exposure tools for
production. Traditionally, different metrology techniques and tools have been used to setup dose, focus and overlay
optimally and they do so independently. The methods used can be cumbersome, have the potential to interfere with each
other and some even require an unacceptable amount of costly exposure tool time for data acquisition.
In this work, we present a method that uses an advanced angle-resolved scatterometry metrology tool that has the
capability to measure both CD and overlay. By using a technique to de-convolve dose and focus based on the profile
measurement of a well characterized process monitor target, we show that the dose and focus signature of a high NA
193nm immersion scanner can be effectively measured and corrected. A similar approach was also taken to address
overlay errors using the diffraction based overlay capability of our metrology tool. We demonstrate the advantage of having a single metrology tool solution, which enables us to reduce dose, focus and overlay signatures to a minimum.
Given the increasingly stringent CD requirements for double patterning at the 32nm node and beyond, the question arises
as to how best to correct for CD non-uniformity at litho and etch. For example, is it best to apply a dose correction over
the wafer while keeping the PEB plate as uniform as possible, or should the dose be kept constant and PEB CD tuning
used to correct? In this work we present experimental data, obtained on a state-of-the-art ASML XT:1900Gi and Sokudo
RF3S cluster, on both of these approaches, as well as on a combined approach utilizing both PEB CD tuning and dose correction.
Ever since the introduction of immersion lithography, overlay has been a primary concern. Immersion exposure tools
show an overlay fingerprint that we hope to correct for by introducing correctables per field, i.e. a piece-wise
approximation of the fingerprint within the correction capabilities of the exposure tool. If this mechanism is to be
used for reducing overlay errors, it must be stable over an entire batch. This type of correction requires a substantial
amount of measurements; it would therefore be ideal if the fingerprint were also stable over time. These requirements are of
particular importance for double patterning where overlay budgets have been further reduced. Since the variation of the
fingerprint specific to immersion tools creeps directly into the overlay budget, it is important to know how much of the
total overlay error can be attributed to changes in the immersion fingerprint. In this paper we estimate this immersion
specific error but find it to be a very small contributor.
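The correctables-per-field mechanism can be illustrated with synthetic data: a stable per-field fingerprint plus random noise, with the batch mean serving as the piece-wise correction (all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic overlay errors (nm): 9 fields on 5 wafers, built from a
# stable per-field fingerprint plus random measurement noise.
fingerprint = np.array([2.0, -1.0, 0.5, 1.5, -2.0, 0.0, 1.0, -0.5, 0.8])
batch = fingerprint + 0.2 * rng.standard_normal((5, 9))

# Correctables per field: the batch mean approximates the fingerprint,
# which the exposure tool can then subtract field by field.
correctables = batch.mean(axis=0)
residual = batch - correctables

print(np.abs(batch).max(), np.abs(residual).max())
```

If the fingerprint drifts over the batch or over time, the residual after correction grows accordingly; that drift is exactly the immersion-specific contribution the abstract sets out to estimate.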
Given the increasingly stringent CD requirements for double patterning at the 32nm node and beyond, the question arises
as to how best to correct for CD non-uniformity at litho and etch. For example, is it best to apply a dose correction over
the wafer while keeping the PEB plate as uniform as possible, or should the dose be kept constant and PEB plate tuning
used to correct? In this paper we present experimental data using both of these approaches, obtained on an ASML
XT:1900Gi and Sokudo RF3S cluster.
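The dose-correction branch of this trade-off can be sketched as a first-order calculation; the dose sensitivity and CD values below are illustrative assumptions, not measured data:

```python
# Translating a measured CD non-uniformity into per-zone dose offsets,
# assuming a locally linear dose sensitivity dCD/dE (nm per mJ/cm^2).
dcd_ddose = -2.0          # assumed sensitivity: more dose -> smaller CD
target_cd = 45.0          # nm
measured_cd = [45.6, 44.8, 45.2, 45.0]   # one value per wafer zone

# Each zone gets the dose offset that cancels its CD deviation:
dose_offsets = [(target_cd - cd) / dcd_ddose for cd in measured_cd]
print(dose_offsets)  # e.g. zone 1 is 0.6 nm too large -> +0.3 mJ/cm^2
```

PEB plate tuning follows the same logic with a PEB-temperature sensitivity in place of the dose sensitivity, which is why the two knobs can trade off against each other or be combined.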
With the planned introduction of double patterning techniques, the focus of attention has been on tool overlay
performance and whether or not this meets the required overlay for double patterning. However, as we require tighter
and tighter overlay performance, the impact of the selected integration strategy plays a key part in determining the
achievable overlay performance. Very little attention has been given so far to the impact of, for example, deposition,
oxidation and CMP steps on wafer deformation, which degrades overlay
performance and directly reduces the available overlay budget. Also, selecting the optimum alignment strategy to
follow, either direct or indirect alignment, plays an important part in achieving optimum overlay performance. In this
paper we investigate the process impact of various double patterning integration strategies and attempt to show the
importance of selecting the right strategy with respect to achieving a manufacturable double patterning process.
Furthermore, we report a methodology to minimize process overlay by modeling the non-linear grids for
process-induced wafer deformation, and demonstrate the best achievable overlay by feeding this information back to the relevant tools.
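A sketch of the non-linear grid modelling idea, using a synthetic radial deformation in place of real alignment data (all names and values illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic process-induced deformation: dx grows as ~r^2 * x across a
# normalized wafer, plus measurement noise (nm, illustrative).
x, y = rng.uniform(-1, 1, 200), rng.uniform(-1, 1, 200)
dx = 3.0 * x * (x**2 + y**2) + 0.05 * rng.standard_normal(200)

# Compare a linear grid model with a third-order polynomial grid model.
linear = np.column_stack([np.ones_like(x), x, y])
cubic = np.column_stack([linear, x**2, x * y, y**2,
                         x**3, x**2 * y, x * y**2, y**3])

results = {}
for name, A in [("linear", linear), ("3rd order", cubic)]:
    coef, *_ = np.linalg.lstsq(A, dx, rcond=None)
    results[name] = 3 * (dx - A @ coef).std()  # 3-sigma residual (nm)

print(results)  # the high-order model absorbs the non-linear grid
```

The fitted high-order coefficients are what would be fed back as corrections; the linear model's large residual is the overlay penalty of ignoring the process-induced deformation.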
Monitoring of the focus performance is recognized to be an important part of a periodic scanner health check, but can
one simply apply all techniques that have been used for dry scanners to immersion scanners? And if so, how do such
techniques compare to the scanner self-metrology tests that are used to set up the tool? In this paper we look at one specific
off-line focus characterization technique, Back Side Chrome (BSC), which we then try to match with results obtained
from two self-metrology focus tests, available on the scanner chosen for this work. The latter tests are also used to set up
the immersion scanner. We point out a few concerns, discuss their effect and indicate that for each generation of
immersion tool one should redo the entire exercise.
The analog switching mode (sometimes referred to as V-shaped switching mode) of the ferroelectric liquid crystal cell is a recently developed type of liquid crystal cell in which the molecular director can be arbitrarily positioned with high speed on the surface of a cone depending on the steering voltage over the cell. This changes the orientation of the slow and fast axes as well as the amount of the birefringence. We show that theoretically, it is possible to use a V-shaped switched ferroelectric liquid crystal cell to achieve near lossless analog phase modulation between zero and π radians for a special ellipticity of the polarized input light. We also fabricated a cell which slightly deviates from the ideal (tilt cone half-angle 38° instead of 45°) for which near-lossless transmission was obtained, manifested as a < 4% modulation of the amplitude, and a continuous phase modulation between 0 and ~0.8π radians; the values agree very well with numerical simulations.
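The ideal case described above (45° cone half-angle, half-wave retardance) can be checked numerically with Jones calculus; the sketch below assumes the cell acts as a half-wave retarder whose optic axis rotates in plane by an angle theta:

```python
import numpy as np

# Jones matrix of a retarder with retardance delta, fast axis at theta.
def retarder(delta, theta):
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    J = np.array([[np.exp(-1j * delta / 2), 0],
                  [0, np.exp(1j * delta / 2)]])
    return R @ J @ R.T

e_in = np.array([1, 1j]) / np.sqrt(2)   # circular polarized input

# Sweep the in-plane axis angle of a half-wave cell over 0..pi/2.
thetas = np.linspace(0, np.pi / 2, 91)
outs = np.array([retarder(np.pi, t) @ e_in for t in thetas])
amps = np.linalg.norm(outs, axis=1)

# Phase of the output in the opposite-handed circular basis.
e_out_basis = np.array([1, -1j]) / np.sqrt(2)
phases = np.unwrap(np.angle(outs @ e_out_basis.conj()))

print(amps.min(), amps.max())        # both ~1: lossless modulation
print(phases[-1] - phases[0])        # ~pi of phase span
```

For circular input the output stays circular (opposite handedness) with constant amplitude and phase 2θ, so a 90° axis rotation spans the full 0 to π phase range losslessly. A smaller cone, as in the fabricated cell, limits the axis swing and hence the span: an axis rotation of 2 × 38° = 76° corresponds to roughly 0.84π, close to the reported ~0.8π.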
Ferroelectric Liquid Crystal (FLC) Spatial Light Modulators (SLMs) are attractive because of their high switching speed. However, conventional FLC SLMs are only capable of binary phase modulation. This is inconvenient for beam steering since as much as 60% of the incident power is lost to unwanted diffraction orders. To overcome this problem two cascaded FLC SLMs were used in this work. By coherently imaging a 180° binary-phase FLC SLM onto a 90° FLC SLM, with high precision, an effective four-level phase modulator was realized experimentally. Beam steering was demonstrated in the angular range ±10.9 mrad. The angular inaccuracy of the steered beam was found to be about 0.1 mrad, which equals about 25% of the beam diameter. The beam steering device has also been used for tracking experiments.
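The quoted power figures follow from the standard first-order efficiency of an N-level staircase phase grating approximating an ideal blaze, eta = sinc^2(pi/N); the short check below is illustrative:

```python
import math

# First-order diffraction efficiency of an N-level staircase phase
# grating approximating a blazed profile: eta = [sin(pi/N)/(pi/N)]^2.
def efficiency(n_levels):
    x = math.pi / n_levels
    return (math.sin(x) / x) ** 2

print(round(efficiency(2), 3))   # binary: ~0.405, i.e. ~60% lost
print(round(efficiency(4), 3))   # four-level: ~0.811
```

So a binary device steers only about 40.5% of the incident power into the desired order (the ~60% loss cited above), while the effective four-level modulator raises this to about 81%.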
Antiferroelectric liquid crystal materials are very promising for high-resolution displays but so far suffer from two serious problems, both of which reduce the achievable contrast. These materials are first of all very hard to align to a high quality dark state. Most often this has been attributed to the fact that antiferroelectric materials lack a nematic phase. We believe, however, that there are other reasons behind the bad dark state as well, and that these reasons may be even more important. In addition antiferroelectric materials show a thresholdless linear electro-optic effect, conventionally called the 'pretransitional effect,' which gives a dynamic contribution to light leakage under addressing conditions. We have synthesized and now describe a new type of antiferroelectric material which gives an unprecedented black state due to a high static extinction as well as to the absence of a pretransitional effect. The performance of conventional antiferroelectric liquid crystal displays will be considerably enhanced with this kind of material. Among the numerous non- conventional electro-optic applications of the new material several polarizer-free display modes are described together with fast photonic modulation devices.