A lattice-type Monte Carlo–based mesoscale model and simulation of the lithography process have been adapted to study the insoluble particle generation that arises from statistically improbable events. These events occur when there is a connected pathway of soluble material that envelops a volume of insoluble material due to fluctuations in the deprotection profile. The simulation shows that development erodes the insoluble material into the developer stream and produces a cavity on the line edge that can be far larger than a single polymer molecule. The insoluble particles can coalesce to form aggregates that deposit on the wafer surface. The effect of the resist formulation, exposure, postexposure bake, and development variables on particle generation was analyzed in both low- and high-frequency domains. It is suggested that different mechanisms are dominant for the formation of line-edge roughness (LER) at different frequencies. The simulations were used to assess the commonly proposed measures to reduce LER such as the use of low molecular weight polymers, addition of quenchers, varying acid diffusion length, etc. The simulation can be used to help set process variables to minimize the extent of particle generation and LER.
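The low- and high-frequency analysis mentioned above is conventionally carried out on the power spectral density (PSD) of the simulated edge trace. The following is a minimal sketch of such an analysis, not the study's actual post-processing; the synthetic edge, sampling, and all names are illustrative:

```python
import numpy as np

def ler_psd(edge, pixel_nm):
    """One-sided power spectral density of a line-edge position trace.

    edge: 1-D array of edge displacements (nm) at even spacing pixel_nm.
    Returns spatial frequencies (1/nm) and the PSD estimate.
    """
    n = len(edge)
    fluct = edge - edge.mean()                  # remove the mean edge position
    spec = np.fft.rfft(fluct)
    psd = (np.abs(spec) ** 2) * pixel_nm / n    # periodogram estimate
    freqs = np.fft.rfftfreq(n, d=pixel_nm)
    return freqs, psd

# Synthetic rough edge: low-frequency waviness plus high-frequency noise.
rng = np.random.default_rng(0)
x = np.arange(512) * 1.0                        # 1 nm sampling
edge = 2.0 * np.sin(2 * np.pi * x / 128) + rng.normal(0.0, 0.5, 512)
freqs, psd = ler_psd(edge, pixel_nm=1.0)

# Conventional 3-sigma LER from the variance of the edge trace.
ler_3sigma = 3.0 * edge.std()
```

Separating the PSD into low- and high-frequency bands in this way is what allows different roughness mechanisms to be attributed to different frequency ranges.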
Current minimum feature sizes in the microelectronics industry dictate that molecular interactions affect process fidelity
and produce stochastic excursions such as line edge roughness (LER). The composition of future resists is not yet known, so simulation of various resist platforms should provide useful information about resist design that
minimizes LER. In the past, researchers developed a mesoscale model for exploring representative 248 nm resist
systems through dynamic Monte Carlo methods and adaptation of critical ionization theory. This molecular modeling
uses fundamental interaction energies combined with a Metropolis algorithm to model the full lithographic process (spin
coat, PAB, exposure, PEB, and development). Application of this model to 193 nm platforms allows for comparison
between 248 and 193 nm resist systems based on molecular interactions. This paper discusses the fundamental
modifications involved in adapting the mesoscale model to a 193 nm platform and investigates how this new model
predicts well-understood lithographic phenomena including the relationship between LER and aerial image, the
relationship between LER and resist components, and the impact of non-uniform PAG distribution in the resist film.
Limited comparisons between the 193 nm system and an analogous 248 nm platform will be discussed.
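The Metropolis step at the heart of the molecular model accepts or rejects a trial lattice move based on the change in interaction energy. A generic sketch of that criterion follows; the trial energy, temperature scale, and acceptance check are standard textbook Metropolis, not the paper's calibrated parameters:

```python
import math
import random

def metropolis_accept(delta_e, kT, rng=random):
    """Standard Metropolis criterion: always accept moves that lower the
    energy; accept uphill moves with probability exp(-dE/kT)."""
    if delta_e <= 0.0:
        return True
    return rng.random() < math.exp(-delta_e / kT)

# Toy check: a trial move costing 0.3 (arbitrary energy units) at kT = 1
# should be accepted roughly exp(-0.3), i.e. about 74% of the time.
random.seed(1)
accepts = sum(metropolis_accept(0.3, kT=1.0) for _ in range(10000))
ratio = accepts / 10000
```

In a mesoscale lithography model, `delta_e` would be built from the pairwise interaction energies of the species involved in the trial move (polymer segments, PAG, solvent, developer).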
A lattice-type Monte Carlo based mesoscale model and simulation of the lithography process have been described
previously. The model includes the spin coating, post apply bake, exposure, post exposure bake and development
steps. This simulation has been adapted to study the insoluble particle generation that arises from statistically
improbable events. These events occur when there is a connected pathway of soluble material that envelops a volume of
insoluble material due to fluctuations in the deprotection profile that occur during the post exposure bake.
Development erodes the insoluble material into the developer stream as an insoluble particle. This process may produce
a cavity on the line edge that can be far larger than a single polymer molecule. The insoluble particles generated may
coalesce in developer to form large aggregates of insoluble material that ultimately deposit on the wafer surface and the
tooling. The recent modifications to the mesoscale PEB and dissolution models that enabled this study are briefly
described. The particle-detection algorithm used in the current study is also discussed. The
effect of the resist formulation and the different lithographic steps, namely, exposure, post exposure bake and
development, on the extent of particle generation is analyzed. These simulations can be used to set process variables to
minimize the extent of particle generation.
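The particle-detection step described above can be illustrated as a connected-component search: insoluble material that is no longer attached to the bulk resist through other insoluble material is counted as a particle released into the developer. A simplified 2-D flood-fill sketch follows; the actual model is 3-D and lattice-based, and all names here are illustrative:

```python
from collections import deque

def find_particles(grid):
    """grid[i][j]: 1 = insoluble, 0 = soluble (already dissolved).
    Insoluble regions connected to the bottom row (the substrate) are the
    surviving resist; any other insoluble region is a detached particle.
    Returns a list of particle sizes in lattice cells."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]

    def flood(si, sj):
        """Breadth-first flood fill over 4-connected insoluble cells."""
        q = deque([(si, sj)])
        seen[si][sj] = True
        count = 0
        while q:
            i, j = q.popleft()
            count += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols \
                        and grid[ni][nj] == 1 and not seen[ni][nj]:
                    seen[ni][nj] = True
                    q.append((ni, nj))
        return count

    # Mark everything attached to the substrate as surviving resist.
    for j in range(cols):
        if grid[rows - 1][j] == 1 and not seen[rows - 1][j]:
            flood(rows - 1, j)

    # Remaining insoluble clusters are particles swept into the developer.
    return [flood(i, j)
            for i in range(rows) for j in range(cols)
            if grid[i][j] == 1 and not seen[i][j]]

# Example: a 2-cell insoluble blob enveloped by soluble material.
grid = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
    [1, 1, 1, 1],   # resist still attached to the substrate
]
sizes = find_particles(grid)   # → [2]
```

The cavity left behind on the line edge has the size of the detached cluster, which is why a single such event can be far larger than one polymer molecule.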
The current scale of feature sizes in the microelectronics industry has reached the point where molecular level
interactions affect process fidelity and produce excursions from the continuum world, such as line edge roughness (LER).
Here we present a 3D molecular level model based on the adaptation of the critical ionization (CI) theory using a
fundamental interaction energy approach. The model asserts that it is the favorable interaction between the ionized part
of the polymer and the developer solution that renders the polymer soluble. Dynamic Monte Carlo methods were used
in the current model to study the polymer dissolution phenomenon. The surface ionization was captured by employing an
electric double layer at the interface, and polymer motion was simulated using the Metropolis algorithm. The
approximated interaction parameters, for different species in the system, were obtained experimentally and used to
calibrate the simulated dissolution rate response to polymer molecular weight and developer concentration. The
predicted response is in good agreement with experimental dissolution rate data. The simulation results support the
premise of the CI theory and provide insight into the CI model from a new perspective. This model may provide a
means to study the contribution of development to LER and other related defects based on molecular level interactions
between distinct components in the polymer and the developer.
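The critical ionization premise above can be stated compactly: a chain dissolves once the fraction of its ionized sites reaches a critical value, so chain length controls how sharply solubility responds to the deprotection level. The sketch below illustrates only that statistical effect; the 0.5 critical fraction, the independent-site model, and all names are placeholders, not the paper's calibrated values:

```python
import random

def fraction_soluble(n_sites, p_ionize, critical=0.5, trials=2000, rng=None):
    """Monte Carlo estimate of the fraction of chains satisfying the
    critical ionization criterion, with each site independently ionizable
    with probability p_ionize. Placeholder parameters throughout."""
    rng = rng or random.Random(42)
    soluble = 0
    for _ in range(trials):
        ionized = sum(rng.random() < p_ionize for _ in range(n_sites))
        if ionized / n_sites >= critical:
            soluble += 1
    return soluble / trials

# With the site probability below the critical fraction, long chains are
# almost never soluble, while short chains still dissolve occasionally
# because their ionized fraction fluctuates more.
short = fraction_soluble(n_sites=10, p_ionize=0.35)
long_ = fraction_soluble(n_sites=100, p_ionize=0.35)
```

This fluctuation-driven broadening for short chains is one route by which molecular weight enters the dissolution rate response discussed above.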
We present a summary of various methods for inverting top and bottom critical dimension (CD) data to extract dose and
focus information. We explain analytical, numerical, and library inversion techniques in detail, and explore their relative
merits for the purposes of online and offline focus monitoring use models. We also detail the modeling requirements
associated with each inversion technique, and -- for cases where the model form is flexible -- present a cross-validation
methodology for optimizing the response model to fit experimental data. We present modeling and inversion results
from seven exemplary photolithography processes, and study the results from each methodology in detail. While each
method has its own set of advantages and disadvantages, we show that the library method represents the optimum choice
to satisfy a variety of use models while minimizing cost.
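The library method described above amounts to a forward model tabulated on a (dose, focus) grid and inverted at run time by a nearest-neighbor search on the measured (top CD, bottom CD) pair. The sketch below uses an invented quadratic response surface as a stand-in for a calibrated model; all coefficients, grids, and names are illustrative:

```python
import itertools

def build_library(doses, foci, forward):
    """Tabulate predicted (top CD, bottom CD) over a (dose, focus) grid."""
    return [((d, f), forward(d, f)) for d, f in itertools.product(doses, foci)]

def invert(library, top_cd, bot_cd):
    """Return the (dose, focus) grid point whose predicted CDs are the
    least-squares closest to the measured pair."""
    best = min(library,
               key=lambda entry: (entry[1][0] - top_cd) ** 2
                                 + (entry[1][1] - bot_cd) ** 2)
    return best[0]

# Stand-in response model (illustrative coefficients, not calibrated):
# CDs respond linearly to dose and quadratically to focus, with different
# top/bottom focus sensitivity so the focus sign is resolvable.
def forward(dose, focus):
    top = 50.0 - 0.8 * dose + 120.0 * focus ** 2 + 30.0 * focus
    bot = 52.0 - 0.8 * dose - 80.0 * focus ** 2 - 20.0 * focus
    return (top, bot)

doses = [i * 0.5 for i in range(40, 61)]     # 20.0 .. 30.0 mJ/cm^2
foci = [i * 0.01 for i in range(-10, 11)]    # -0.10 .. +0.10 um
lib = build_library(doses, foci, forward)

top, bot = forward(25.0, 0.05)               # simulated "measurement"
dose_hat, focus_hat = invert(lib, top, bot)  # recovers dose ~25.0, focus ~0.05
```

A production library would interpolate between grid points rather than snap to them, but the lookup structure, and the reason it amortizes modeling cost across many measurements, is the same.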