One of the major challenges in building optical proximity correction (OPC) models is maximizing the coverage of real design features with the sampling patterns. Normally, OPC model building is based on 1-D and 2-D test patterns whose pitches are varied systematically in line with the design rules. However, because IC designs are effectively infinite while the number of model test patterns is limited, features with different optical and geometric properties will generate weak-points where the OPC simulation cannot precisely predict the resist contours on wafer. In this paper, optical-property data of real design features were collected from full chips and classified for comparison against the same kind of data from the OPC test patterns. Sample coverage could therefore be mapped visually as a function of optical properties. Design features that fall outside the OPC model's capability were distinguished by their optical properties and marked as weak-points. New patterns with similar optical properties could then be added to the model-build site list. Furthermore, an alternative and more efficient method was developed in this work to improve the treatment of problem features and remove weak-points without rebuilding the models. Since certain classes of optical properties are known to generate weak-points, an OPC-integrated repair algorithm was developed and implemented to scan the full chip for these optical properties, locate the corresponding features, and then optimize the OPC treatment or apply precise sizing on site. This constitutes an "in-situ" weak-point improvement flow comprising issue-feature definition, location in the full chip, and real-time improvement.
Proc. SPIE. 8327, Design for Manufacturability through Design-Process Integration VI
KEYWORDS: Photovoltaics, Metals, Image processing, 3D modeling, Scanning electron microscopy, Design for manufacturing, Optical proximity correction, Semiconducting wafers, Process modeling, Chemical mechanical planarization
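The scan-classify-flag portion of the in-situ flow above can be sketched as a simple coverage check. This is a minimal illustration, not the paper's implementation: the property names (`pitch`, `width`, `image_contrast`), the binning granularity, and the contrast threshold are all hypothetical stand-ins for the real optical-property classification.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    """A design feature described by simple optical/geometric properties."""
    pitch: float            # nm (hypothetical property set)
    width: float            # nm
    image_contrast: float   # normalized aerial-image contrast, 0..1

def classify_weak_points(features, covered_bins, contrast_floor=0.3):
    """Flag features whose (pitch, width) bin is absent from the model
    test-pattern coverage map, or whose simulated contrast is too low."""
    weak = []
    for f in features:
        # Quantize properties into the same bins used for the coverage map.
        bin_key = (round(f.pitch / 10) * 10, round(f.width / 5) * 5)
        if bin_key not in covered_bins or f.image_contrast < contrast_floor:
            weak.append(f)
    return weak

# Example: one feature lands in an uncovered bin, one has low contrast.
covered = {(130, 65), (180, 90)}
feats = [Feature(128, 64, 0.55),   # maps to covered bin (130, 65) -> ok
         Feature(250, 120, 0.60),  # bin (250, 120) not covered -> weak
         Feature(132, 66, 0.20)]   # covered bin but low contrast -> weak
print(len(classify_weak_points(feats, covered)))  # 2
```

Once flagged, such features would be handed to the repair step (treatment optimization or on-site sizing) rather than triggering a model rebuild.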
As a result, low-fidelity patterns due to process variations can be detected, and eventually corrected by designers, as early in the tape-out flow as right after design rule checking (DRC), a step that is no longer capable of fully accounting for process constraints on its own. This flow has proven to provide a more adequate level of accuracy when correlating systematic defects seen on wafer with those identified through LFD simulations. However, at 32nm and below, distorted patterns caused by process variation remain unavoidable, and, given the current state of defect inspection metrology tools, these pattern failures are becoming more challenging to detect. In this paper, a methodology for advanced process-window simulation with awareness of chip topology is presented. The method identifies the focal range that different areas within a design are expected to encounter due to their different topology.
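The per-region focus bookkeeping described above can be sketched as follows. This is a toy calculation under assumed numbers (a nominal depth of focus and per-region height offsets), not the paper's simulation method: a region whose local topology shifts its best focus by some amount loses that amount from its usable focus window.

```python
def effective_focus_window(nominal_window, topo_offsets):
    """Given a nominal (min_focus, max_focus) window in nm and per-region
    topology-induced focus offsets, return each region's remaining usable
    focus range after shifting by its local offset."""
    lo, hi = nominal_window
    windows = {}
    for region, dz in topo_offsets.items():
        # A region sitting dz higher effectively shifts its best focus by dz.
        r_lo, r_hi = lo + dz, hi + dz
        # Usable range is the overlap with the scanner's nominal window.
        windows[region] = max(0.0, min(hi, r_hi) - max(lo, r_lo))
    return windows

# Illustrative numbers: 120 nm nominal depth of focus; assume the dense
# logic area sits 30 nm higher than the SRAM area after CMP.
print(effective_focus_window((-60, 60), {"logic": 30.0, "sram": 0.0}))
# {'logic': 90.0, 'sram': 120.0}
```

Regions whose usable window collapses toward zero are exactly the areas the topology-aware simulation would mark for closer process-window inspection.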
As the industry progresses toward smaller patterning nodes with tighter CD error budgets and narrower process windows, the ability to control pattern quality becomes a critical, yield-limiting factor. In addition, as the feature sizes of design layouts continue to decrease at 32nm and below, optical proximity correction (OPC) technology becomes more complex and more difficult. From a lithographic point of view, what matters most is that the patterns print as designed. However, the lithography process can induce unfavorable localized CD variation, which causes catastrophic patterning failures (i.e., ripple effects and severe necking or bridging) through process variation. The problem becomes even more severe with strong off-axis illumination conditions and other resolution enhancement techniques (RETs). Traditionally, it can be reduced by optimizing the rule-based edge fragmentation in the OPC setup, but this fragmentation optimization depends heavily on the engineer's skill. Most fragmentation is based on a set of simple rules, and those rules may not always be robust for every possible design shape.
In this paper, a model-based approach to resolving these imaging distortions has been tested, as opposed to the previous rule-based one. The model-based approach is an automatic correction technique that reduces the complexity of the OPC recipe. It automatically adjusts fragment lengths, along with the feedback values, at every OPC iteration for better convergence. The stability and coverage of this model-based approach have been tested throughout various test cases.
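The feedback loop at the heart of such a model-based approach can be sketched in a few lines. This is a deliberately simplified illustration, not the paper's algorithm: the lithography model is replaced by an arbitrary callable returning edge-placement error (EPE), and the damping factor and iteration count are assumed values.

```python
def opc_iterate(edges, simulate_epe, feedback=0.6, iterations=10):
    """Toy model-based OPC loop: each edge fragment moves opposite to its
    simulated edge-placement error (EPE), damped by a feedback factor.
    `simulate_epe` stands in for the lithography model; any callable
    mapping fragment position -> EPE works here."""
    for _ in range(iterations):
        edges = [e - feedback * simulate_epe(e) for e in edges]
    return edges

# A stand-in "model": EPE grows linearly with displacement from target 0.
epe = lambda x: 0.5 * x
result = opc_iterate([10.0, -8.0], epe)
print(all(abs(e) < 0.5 for e in result))  # True: fragments converge
```

A real implementation would additionally re-fragment long edges whose simulated EPE varies along their length, which is the fragment-length adjustment the abstract refers to.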
As integrated circuit technology advances and features shrink, the scale of the critical dimension (CD) variations induced by lithography effects becomes comparable with the critical dimensions of the designs themselves. At the same time, each technology node requires tighter margins for the errors introduced in the lithography process. Optical and process models, the black boxes that simulate pattern transfer onto silicon, are becoming more and more concerned with these different process errors. As a consequence, an optical proximity correction (OPC) model consists mainly of two parts: a physical part dealing with the physics of light and its behavior through the lithographic patterning process, and an empirical part accounting for any process errors that might be introduced between writing the mask and measuring sampled patterns on wafer. Understanding how such errors can affect a model's stability and predictability, and taking them into consideration while building the model, can actually improve the convergence, stability, and predictability of the model on design patterns other than those used during model calibration and verification. This paper explores one method to quickly enhance model accuracy and stability.
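The physical/empirical split described above can be illustrated with a toy model. Everything here is an assumption for illustration: a Gaussian blur stands in for the physical (diffraction) part, and a single fitted bias term stands in for the empirical part; real OPC models use far richer forms of both.

```python
import math

def aerial_intensity(x, edges, sigma=30.0):
    """Physical part (toy): blur an ideal binary mask with a Gaussian,
    approximating diffraction-limited imaging. `edges` = (left, right)
    of a single clear opening in nm; returns intensity at position x."""
    l, r = edges
    # Integral of a Gaussian PSF over the opening = difference of CDFs.
    phi = lambda t: 0.5 * (1 + math.erf(t / (sigma * math.sqrt(2))))
    return phi(x - l) - phi(x - r)

def printed_edge(edges, threshold=0.5, empirical_bias=2.0, step=0.1):
    """Empirical part (toy): find where intensity crosses a calibrated
    threshold, then apply a fitted bias covering non-optical errors."""
    l, r = edges
    x = l - 100.0
    while aerial_intensity(x, edges) < threshold and x < r:
        x += step
    return x + empirical_bias  # bias absorbs systematic process error

# For a symmetric 200 nm opening, the 0.5-threshold crossing sits at the
# geometric edge; the printed edge is then shifted only by the bias term.
print(printed_edge((-100.0, 100.0)))
```

Calibration of the empirical part amounts to fitting terms like `empirical_bias` so simulated contours match wafer measurements; the paper's concern is how the errors absorbed this way affect model stability.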
In microelectronics manufacturing, photolithography is the art of transferring the pattern shapes printed on a mask to silicon wafers by the use of special imaging systems. These imaging systems stopped reducing the exposure wavelength at 193nm; the industry's demand for tighter design shapes and smaller structures on wafer, however, has not stopped. To overcome some of the restrictions associated with the photographic process, new methods for Resolution Enhancement Techniques (RET) are constantly being explored and applied. An essential step in any RET method is Optical Proximity Correction (OPC). In this process, the edges of the desired target shapes are manipulated to compensate for light diffraction effects so that the shapes on wafer come out as close as possible to the desired shapes. Manipulation of the shapes is always restricted by Mask Rule Checks (MRCs). The MRCs are the rules that ensure the pattern coming out of OPC can be printed on the mask without any catastrophic faults. Essential as they are, MRCs also place constraints on the solutions explored by the OPC algorithms.
In this paper, an automated algorithm has been implemented to overcome the MRC limitations on RET by decomposing the original layout at the places where the regular RET hits the MRC limits during OPC. This algorithm has been applied to test cases where simulation results showed much better printability than the conventional solutions. The solution has also been tested and verified on silicon.
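The MRC constraint that triggers such a decomposition can be illustrated with a small sketch. This is not the paper's decomposition algorithm; it only shows how a desired OPC edge movement gets clamped when it would violate assumed (illustrative) minimum-width and minimum-space rules, which is exactly the situation where the layout would instead be decomposed.

```python
def clamp_to_mrc(target_width, desired_move,
                 min_width=40.0, min_space=40.0, pitch=120.0):
    """Clamp a symmetric OPC edge movement (nm per side) so the corrected
    line neither drops below the MRC minimum width nor squeezes the space
    to its neighbor below the minimum space. Numbers are illustrative."""
    new_width = target_width + 2 * desired_move
    space = pitch - new_width
    move = desired_move
    if new_width < min_width:                 # line would get too thin
        move = (min_width - target_width) / 2
    elif space < min_space:                   # space would get too tight
        move = (pitch - min_space - target_width) / 2
    return move

# OPC wants to shrink a 50 nm line by 10 nm/side -> would violate min width.
print(clamp_to_mrc(50.0, -10.0))  # -5.0 : clamped so width stays at 40 nm
```

Whenever the clamped move differs from the model's desired move, the correction is no longer optimal for printability; the paper's approach reacts to these hit points by decomposing the layout there rather than accepting the clamp.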
In the state-of-the-art integrated circuit industry, for transistor gate lengths of 45nm and beyond, the sharp distinction between the design and fabrication phases is becoming inadequate for fast product development. Lithographic information has to be passed from the foundries to the designers along with the design rules, since these effects must be taken into consideration during the design stage to ensure a Lithographically Friendly Design; this in turn demands new communication channels between designers and foundries to provide the needed litho information. In the case of fabless design houses, this requirement faces problems such as incompatible EDA platforms at the two ends, and confidential information that cannot be revealed by the foundry back to the design house.
In this paper we propose a framework in which we demonstrate a systematic approach to matching any lithographical OPC solution from a different EDA vendor into Calibre™. The goal is to export how the design will look on wafer from the foundry to the designers without revealing how, and without requiring installation of the same EDA tools.
In the developed framework, we demonstrate the flow used to match all the steps involved in developing the OPC, starting from the lithography modeling and going through the OPC recipe. This is done through automated scripts that characterize the existing OPC foundry solution and identify compatible counterparts in the Calibre™ domain, generating an encrypted package that can be used on the designers' side.
Finally, the framework is verified using a developed test case.
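The characterize-and-match step of such a flow can be sketched as a table lookup over recipe steps. All names here are hypothetical placeholders, not real recipe operations on either side; the point is only the shape of the flow: recognized steps are translated, unrecognized ones are surfaced for manual handling.

```python
def match_recipe(foundry_steps, step_map):
    """Translate each recognized foundry OPC-recipe step into its assumed
    Calibre-side counterpart; unrecognized steps are returned separately
    for manual review rather than silently dropped."""
    matched = {s: step_map[s] for s in foundry_steps if s in step_map}
    unmatched = [s for s in foundry_steps if s not in step_map]
    return matched, unmatched

# Hypothetical step names on both sides, for illustration only.
step_map = {"fragment_edges": "calibre_fragment",
            "simulate_model": "calibre_simulate",
            "move_edges":     "calibre_correct"}
m, u = match_recipe(["fragment_edges", "custom_hotfix", "move_edges"],
                    step_map)
print(m, u)  # two matched steps; 'custom_hotfix' flagged for review
```

In the proposed framework the matched result would then be wrapped in an encrypted package, so the design house receives a usable Calibre™ flow without seeing the foundry's recipe internals.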