As Moore's Law drives critical dimensions (CD) ever smaller, the overlay budget is shrinking rapidly. Furthermore, the cost of advanced lithography tools prohibits the use of the most advanced scanners on non-critical layers, so different layers are exposed with different tools, a practice commonly known as 'mix and match.' Since each tool has a unique signature, mix and match becomes a source of high-order overlay errors. Scanner alignment performance can be degraded by a factor of 2 in mix and match compared to single-tool overlay operation. In a production environment where scanners from different vendors are mixed, the errors are even more significant. Mix and match may also apply to a single scanner when multiple illumination modes are used to expose critical levels, because each illumination mode has a different impact on the scanner's aberration fingerprint. The semiconductor technology roadmap has reached a point where such errors are no longer negligible.
Mix and match overlay errors consist of a scanner stage grid component, a scanner field distortion component, and process-induced wafer distortion. The scanner components are largely systematic, so they can be characterized on non-product wafers using a dedicated reticle. Since these components are known to drift over time, it becomes necessary to monitor them periodically, per scanner and per illumination mode.
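As an illustration only, not the authors' method, the systematic stage-grid component is commonly modeled with linear terms (translation, scale, rotation per axis) fitted to measured overlay data by least squares; the residual after removing the fit is the high-order part that mix-and-match control must address. The sketch below uses synthetic data, and all parameter names and values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical overlay-mark positions on the wafer (mm from center).
x = rng.uniform(-100.0, 100.0, 50)
y = rng.uniform(-100.0, 100.0, 50)

# Assumed "true" grid signature for this illustration:
# translation (nm), scale and rotation (nm per mm).
Tx, Ty = 2.0, -1.5
Sx, Sy = 0.05, -0.03
Rx, Ry = 0.02, 0.02

# Synthetic overlay measurements: linear grid model plus measurement noise.
dx = Tx + Sx * x - Rx * y + rng.normal(0.0, 0.3, x.size)
dy = Ty + Sy * y + Ry * x + rng.normal(0.0, 0.3, y.size)

# Per-axis design matrices: columns are translation, scale, rotation terms.
Ax = np.column_stack([np.ones_like(x), x, -y])
Ay = np.column_stack([np.ones_like(y), y, x])

# Least-squares fit of the systematic (correctable) grid component.
px, *_ = np.linalg.lstsq(Ax, dx, rcond=None)
py, *_ = np.linalg.lstsq(Ay, dy, rcond=None)

# Residuals: the non-correctable, high-order overlay error.
res_x = dx - Ax @ px
res_y = dy - Ay @ py

print("fitted x-axis [Tx, Sx, Rx]:", px)
print("fitted y-axis [Ty, Sy, Ry]:", py)
print("3-sigma residual x (nm):", 3.0 * res_x.std())
```

In practice the model would include more terms (non-orthogonality, wafer expansion, per-field intrafield coefficients), but the fit-and-subtract structure is the same: the fitted coefficients feed tool corrections, and the residual is monitored per scanner and per illumination mode.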
In this paper, we outline a methodology for automating the characterization of mix and match errors, and a control system for real-time correction.