Metrology Fundamentals: Measurement System Characterization and Calibration Using Traditional Definitions
Abstract
Many previously published documents describe how to perform measurement system analysis (also known as a gauge study). Some of the more prominent examples are the SEMI standard SEMI E89-0707, the NIST technical note “Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results,” and the Metrology chapter of the International Technology Roadmap for Semiconductors (ITRS), which is published every year. This chapter introduces some basic concepts and summarizes the major elements covered in these documents. (Chapter 3 discusses limitations of the prior art and the need to address them by creating new methodologies more appropriate for today’s IC industry.) Two words are frequently used to discuss measurement data: precision and accuracy. What do these two words mean, and how are they related? Figure 2.1 illustrates a frequently used set of diagrams. A “precise” measurement means that the measurement data are close to each other. An “accurate” measurement means that the measurement data are close to the “true” value of the measurement target, which is a known reference. However, fully calibrating a measurement system requires performance metrics beyond precision and accuracy.
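To make the distinction concrete, the following is a minimal Python sketch of how precision and accuracy might be quantified from repeated measurements of a known reference. The data values, the bias definition (mean minus reference), and the choice of reporting precision as a 3-sigma value are illustrative assumptions, not results taken from this chapter or from SEMI E89-0707.

```python
import statistics

# Hypothetical repeated measurements of a reference artifact with a known value.
# (Both the measurements and the reference value are illustrative assumptions.)
reference_value = 100.0
measurements = [100.4, 99.8, 100.1, 100.3, 99.9, 100.2]

mean_value = statistics.mean(measurements)
sigma = statistics.stdev(measurements)   # sample standard deviation of the repeats

# Precision: how close the measurements are to each other,
# reported here as a 3-sigma spread (one common convention in gauge studies).
precision_3sigma = 3 * sigma

# Accuracy (bias): how close the measurements are to the known reference value.
bias = mean_value - reference_value

print(f"mean               = {mean_value:.3f}")
print(f"precision (3-sigma) = {precision_3sigma:.3f}")
print(f"bias vs. reference  = {bias:+.3f}")
```

A data set can score well on one metric and poorly on the other: a small 3-sigma spread with a large bias is precise but not accurate, while a small bias with a large spread is accurate on average but not precise.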
KEYWORDS: Calibration, Metrology, Precision measurement, Semiconductors, Standards development