Akaike information criterion to select well-fit resist models (18 March 2015)
Abstract
In the field of model design and selection, there is always a risk that a model is over-fit to the data used to train it. A model is well fit when it describes the physical system rather than the stochastic behavior of the particular data collected. K-fold cross validation is a method to check for this over-fitting: the data are partitioned into k folds, typically between 4 and 10, and the model is calibrated repeatedly, each time holding one fold out for validation. Model training is a computationally expensive operation, however, and given a wide choice of candidate models, calibrating each one repeatedly becomes prohibitively time consuming. The Akaike information criterion (AIC) is an information-theoretic approach to model selection based on the maximized log-likelihood of a given model, and it requires only a single calibration per model. It is used in this study to demonstrate model ranking and selection among compact resist modelforms that have various numbers and types of terms to describe photoresist behavior. We show that AIC corresponds well with K-fold cross validation in selecting the best modelform, and further that over-fitting is, in most cases, not indicated. Even for modelforms with more than 40 fitting parameters, the calibration data set is large enough that the additional parameters are statistically justified, validating the model complexity.
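As a minimal illustration of the selection procedure the abstract describes (not the paper's resist models or data, which are proprietary), the sketch below ranks candidate polynomial models by AIC after a single least-squares calibration each. Under the common Gaussian-residual assumption, AIC reduces to 2k + n·ln(RSS/n), where k is the number of fitted parameters, n the number of observations, and RSS the residual sum of squares; the synthetic quadratic "truth" and all variable names here are hypothetical.

```python
import numpy as np

def aic_gaussian(n, rss, k):
    """AIC under Gaussian residuals: 2k + n*ln(RSS/n).
    Lower is better; the 2k term penalizes extra parameters."""
    return 2 * k + n * np.log(rss / n)

# Synthetic calibration data: a quadratic system plus noise (hypothetical example).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(0.0, 0.1, x.size)

# Candidate "modelforms": polynomials of increasing complexity.
# Each is calibrated exactly once, then scored by AIC.
scores = {}
for degree in range(1, 6):
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    rss = float(resid @ resid)
    scores[degree] = aic_gaussian(x.size, rss, degree + 1)  # k = degree + 1 coefficients

best = min(scores, key=scores.get)
print({d: round(s, 1) for d, s in scores.items()}, "-> best degree:", best)
```

The penalty term is what guards against over-fitting: a higher-degree fit always lowers RSS, but its AIC improves only if the likelihood gain outweighs the 2-per-parameter cost, which mirrors the paper's finding that additional parameters are retained only when the calibration data statistically support them.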
© 2015 Society of Photo-Optical Instrumentation Engineers (SPIE).
Andrew Burbine, David Fryer, John Sturtevant, "Akaike information criterion to select well-fit resist models", Proc. SPIE 9427, Design-Process-Technology Co-optimization for Manufacturability IX, 94270J (18 March 2015); https://doi.org/10.1117/12.2085770
Proceedings, 7 pages.