Regression Analysis Of Zernike Polynomials Part II
27 January 1989
Abstract
In an earlier paper entitled "Regression Analysis of Zernike Polynomials," Proceedings of SPIE, Vol. 18, pp. 392-398, the least-squares fitting of Zernike polynomials was examined from the point of view of linear statistical regression theory. Among the topics discussed were measures of the goodness of fit, tests of the underlying assumptions of normality and constant variance, the treatment of outliers, the analysis of residuals, and the computation of confidence intervals for the coefficients. The present paper is a continuation of the earlier paper and concerns applications of relatively new advances in certain areas of statistical theory made possible by the advent of the high-speed computer. Among these are:
1. Jackknife - a technique for improving the accuracy of a statistical estimate by recomputing it with each observation left out in turn.
2. Bootstrap - improving the accuracy of an estimate by generating new samples, drawn with replacement, from the given data set.
3. Cross-validation - dividing a data set into two halves, the first half of which is used to fit the model and the second half to see how well the fitted model predicts the data.
The exposition is mainly by examples.
© 1989 Society of Photo-Optical Instrumentation Engineers (SPIE).
Louis D. Grey, "Regression Analysis Of Zernike Polynomials Part II", Proc. SPIE 0965, Current Developments in Optical Engineering III, (27 January 1989); doi: 10.1117/12.948034; https://doi.org/10.1117/12.948034
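The following is a minimal sketch, not the author's code, of how the three resampling ideas named in the abstract (jackknife, bootstrap, cross-validation) might be applied to a least-squares fit of a few low-order Zernike terms. The simulated pupil samples, noise level, coefficient values, and choice of terms are all hypothetical and serve only to make the sketch runnable.

```python
import numpy as np

rng = np.random.default_rng(0)

def zernike_design(x, y):
    """Design matrix with a few low-order Zernike terms (Cartesian form):
    piston, x-tilt, y-tilt, defocus, and the two astigmatism terms."""
    r2 = x**2 + y**2
    return np.column_stack([
        np.ones_like(x),   # piston
        x,                 # tilt x
        y,                 # tilt y
        2.0 * r2 - 1.0,    # defocus
        x**2 - y**2,       # astigmatism 0/90
        2.0 * x * y,       # astigmatism 45
    ])

# Simulated wavefront measurements on the unit pupil (assumed data).
n = 200
theta = rng.uniform(0.0, 2.0 * np.pi, n)
rho = np.sqrt(rng.uniform(0.0, 1.0, n))
x, y = rho * np.cos(theta), rho * np.sin(theta)
true_coef = np.array([0.1, 0.5, -0.3, 0.8, 0.2, -0.1])   # hypothetical values
A = zernike_design(x, y)
w = A @ true_coef + rng.normal(scale=0.05, size=n)       # noisy samples

def lstsq_fit(A, w):
    return np.linalg.lstsq(A, w, rcond=None)[0]

coef = lstsq_fit(A, w)

# 1. Jackknife: refit with each observation left out in turn; the spread of
#    the leave-one-out estimates gives a standard error for each coefficient.
loo = np.array([lstsq_fit(np.delete(A, i, axis=0), np.delete(w, i))
                for i in range(n)])
jack_se = np.sqrt((n - 1) / n * ((loo - loo.mean(axis=0))**2).sum(axis=0))

# 2. Bootstrap: resample the observations with replacement and refit many
#    times; percentiles of the refitted coefficients give confidence intervals.
B = 1000
boot = np.empty((B, A.shape[1]))
for b in range(B):
    idx = rng.integers(0, n, n)
    boot[b] = lstsq_fit(A[idx], w[idx])
ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5], axis=0)

# 3. Cross-validation: fit on one half of the data and measure how well the
#    fitted model predicts the held-out half.
perm = rng.permutation(n)
train, test = perm[: n // 2], perm[n // 2:]
coef_half = lstsq_fit(A[train], w[train])
pred_rmse = np.sqrt(np.mean((A[test] @ coef_half - w[test])**2))

print("coefficients:        ", np.round(coef, 3))
print("jackknife std. err.: ", np.round(jack_se, 3))
print("bootstrap 95% CI lo: ", np.round(ci_lo, 3))
print("bootstrap 95% CI hi: ", np.round(ci_hi, 3))
print("half-sample prediction RMSE:", round(pred_rmse, 4))
```

The jackknife standard errors and bootstrap percentile intervals can be compared with the normal-theory confidence intervals discussed in the earlier paper; the half-sample prediction error indicates whether the chosen set of Zernike terms generalizes beyond the data used to fit it.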