In an earlier paper, "Regression Analysis of Zernike Polynomials" (Proceedings of SPIE, Vol. 18, pp. 392-398), the least-squares fitting of Zernike polynomials was examined from the point of view of linear statistical regression theory. Among the topics discussed were measures of goodness of fit, tests of the underlying assumptions of normality and constant variance, the treatment of outliers, the analysis of residuals, and the computation of confidence intervals for the coefficients. The present paper is a continuation of the earlier one and concerns applications of relatively new advances in certain areas of statistical theory made possible by the advent of the high-speed computer. Among these are:

1. Jackknife - a technique for improving the accuracy of a statistical estimate by recomputing it with one observation omitted at a time.

2. Bootstrap - a technique for improving the accuracy of an estimate by generating new samples of data, with replacement, from a given data set.

3. Cross-validation - the division of a data set into two halves, the first of which is used to fit the model and the second to see how well the fitted model predicts the data.

The exposition is mainly by examples.
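The second of the techniques listed above can be sketched in a few lines. The following snippet is an illustrative assumption, not taken from the paper: it fits a straight line to synthetic data by least squares and uses the bootstrap to attach a percentile confidence interval to the slope. All names, the model y = 2x + 1 + noise, and the sample sizes are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (assumed for illustration): y = 2x + 1 plus small noise.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)

def fit_slope(x, y):
    # Ordinary least-squares straight-line fit; return the slope.
    slope, intercept = np.polyfit(x, y, 1)
    return slope

# Bootstrap: resample (x, y) pairs with replacement and refit each time.
n_boot = 1000
slopes = np.empty(n_boot)
for b in range(n_boot):
    idx = rng.integers(0, x.size, size=x.size)
    slopes[b] = fit_slope(x[idx], y[idx])

# Percentile confidence interval for the slope.
lo, hi = np.percentile(slopes, [2.5, 97.5])
print(f"slope estimate: {fit_slope(x, y):.3f}")
print(f"95% bootstrap CI: [{lo:.3f}, {hi:.3f}]")
```

The same resampling loop applies to any regression coefficient, including Zernike coefficients; only the fitting function changes. The jackknife replaces the random resamples with the n leave-one-out samples, and cross-validation replaces them with a single fit/predict split.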