Interferometer wavefront bias is a complicated function of the optical imperfections in the instrument. An interferometric measurement of the figure error on a micro-refractive lens therefore requires careful calibration to separate instrument bias from errors on the part. A self-calibration method such as the random ball test (RBT) accomplishes this task without the need for a high-quality calibration artifact. The test, an averaging technique applied to a series of sphere surface patches, allows calibration of the interferometric wavefront bias. Recent studies of the random ball test have shown that the calibration is affected by ray-trace errors that depend on the curvature of the ball used for the test and become significant in the micro-optic range (radius less than 1 millimeter). A comprehensive ray-trace simulation of the random ball test with modifiable variables was created using MATLAB® and ZEMAX® to allow further investigation into the relationships among test-lens misalignment, curvature, numerical aperture, ball figure error, and interferometer bias. The model hinges on defining a sphere in terms of a set of spherical harmonic functions and varying the amplitudes and the number of functions to adjust the figure error on the sphere. The flexible simulation can be fine-tuned to model a variety of interferometers with different specifications. Our ultimate goal is to confirm the validity of the RBT, determine an efficient method of implementation, and understand the aspects impacting calibration uncertainty.
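The averaging principle behind the random ball test, and the idea of building the ball's figure error from low-order spherical harmonics, can be sketched numerically. The following is a minimal, synthetic Python illustration (the actual study used MATLAB® and ZEMAX®): each simulated measurement views a randomly oriented patch of a ball whose figure error is a short sum of degree-2 spherical harmonics, plus a fixed instrument bias; averaging many such measurements recovers the bias. All amplitudes, the patch half-angle, and the Gaussian bias shape are invented for illustration and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Detector grid: directions within a cone about +z (hypothetical patch size)
n = 32
half_angle = 0.25  # radians; stands in for the numerical-aperture cone
u = np.linspace(-half_angle, half_angle, n)
ax, ay = np.meshgrid(u, u)
sx, sy = np.sin(ax), np.sin(ay)
dirs = np.stack([sx, sy, np.sqrt(np.clip(1 - sx**2 - sy**2, 0.0, 1.0))], axis=-1)

def ball_error(v):
    """Figure error on the ball: a sum of degree-2 real spherical harmonics
    (amplitudes in nm are made up for illustration)."""
    x, y, z = v[..., 0], v[..., 1], v[..., 2]
    return 5.0 * (3 * z**2 - 1) / 2 + 3.0 * x * z + 2.0 * (x**2 - y**2)

# Fixed instrument bias we want to recover (also synthetic, in nm)
bias = 10.0 * np.exp(-((ax / half_angle) ** 2 + (ay / half_angle) ** 2))

def random_rotation(rng):
    """Random orthogonal matrix via QR of a Gaussian matrix (uniform on O(3))."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    return q * np.sign(np.diag(r))

# Random ball test: average measurements over many random ball orientations;
# the ball's error averages toward zero, leaving an estimate of the bias.
n_meas = 400
acc = np.zeros_like(bias)
for _ in range(n_meas):
    R = random_rotation(rng)
    acc += bias + ball_error(dirs @ R.T)
bias_estimate = acc / n_meas

residual = bias_estimate - bias
print("rms residual (nm):", np.sqrt(np.mean(residual**2)))
```

Because the mean of any nonconstant spherical harmonic over uniformly random orientations is zero, the per-pixel residual shrinks roughly as the ball error divided by the square root of the number of measurements; the sketch deliberately omits the curvature-dependent ray-trace errors that the full simulation is built to study.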