Process models, including finite element simulations, are important for optimizing the metal cutting process,
allowing industry to make parts faster, better, and at lower cost. Measurements of the process can be used to improve
verify the accuracy of these models. There are many error sources when using infrared radiation thermography to
measure the temperature distribution of the tool, workpiece, and chip during metal cutting. Furthermore, metal cutting
presents unique measurement challenges due to factors such as the high magnification required, high surface speeds,
micro-blackbody effects, and changing emissivity as chips form.
As part of an ongoing effort to improve our understanding of uncertainties associated with these thermographic
measurements, two sets of experiments were performed. One set explored how well the surface temperature of the
cutting tool accurately reflects the internal temperature. This was accomplished by simultaneously measuring the
temperature using both a thermal camera and a thermocouple embedded within the cutting tool.
The other set investigated correcting for motion blur, the point spread function, and the less-than-ideal sensitivity range of
the thermal camera when measuring the shear zone temperature of the chip. In theory, this correction could be performed
using deconvolution. Unfortunately, deconvolutions are sensitive to noise, and it is difficult to gauge the uncertainty of
the computed values. Thus, convolutions of various assumed inputs were computed and compared to the measured
temperatures. Assumed inputs that yielded a good fit to the measured temperatures were considered candidate values.
The range of those candidate values yields a measure of the uncertainty of the calculation.
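As a rough illustration of this forward-modeling approach, the following Python sketch (not the code used in these experiments) blurs hypothetical shear-zone temperature profiles with an assumed Gaussian kernel standing in for the combined point spread function and motion blur, then keeps the candidate inputs whose blurred profiles fit a simulated measured profile within a chosen residual threshold. All profile shapes, kernel widths, temperatures, and thresholds here are illustrative assumptions.

```python
# Illustrative sketch of forward-convolution fitting: candidate "true"
# temperature profiles are blurred with an assumed kernel and compared
# to a measured profile; the spread of acceptable candidates bounds
# the uncertainty. All numeric values below are hypothetical.
import numpy as np

def blur(profile, sigma_px):
    """Convolve a 1-D temperature profile with a Gaussian kernel that
    stands in for the combined point spread function and motion blur."""
    radius = int(4 * sigma_px)
    u = np.arange(-radius, radius + 1)
    kernel = np.exp(-0.5 * (u / sigma_px) ** 2)
    kernel /= kernel.sum()
    return np.convolve(profile, kernel, mode="same")

def candidate_profile(x, peak_temp, width_px, background=300.0):
    """Hypothetical sharp shear-zone profile: a narrow Gaussian hot
    band on a uniform background (temperatures in kelvin)."""
    return background + (peak_temp - background) * np.exp(-0.5 * (x / width_px) ** 2)

x = np.arange(-100, 101)          # pixel coordinates across the profile
sigma_psf = 3.0                   # assumed blur width, pixels

# Synthetic "measured" data: a blurred hot band plus camera noise.
rng = np.random.default_rng(0)
measured = blur(candidate_profile(x, 1100.0, 2.0), sigma_psf)
measured += rng.normal(0.0, 5.0, measured.size)

# Forward-model sweep: keep (peak, width) pairs whose blurred profile
# matches the measurement within a chosen rms residual threshold.
threshold = 6.0                   # K, acceptance cutoff
accepted_peaks = []
for peak in np.arange(900.0, 1300.0, 10.0):
    for width in np.arange(1.0, 5.0, 0.25):
        model = blur(candidate_profile(x, peak, width), sigma_psf)
        rms = np.sqrt(np.mean((model - measured) ** 2))
        if rms < threshold:
            accepted_peaks.append(peak)

# The range of accepted peak temperatures is the uncertainty measure.
if accepted_peaks:
    print(f"peak temperature: {min(accepted_peaks):.0f} K to {max(accepted_peaks):.0f} K")
else:
    print("no candidates within threshold; widen the sweep or relax the cutoff")
```

In this sketch the spread of the accepted peak temperatures plays the role of the uncertainty measure described above: each accepted input is consistent with the measurement once blurring is accounted for, so the measurement alone cannot distinguish among them.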