This PDF file contains the front matter associated with SPIE
Proceedings Volume 7066, including the Title Page, Copyright
information, Table of Contents, Introduction (if any), and the
Conference Committee listing.
As global R&D competition intensifies, faster measurement instruments are required both in laboratories and in production processes. In the machinery sector, while contact-type coordinate measuring machines (CMMs) have been widely used, noncontact CMMs, which can measure an enormous number of points at once, are growing in market share. Nevertheless, since no industrial standard for accuracy testing of noncontact CMMs exists, each manufacturer specifies the accuracy of its products according to its own rules, and this situation confuses customers. The working group ISO/TC 213/WG 10 is preparing a new ISO standard that stipulates an accuracy test for noncontact CMMs. The concept and the current state of discussion of this new standard will be explained. At the National Metrology Institute of Japan (NMIJ), we are collecting measurement data that serve as a technical background for the standard, together with a consortium formed by users and manufacturers. This activity will also be presented.
The field of 3D optical metrology has seen significant growth in the commercial market in recent years. The methods of using structured light to obtain 3D range data are well documented in the literature and continue to be an area of development in universities. However, the step between obtaining 3D data and obtaining geometrically correct 3D data that can be used for metrology is not nearly as well developed. Mechanical metrology systems such as CMMs have long-established standard means of verifying their geometric accuracy. Both local and volumetric measurements are characterized on such systems using tooling balls, grid plates, and ball bars. This paper will explore the tools needed to characterize and calibrate an optical metrology system, discuss the nature of the geometric errors often found in such systems, and suggest what may be a viable standard method for characterizing 3D optical systems. Finally, we present a tradeoff analysis of ways to correct geometric errors in optical systems, considering what can be gained by hardware methods versus software corrections.
Recently, Range Imaging (RIM) cameras have become available that capture high resolution range images at video
rate. Such cameras measure the distance to the scene for each pixel independently, based upon a measured time of
flight (TOF). Some cameras, such as the SwissRanger(tm) SR-3000, measure the TOF based on the phase shift of reflected
light from a modulated light source. Such cameras are shown to be susceptible to severe distortions in the measured
range due to light scattering within the lens and camera. Earlier work introduced a simplified Gaussian point spread function and inverse filtering to compensate for such distortions. In this work a method is proposed to identify
and use generally shaped empirical models for the point spread function to get a more accurate compensation. The
otherwise difficult inverse problem is solved by using the forward model iteratively, according to well established
procedures from image restoration. Each iteration is done as a sequential process, starting with the brightest parts of the
image and then moving sequentially to the least bright parts, with each step subtracting the estimated effects from the
measurements. This approach gives a faster and more reliable compensation convergence. An average reduction of the
error by more than 60% is demonstrated on real images. The computation load corresponds to one or two convolutions of
the measured complex image with a real filter of the same size as the image.
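The iterative use of the forward model can be sketched as follows. This is a minimal Python sketch that applies the scatter model as a fixed-point iteration on the complex image; it is a simplification, not the authors' brightest-first sequential scheme, and the PSF layout is assumed.

```python
import numpy as np
from numpy.fft import fft2, ifft2

def compensate_scatter(measured, psf, n_iter=3):
    """Iteratively remove in-camera light scattering from a complex
    range image (amplitude + phase per pixel). `psf` is an empirical
    scatter kernel, the same shape as the image, with the direct
    (central) response removed. A simplified fixed-point use of the
    forward model, not the brightest-first sequential scheme.
    """
    psf_ft = fft2(np.fft.ifftshift(psf))
    estimate = measured.copy()
    for _ in range(n_iter):
        # forward model: scattered light = current estimate * PSF
        scatter = ifft2(fft2(estimate) * psf_ft)
        # subtract the predicted scatter from the measurement
        estimate = measured - scatter
    return estimate
```

Because the scatter kernel carries only a small fraction of the total energy, the iteration converges geometrically.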
Complex optical free-form surfaces are very common optical components for use in modern illumination and lighting systems. In this paper we describe the use of highly accurate fringe projection for the measurement of optical free-form surfaces with a resolution in the sub-µm range.
To achieve the required high accuracy, the method of a uniform measurement scale in fringe projection, proposed by the authors some years ago, is used [1]. The basic idea is the exclusive use of phase values for the 3D-data calculation. Because of that, the accuracy of such a measurement set-up is mainly restricted by the lens distortion of the projection system. To compensate for this, we introduce a new method for a 3D-correction of the distortion of the projection lens, taking into account spatially dependent distortion parameters. The distortion of the projection lens is determined by measuring reference planes, which are used to calculate a 3D-correction matrix. This matrix covers the whole measurement volume (lateral and vertical) and contains the determined distortion of the projection system. As a result, this correction improved the absolute accuracy by a factor of four.
Furthermore, the data quality is enhanced by a further factor of two using wavelet filtering for noise reduction. The realized measurement set-up has a measurement field of up to 180 mm in diameter. It will be shown that the measurement of optical free-form surfaces with medium-range accuracy is possible; we have reached a limit of 0.5 µm RMS error in a measuring field of 70 mm diameter.
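A 3D-correction matrix built from reference-plane measurements can be applied by interpolating between the measured planes. The following Python sketch assumes a hypothetical table layout, with a lateral correction (dx, dy) stored per camera pixel and per reference-plane height; it illustrates the lookup only, not the authors' implementation.

```python
import numpy as np

def lateral_correction(corr, plane_heights, i, j, z):
    """Interpolate the projector-distortion correction for camera
    pixel (i, j) at height z between the measured reference planes.

    corr          : (n_planes, H, W, 2) array of (dx, dy) corrections
                    measured on the reference planes (assumed layout).
    plane_heights : heights of the reference planes, ascending.
    """
    dx = np.interp(z, plane_heights, corr[:, i, j, 0])
    dy = np.interp(z, plane_heights, corr[:, i, j, 1])
    return dx, dy
```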
In this paper, a real-time shape measurement system using pixel-by-pixel calibration tables is developed. We have proposed a shape measurement method using pixel-by-pixel calibration tables produced with multiple reference planes. In this method, all the relationships between the phase of the projected grating and the spatial coordinates are obtained for each pixel. This method theoretically eliminates lens distortion and intensity errors of the projected grating from the measurement results. Tabulation makes short-time measurement possible. The linearity of each camera pixel is also corrected, using pixel-by-pixel linearity calibration tables, immediately after the images are grabbed.
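The core of such a scheme is a per-pixel table mapping grating phase to height. A minimal Python sketch, assuming the tables store the phase seen at each reference plane and that phase grows monotonically with height for every pixel (names and array layout are illustrative):

```python
import numpy as np

def z_from_phase(phase_table, z_planes, phase_map):
    """Convert a measured phase map to heights with per-pixel tables.

    phase_table : (n_planes, H, W) phases of the projected grating
                  recorded at each reference plane, monotonically
                  increasing in z for every pixel (assumed layout).
    z_planes    : (n_planes,) heights of the reference planes.
    phase_map   : (H, W) measured phase at each pixel.
    """
    H, W = phase_map.shape
    z = np.empty((H, W))
    for i in range(H):
        for j in range(W):
            # per-pixel interpolation: lens distortion and grating
            # intensity errors are absorbed into the table itself
            z[i, j] = np.interp(phase_map[i, j],
                                phase_table[:, i, j], z_planes)
    return z
```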
This paper presents a very simple and effective procedure to combine data from two cameras at different positions to produce clouds of points in regular meshes. The main idea starts by setting two independent coordinates for a node from
a regular mesh. The third coordinate is found by scanning the dependent coordinate across the measurement volume until
the phase values of the fringe patterns, acquired by the cameras, reach the same common value. This approach naturally
produces structured clouds of points independently of the number of cameras used. To measure large or complex
volumes, marks are distributed over the geometry. Two parts of the geometry with common marks are measured in different positions and stitched together. Many parts can be measured and stitched in this way until the geometry is completely measured. The final result is a regular mesh of the cloud of points.
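The scan over the dependent coordinate can be sketched as follows. This minimal Python sketch assumes two callables standing in for the calibrated camera models, each returning the unwrapped fringe phase observed at a 3-D point; the depth is found where the two phases agree.

```python
import numpy as np

def find_z(phase_cam1, phase_cam2, x, y, z_range, steps=2000):
    """Scan z across the measurement volume and return the depth at
    which the two cameras see the same fringe phase at mesh node
    (x, y). phase_cam1 / phase_cam2 map (x, y, z) to the unwrapped
    phase each camera observes (hypothetical interfaces).
    """
    zs = np.linspace(z_range[0], z_range[1], steps)
    diff = np.array([phase_cam1(x, y, z) - phase_cam2(x, y, z)
                     for z in zs])
    k = int(np.argmin(np.abs(diff)))   # closest-to-equal phase
    return zs[k]
```

In practice the coarse scan would be refined by interpolating the phase difference around the sign change.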
A method is proposed for surface defect analysis and evaluation. Good 3D point clouds can now be obtained through a variety of surface profiling methods such as stylus tracers, structured light, or interferometry. In order to inspect a surface for defects, a reference surface that represents the surface without any defects must first be identified. This reference surface can then be fit to the point cloud. The algorithm we present finds the least-squares solution of the overdetermined equation set to obtain the parameters of the reference surface's mathematical description. The distance between each point in the point cloud and the reference surface is then calculated using the derived reference surface equation. For analysis of the data, the user can preset a threshold distance value. If the calculated distance is greater than the threshold value, the corresponding point is marked as a defect point. The software then generates a color-coded map of the measured surface. Defect points that are connected together form a defect-clustering domain, and each defect-clustering domain is treated as one defect area. We then use a clustering-domain search algorithm to automatically find all the defect areas in the point cloud. The critical parameters that can be calculated for evaluating the defect status of a point cloud are: P-Depth, the peak depth of all defects; Defect Number, the number of surface defects; Defects/Area, the number of defects per unit area; and Defect Coverage Ratio, the ratio of the defect area to the region of interest.
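For the simplest reference surface, a plane, the fit-threshold-flag pipeline looks like the following Python sketch. A plane stands in here for the general reference surface of the text, and the clustering step is omitted; all names are illustrative.

```python
import numpy as np

def defect_map(points, threshold):
    """Fit a least-squares reference plane z = a*x + b*y + c to an
    (N, 3) point cloud and flag points whose distance to the plane
    exceeds `threshold`.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack([x, y, np.ones_like(x)])
    (a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)
    # point-to-plane distance for the plane z = a*x + b*y + c
    dist = np.abs(z - (a * x + b * y + c)) / np.sqrt(a * a + b * b + 1.0)
    defects = dist > threshold
    coverage = defects.mean()          # Defect Coverage Ratio analogue
    p_depth = dist.max()               # P-Depth: deepest deviation
    return defects, coverage, p_depth
```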
A new passive ranging technique named Robust Depth-from-Defocus (RDFD) is presented for autofocusing in digital
cameras. It is adapted to work in the presence of image shift and scale change caused by camera/hand/object motion.
RDFD is similar to spatial-domain Depth-from-Defocus (DFD) techniques in terms of computational efficiency, but it
does not require pixel correspondence between two images captured with different defocus levels. It requires
approximate correspondence between image regions in different image frames as in the case of Depth-from-Focus (DFF)
techniques. Theory and computational algorithm are presented for two different variations of RDFD. Experimental
results are presented to show that RDFD is robust against image shifts and useful in practical applications. RDFD also
provides insight into the close relation between DFF and DFD techniques.
Measuring objects with high surface reflectivity variations (i.e., high dynamic range) is challenging for any optical method.
This paper presents a high dynamic range scanning (HDRS) technique that can measure such objects. It takes
advantage of one merit of a phase-shifting algorithm: pixel-by-pixel phase retrieval. For each measurement, multiple shots
of fringe images are taken at different exposures, yielding a sequence of fringe images with different overall brightness: the brightest fringe images have good fringe quality in darker areas, although the brighter areas may be saturated;
while the darkest fringe images have good fringe quality in brighter areas although the fringes in the darker areas may be
invisible. The sequence of fringe images is arranged from brighter to darker, i.e., from higher exposure to lower exposure.
The final fringe images, used for phase retrieval, are produced pixel by pixel by choosing, for each pixel, the brightest unsaturated value among the shots. A phase-shifting algorithm is employed to compute the phase, which can be further
converted to coordinates. Our experiments demonstrate that the proposed technique can successfully measure objects with
high dynamic range of surface properties.
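The pixel-wise selection rule can be sketched as follows. A minimal Python sketch; the saturation level and array layout are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def fuse_hdr_fringes(stack, saturation=250):
    """Fuse fringe images taken at several exposures into one image.

    stack : (n_exposures, H, W) array ordered from brightest (highest
            exposure) to darkest. For each pixel, the brightest value
            that is still below the saturation level is kept.
    """
    fused = stack[-1].copy()               # darkest shot as fallback
    filled = np.zeros(fused.shape, dtype=bool)
    for img in stack:                      # brightest first
        take = (~filled) & (img < saturation)
        fused[take] = img[take]
        filled |= take
    return fused
```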
The difficulty of implementing the phase shifting method in shadow moiré lies in the fact that the phase shift due to the displacement of the light source, the imaging sensor, or the grating is non-uniform across the field of view. Typical phase shifting algorithms therefore fail to produce accurate results. In the past few decades, various approximation methods have been developed to overcome this difficulty. In this paper, we describe an elegant solution that provides an exact closed-form result. In our proposed system, the grating is translated in equal steps to introduce phase shifts. The phase value at each point is determined by the Carré algorithm, which only requires the phase shifts to be uniform at each point, rather than across the whole field of view. The 3-D shape of the object is then reconstructed from the phase map retrieved with the Carré algorithm. Simulation results demonstrate the effectiveness of the Carré algorithm for shadow moiré.
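The Carré formula itself is standard and can be written directly. The sketch below assumes the usual convention of four frames with phase steps -3a/2, -a/2, +a/2, +3a/2, where the (unknown) step a may differ from point to point but is the same for all four frames at a given point.

```python
import numpy as np

def carre_phase(i1, i2, i3, i4):
    """Carré phase from four intensity frames with a constant but
    unknown phase step per point (steps -3a/2, -a/2, +a/2, +3a/2).
    """
    num = np.sqrt(np.abs((3.0 * (i2 - i3) - (i1 - i4))
                         * ((i1 - i4) + (i2 - i3))))
    den = (i2 + i3) - (i1 + i4)
    # the sign of (i2 - i3) carries the sign of sin(phase)
    return np.arctan2(np.sign(i2 - i3) * num, den)
```

Because the step a cancels out of the ratio, the result is exact for any uniform local step, which is what makes the algorithm suitable for shadow moiré.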
A ring beam device consisting of a conical mirror and a laser diode (LD) forms a disk-like beam sheet. We have previously reported measurements of the inner diameter of pipes and holes and have developed a compact inner-profile measuring instrument. The ring beam device and a miniature CCD camera are incorporated in a glass tube. The validity of this instrument was shown by checking the inner profiles of reference pieces. In response to this trial, a strong request emerged that not only internal but also external surfaces should be measured. The pattern projection method conventionally used in 3D profile measurement may be useful for high-speed, high-precision measurement of various objects, but it is not always appropriate for an object with a steeply inclined surface profile, such as a bevel gear. In this paper, we propose a method for measuring the external profile in addition to the internal profile. In our arrangement, a pair of concave conical mirrors is used for the external profile measurement. When combined with our inner-profile measurement method, simultaneous measurement of the inner and outer profiles becomes possible. A measurement result on a bevel gear demonstrated the validity of our proposal. We aim to realize simultaneous measurement of the internal and external profiles.
In this paper, a novel modified Fourier transform method is proposed, which employs a fringe image and a flat image to eliminate the background and, at the same time, facilitate the retrieval of the absolute phase map. Both the fringe and flat patterns are projected onto the object by a digital video projector. By subtracting the flat image from the fringe image, the background is completely removed and spectrum overlapping in the frequency domain is prevented. The flat image is also employed for hole and shadow detection. Two cross-shaped markers are embedded in the flat and fringe images, respectively, for absolute phase retrieval. Experimental results show that, compared to the phase shifting method, the proposed method produces better shape measurement results when measuring fast-moving or changing objects. The proposed method has the potential to boost the speed of our real-time 3-D shape measurement system to 120 fps with better measurement accuracy.
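The background-subtraction step can be illustrated on a single image row. A minimal Python sketch of Fourier-transform phase retrieval, assuming the carrier frequency and bandwidth are known from the projector setup (f0, bw, and the row-wise treatment are illustrative simplifications):

```python
import numpy as np

def ft_phase(fringe, flat, f0, bw):
    """Fourier-transform phase retrieval on one image row.

    fringe, flat : 1-D arrays, the same row of the fringe image and
                   of the flat (uniform) image.
    f0, bw       : expected carrier frequency and half-bandwidth in
                   FFT bins (assumed known from the projector setup).
    Subtracting the flat image removes the background term, so the
    fundamental lobe can be isolated without overlapping the
    zero-frequency peak.
    """
    spec = np.fft.fft(fringe - flat)
    keep = np.zeros_like(spec)
    keep[f0 - bw:f0 + bw + 1] = spec[f0 - bw:f0 + bw + 1]
    analytic = np.fft.ifft(keep)       # complex fringe signal
    return np.angle(analytic)          # wrapped phase of the row
```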
The field of 3D optical metrology has seen significant growth in the commercial market in recent years. The methods of using structured light to obtain 3D range data are well documented in the literature and continue to be an area of development in universities. In some areas, such as semiconductor manufacturing, optical components, and other specialized fields of precision metrology, optical methods have gained acceptance both as a reference tool and, in many cases, as part of the manufacturing process. Tools such as flatness checkers, white light interference microscopes, and the new vibration-insensitive interferometer systems have offered new opportunities for applications that were once limited to isolated laboratory systems. There are still many challenges to address to make 3D optical metrology attractive to wider areas of manufacturing, such as general machining, metal forming, and molding operations. This paper briefly reviews some of the tools available today, where the opportunities may still lie, and what gaps remain to be bridged for wider use of the technology.
This paper describes preliminary development of a high-speed distance gage for manufacturing process control. The
objective of the system was to measure and record the distance from a tool/processing tip to the processed surface at a
frequency of 10 kHz with minimal sensitivity of the device to tilt or curvature of the processed surface. This speed
is not achievable by use of a standard camera system or by typical position sensitive detectors (PSDs) due to data
processing and optical limitations. The proposed solution comprises a linescan camera system and a laser light source
positioned diagonally and about the nominal area of interest. In this setup, the line segment, which defines the range of
locations of the laser spot, is imaged onto the linescan sensor. The location of the image of the spot is proportional to the
location of the spot on the target object. The height relative to a reference tool position is then calculated geometrically.
This setup enables flexible analysis of the spot location when a multi-layered, partly transparent surface is inspected, allows removal of stray-light reflections, and handles different types of surface finish. Real-time image analysis is enabled through the use of embedded technology.
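The geometric conversion from spot position to height can be sketched as follows. All parameters in this Python sketch (reference pixel, lateral scale, triangulation angle) are hypothetical placeholders for the calibrated geometry, not values from the paper.

```python
import math

def height_from_pixel(pixel, pixel_ref, mm_per_pixel, beam_angle_deg):
    """Convert the laser-spot location on the linescan sensor to a
    height relative to the reference tool position.

    pixel          : detected spot position on the linescan sensor.
    pixel_ref      : spot position when the surface is at the
                     reference height.
    mm_per_pixel   : lateral scale of the imaged line segment.
    beam_angle_deg : angle between the laser beam and the viewing
                     direction (hypothetical triangulation geometry).
    """
    shift_mm = (pixel - pixel_ref) * mm_per_pixel
    # the spot slides along the imaged line as the surface moves;
    # the height change follows from the triangulation angle
    return shift_mm / math.tan(math.radians(beam_angle_deg))
```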
Recently, X-ray computed tomography (XCT), already established for medical applications, has been utilized for production metrology. The ability to perform non-destructive testing on prototypes offers great potential for reducing the inspection time of first-sample release testing. While today's XCT devices focus on transilluminating the object and generating 3D voxel data, the possibility of focusing only on the edges of objects, and therefore on the cast shadow of the object on the detector, to measure form tolerances is presented here.
This paper focuses on edge detection algorithms as well as on new methods to calibrate a soft X-ray projection system so that form tolerances can be measured. In contrast to form measurement instruments based on visible light and a telecentric optical path, new algorithms to calibrate the soft X-ray measurement instrument are being developed because of the divergent ray distribution of the soft X-rays. The advantage of soft X-rays over visible light is that they are not influenced by residues of the manufacturing process, e.g. cooling lubricant, which allows a robust analysis of the outer object form. An algorithm for robust edge detection is introduced, and the correlation between the precision of the edge algorithm and the calibration strategies developed for the shaft measurement instrument is shown, which is crucial for reaching the targeted low measurement uncertainty.
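The paper's own robust edge algorithm is not reproduced here, but a common generic approach to subpixel edge localization on a shadow profile, parabolic refinement of the gradient peak, can be sketched in Python as follows.

```python
import numpy as np

def subpixel_edge(profile):
    """Locate an edge in a 1-D intensity profile across the cast
    shadow with subpixel precision: find the gradient-magnitude peak
    and refine it by fitting a parabola through the three samples
    around the maximum. (A generic technique, used here only as an
    illustration of subpixel edge detection.)
    """
    g = np.abs(np.gradient(profile.astype(float)))
    k = int(np.argmax(g[1:-1])) + 1          # ignore the borders
    y0, y1, y2 = g[k - 1], g[k], g[k + 1]
    denom = y0 - 2.0 * y1 + y2
    offset = 0.0 if denom == 0 else 0.5 * (y0 - y2) / denom
    return k + offset
```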
Range imagers provide useful information for part inspection, robot control, or human safety applications in industrial environments. However, some applications may require more information than range data from a single viewpoint. Therefore, multiple range images must be combined to create a three-dimensional representation of the
scene. Although simple in principle, this operation is not straightforward to implement in industrial systems, since each range image is affected by noise. In this paper, we present two specific applications where merging of range images must be performed. We use the same processing pipeline for both applications: conversion from range images to point clouds, elimination of degrees of freedom between different clouds, and validation of the merged results. Nevertheless, each step in this pipeline requires dedicated algorithms for our example applications. The first application is high-resolution inspection of large parts, where many range images are acquired sequentially and merged in a post-processing step, allowing creation of a virtual model of the observed part, typically larger than the instrument's field of view. The key requirement in this application is high accuracy in merging multiple point clouds. The second application is human safety in a human/robot environment: range images are used to ensure that no human is present in the robot's zone of operation, and can trigger the robot's emergency shutdown when needed. In this case, range image merging is required to avoid uncertainties due to occlusions. The key requirement here is real-time operation: the merging must not introduce significant latency into the data processing pipeline. For both application cases, the improvements brought by
merging multiple range images are clearly illustrated.
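The first pipeline step, converting a range image to a point cloud, can be sketched with a pinhole camera model. The intrinsic parameters in this Python sketch are assumptions standing in for the imager's actual calibration.

```python
import numpy as np

def range_to_points(depth, fx, fy, cx, cy):
    """Back-project a range image into a 3-D point cloud using a
    pinhole camera model (fx, fy: focal lengths in pixels; cx, cy:
    principal point). Pixels with depth 0 are treated as invalid
    and dropped.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth > 0
    z = depth[valid]
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy
    return np.column_stack([x, y, z])
```

The resulting clouds would then be registered (eliminating the relative degrees of freedom) before merging.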
We propose a method to measure the overlay by choosing an optimal measurement design with a modified scatterometer
system. This method is capable of measuring zero and non-zero diffraction orders at theta (zenith) and phi (azimuth)
angles of incidence by carefully modulating the optical system. Thus a large quantity of angular scatterometry data can
be measured in a short period of time with no mechanical or vibrational movement. We used a rigorous diffraction
theory to model the measurement sensitivity using an overlay with two layer gratings at a fixed wavelength in the range
of the theta (zenith, 0° to 90°) and phi (azimuth, 0° to 180°) incident angles. We compared the measurement sensitivities as functions of theta and phi. In addition, we compared the optical responses of zero-order and first-order diffractive
overlays. We propose a methodology to measure the overlay using overlay targets with two gratings, designed with an
intentional offset difference between the top and bottom gratings to maximize the measurement sensitivity and minimize
the response to the process noise.
In heavy industry, and especially in the shipbuilding process, 3D profile measurement of large-scale hull pieces is needed for fabrication and assembly. Currently, many kinds of templates made of wood or plastic still play an important role as standard rulers. We suggest an efficient method of 3D profile measurement to obtain the xyz-coordinates of curved plates. The measurement system comprises multiple line-structured laser sources and performs profile measurement by projecting the structured light onto the object surface. The results show that the measurement accuracy is within the bounds required in the shipbuilding process.
The design and construction of next-generation telescopes involves the development of new technologies capable of
fabricating and testing large-dimension mirrors with small f-numbers, i.e., fast surfaces. Secondary mirrors in
Cassegrain-type telescopes are convex and hyperbolic. When the dimensions are increased and its f-number is reduced,
the difficulty in testing the surface increases exponentially and traditional optical testing is no longer feasible. The
present study offers a technique developed to test a mould to be used in fabricating the secondary mirror of the Large
Millimeter Telescope (LMT). The mould is a hyperboloidal surface with a conic constant of -1.1474, a paraxial curvature
radius of 1764.94 mm and a diameter of 1600 mm. Since the telescope will work within wavelengths ranging from 1 to 3
mm, surface errors must be less than 15 μm in rms. The mould was evaluated by measuring the coordinates of 53,824
points on its surface using an advanced coordinates measurement machine (CMM) at the National Institute of
Astrophysics, Optics and Electronics (INAOE) in Mexico. From this we obtained the shape of the conic surface that best fits this distribution of points, using a genetic algorithm (GA) program developed for this purpose. Finally, the
results obtained are shown.
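The fitting problem can be stated compactly with the standard sag equation of a conic of revolution. The Python sketch below evaluates the sag and finds the best-fitting paraxial radius and conic constant by a brute-force least-squares search, a simple stand-in for the genetic-algorithm search used in the paper; the grids are illustrative.

```python
import numpy as np

def conic_sag(r, R, k):
    """Sag of a conic of revolution at radial distance r: paraxial
    radius R, conic constant k (k < -1 gives a hyperboloid, as for
    the LMT mould)."""
    c = 1.0 / R
    return c * r**2 / (1.0 + np.sqrt(1.0 - (1.0 + k) * c**2 * r**2))

def fit_conic(r, z, R_grid, k_grid):
    """Brute-force least-squares search for the best-fitting (R, k);
    a GA would explore the same cost surface more efficiently.
    """
    best = (None, None, np.inf)
    for R in R_grid:
        for k in k_grid:
            rms = np.sqrt(np.mean((conic_sag(r, R, k) - z) ** 2))
            if rms < best[2]:
                best = (R, k, rms)
    return best
```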