Dual-wavelength digital holographic shape measurement using speckle movements and phase gradients
28 June 2013
A new method to measure shape by analyzing the speckle movements in images generated by numerical propagation from dual-wavelength holograms is presented. The relationship between the speckle movements at different focal distances is formulated, and it is shown how this carries information about the surface position as well as the local slope of the object. It is experimentally verified that dual-wavelength holography and numerically generated speckle images can be used together with digital speckle correlation to retrieve the object shape. From a measurement on a cylindrical test object, the method is demonstrated to have a random error on the order of a few micrometers.


Introduction
In many industries, online control within the manufacturing process is needed to optimize quality and production efficiency. Conventional contact measurement methods are usually slow and invasive, which means that they cannot be used for soft materials and for complex shapes without influencing the controlled parts. An alternative is a noncontact measurement by optical methods like digital holography. With digital holography, high resolution and precise three-dimensional (3-D) images of the manufactured parts can be generated. This technique can also be used to capture data in a single exposure, which is important when doing measurements in a disturbed environment.1 Recently, a method has been suggested where knowledge of the object shape, e.g., a CAD model, is used to reduce the number of required images for shape measurement in digital holography.2,3 The main idea is to perform online process control of manufactured objects by measuring the shape and comparing it with the CAD model. The method presented in this paper is also intended for such use.

In digital holography, access to the complex wave field and the possibility to numerically reconstruct holograms in different planes4,5 introduce a new degree of flexibility to optical metrology. From a single-recorded hologram, images with amplitude and phase information for different focal distances can be calculated by numerical propagation of the optical field. By recording two or more holograms with different illumination wavelengths, a phase map can be determined for all these focal distances.5 These phase maps will correspond to a measurement with a synthetic6 or equivalent wavelength Λ, calculated as

Λ = λ1λ2/|λ1 − λ2|,  (1)

where λ1 and λ2 are the wavelengths of the illumination. This type of holographic contouring is based on the extraction of phase by direct comparison of two or more speckle patterns, and hence is sensitive to speckle decorrelation and speckle movements. Phase retrieval7 is another well-known phase extraction technique for shape measurement. In this method, the intensity distribution is measured at different focal planes, and the transport of intensity equation is utilized to reconstruct the phase distribution.8,9 In contrast to digital holography, the technique needs no reference wave during the recording process but instead requires two or more exposures.

In a technique known as digital speckle correlation, the analysis of speckle displacement by correlation techniques is used for measurements of, e.g., deformation, displacement, and strain. It has been shown that this technique provides an effective nondestructive optical measurement and characterization tool.10–13 The idea is to utilize the speckle pattern that appears when a coherent beam illuminates a rough surface. A change of the wavelength, a deformation of the object, or a change of the microstructural distribution of the surface will cause the speckle pattern to change.11,14 These changes may appear as speckle decorrelation, movements of the speckle pattern, and a change in the phase of the speckles. In general, all these changes appear simultaneously. Detailed information on how to calculate the correlation properties of dynamic speckles is given in Refs. 10, 11, and 14. A few years ago, Yamaguchi et al.15 used the peak position of the correlation value along the reconstruction distance to calculate the shape of the object. The idea is that speckles in a defocused plane tend to move when a change in wavelength is introduced. The correlation function is, however, broad in the depth direction, and the accuracy of that approach is limited. In this paper, a similar approach is taken, but instead of calculating the shape from the maximum of the correlation function in the depth direction, the shape gradient of the object is calculated from the speckle movements, caused by a change in wavelength, at two different focal planes.

In summary, our approach uses digital holography to measure the phase distribution of the light, after which postprocessing and numerical propagation can generate the intensity distribution in as many different focal planes as necessary. By using image correlation and speckle movement, our method is also robust to large phase gradients and large movements within the intensity patterns. A further advantage is that, using speckle movement, the shape can be measured even when the height variation of the object exceeds the dynamic range of the synthetic wavelength.

In Sec. 2, it is shown how the speckle movement and wavelength shift relates to the angle of the local surface normal. In Sec. 3, experimental results are presented, and the object shape of a smooth object is determined by integration of the shape gradients. The technique is demonstrated by a measurement on a cylindrical object with a trace milled off.


Theory
Holograms can be recorded either with the reference wave parallel with the object light or tilted at an angle. These arrangements are called in-line holography16 and off-axis holography,17,18 respectively. In-line holography is often used to image and localize particles in microscopy, while off-axis holography is used to simplify the signal processing and in situations where only a single exposure can be used. In digital holography, the off-axis geometry introduces a carrier that provides a simple way to filter out the information, and that is the technique used in this paper. Consider Fig. 1, where LD is a tunable laser diode, R the path of the reference light, FP the focus plane, and OP the position of zero phase difference between reference and object light. EnP and ExP define the entrance pupil and the exit pupil of the imaging system, respectively. Define Uo(x,y)=Ao(x,y)exp[iϕo(x,y)] as the object wave and Ur(x,y)=Ar(x,y)exp[iϕr(x,y)] as the reference wave in the detector plane. The recorded image can then be represented by

I(x,y) = |Uo(x,y)|² + |Ur(x,y)|² + Uo(x,y)Ur*(x,y) + Uo*(x,y)Ur(x,y),  (2)

where it is assumed that Ar(x,y)=Ar is a constant and Δϕ = ϕo(x,y) − ϕr(x,y) is the phase difference between the object and reference wave. The first two terms of Eq. (2) correspond to the zero-order term; they are slowly varying in time and space and are independent of the correlation between Uo and Ur. The last two terms are the interference terms, which are sensitive to the phase difference Δϕ and can easily be filtered out in the Fourier domain19 in the off-axis arrangement.
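The carrier-based filtering can be sketched as follows. This is a generic NumPy implementation, not the authors' code; the function name, the explicit carrier-peak location, and the square band-pass window are our illustrative choices:

```python
import numpy as np

def extract_object_term(hologram, carrier_peak, window):
    """Isolate one interference term of an off-axis hologram.

    hologram:     2-D real array, the recorded intensity of Eq. (2).
    carrier_peak: (row, col) index of the +1-order carrier peak in the
                  centered Fourier spectrum (assumed known or detected).
    window:       half-width (pixels) of the square band-pass around the peak.
    """
    H = np.fft.fftshift(np.fft.fft2(hologram))
    cy, cx = np.array(H.shape) // 2
    ry, rx = carrier_peak
    # Band-pass: keep only the region around the carrier peak.
    mask = np.zeros_like(H)
    mask[ry - window:ry + window, rx - window:rx + window] = 1.0
    # Shift the selected term back to the center, i.e., remove the carrier.
    filtered = np.roll(H * mask, (cy - ry, cx - rx), axis=(0, 1))
    return np.fft.ifft2(np.fft.ifftshift(filtered))  # complex field J(x, y)
```

In practice the peak position follows from the 10 deg reference tilt, and a smooth (e.g., Hanning-apodized) window would reduce ringing; the hard square window is kept here for clarity.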

Fig. 1

Sketch of the set-up defining the variables. LD is a tunable laser diode, R the path of the reference light, FP the focus plane, and OP the position of zero phase difference between reference and object light. EnP and ExP define the entrance pupil and the exit pupil of the imaging system, respectively.


Considering the third term J(x,y)=Uo(x,y)Ur*(x,y)=ArAoexp(iΔϕ) in Eq. (2), a modified version, Ŭo(x,y), of the object wave is retrieved as

Ŭo(x,y) = J(x,y)Ŭr(x,y),  (3)

where Ŭr(x,y) only contains the variation in phase over the detector caused by the tilt angle and the curvature of the field. In that way, a reference plane is defined in OP where the phase difference between the object and reference waves is zero. The complex optical field Ŭo can then be used for numerical refocusing.4 In this process, it is important that a constant magnification is kept.
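Numerical refocusing of the complex field Ŭo can be performed with the angular-spectrum method. The following is a generic sketch (not the authors' implementation); the constant magnification required above is assumed to be handled separately by the imaging model, and equal sampling pitch in both directions is assumed:

```python
import numpy as np

def refocus(field, wavelength, pitch, dz):
    """Propagate a complex field a signed distance dz (angular-spectrum method).

    field: 2-D complex array sampled with spacing `pitch`;
    wavelength, pitch, dz in the same length unit.
    """
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pitch)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    # Longitudinal wave-number component; the argument is clamped at zero
    # to avoid NaNs for evanescent spatial frequencies.
    arg = np.maximum(1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2, 0.0)
    kz = (2 * np.pi / wavelength) * np.sqrt(arg)
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * dz))
```

Calling this once per wavelength and per plane produces the sets of refocused fields used in the experiments (10 mm behind, on, and in front of the object).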

If two fields Ŭ1(x,y;λ1) and Ŭ2(x,y;λ2), recorded with different wavelengths, are retrieved, it is important to know their correlation properties, which is the purpose of this section. Consider Fig. 2(a) and 2(b). A diffuse plane surface is illuminated from a monochromatic point source Ps(xs) located at xs [Fig. 2(a)]. If position x defines a general scattering point on the surface, the plane wave component that illuminates the scattering point will propagate in direction ss = (x − xs)/Ls, where Ls = |x − xs| is the length between the source and the scattering point, and the directional vector ss points from the source to the scattering point. The random wavelet contributions from a surface patch Σ on the object surface produce a field detected in point Pp(xp) at position xp in front of the surface. It is assumed that Σ is much smaller than the illuminated surface, and that the point Pp is in a plane conjugate to a detection plane of the imaging system, called the focus plane, as shown in Fig. 2(b). We consider Io as a constant intensity on the surface, and sp = (xp − x)/Lp as a directional vector from the scattering point toward the detection point, where Lp = |xp − x| is the length between the scattering and the detection points. Thus, the total length passed by the wave is L = Ls + Lp, and the accumulated phase will be ϕ(k,x,xp,xs) = kL, where k = 2π/λ is the wave number. In the following, the vector m(x,xp,xs) = sp(x,xp) − ss(x,xs), known as the sensitivity vector of a surface point, will be of importance.

Fig. 2

Definition of the vectors (a) and quantities (b) included in the derivation of dynamic speckle properties in free-space geometry.


The response of a speckle field due to a general change is given in Ref. 14. Here, we will only consider the response due to a change in wavelength. The absolute phase difference will then be

Δϕ(xo) = ΔkL(xo),  (4)

where Δk is the change in wave number, xo is the center position of Σ, and L is the total difference in optical path between the object and the reference waves. In Fig. 1, this corresponds to twice the distance between OP and the object surface. The differential phase difference as the response to a change only in wavelength is given by


where xε is an integration variable over Σ and Δx is a spatial separation in detection space, and


is the speckle movement in the focus plane.

Equation (6) calls for some clarifications. First of all, the speckle movement vector A is the projection of the speckle movement in the conjugate plane of the detector (perpendicular to the optical axis). The vector mΣ(xo,xp,xs) appearing in Eq. (6) is the projection of the sensitivity vector m onto the local surface patch Σ, and gives a vector that is perpendicular to the surface normal vector n. Hence, the speckle movement is related to the gradient of the surface. The magnitude of mΣ gives the factor by which the speckle movement is scaled, and its direction gives the direction in which the speckles move. The scaling parameter cos θΔx̂ relates the orientation of the detector to the surface patch, where θΔx̂ is the angle between Δx and its projection ΔxΣ onto Σ.

If an experimental set-up is used such that m = sp − ss ≈ 2ẑ, where ẑ is a unit vector along the optical axis, and speckle movement in the image plane is considered, only surface slope variations in the horizontal plane are expected to contribute. Then mΣ = m sinθ x̂, and Eq. (6) is simplified to

A = (2MΔLΔk/k) tanθ x̂,  (7)

where θ is the angle from the surface normal to the sensitivity vector and M is the positive magnification. In arriving at Eq. (7), it is assumed that object deformations are absent between the two recordings. If the defocus ΔL changes sign, the direction of the speckle movement also changes sign.

Propagating the image to different planes thus gives different speckle movements in response to the wavelength shift. The difference in speckle movement ΔA between two propagated planes can be calculated from Eq. (7), with ΔL the distance between the two planes. If ΔA is multiplied by the scaling parameter k/(2MΔLΔk), it equals the local phase gradient at the object surface. By solving for θ, the surface normal and the slope of the object shape can be calculated from the speckle movements and the change in wavelength.
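The scaling step can be written compactly in code. The following sketch applies the stated scaling parameter k/(2MΔLΔk); identifying the scaled quantity with tanθ assumes the collimated, on-axis geometry (m ≈ 2ẑ) described above. The default wavelengths are those of the experiment, and the magnification M is an assumed placeholder:

```python
import numpy as np

def slope_angle_deg(delta_A, delta_L, lam1=647.0e-9, lam2=649.4e-9, M=1.0):
    """Local surface slope angle (degrees) from the difference in speckle
    movement, delta_A [m], between two refocused planes separated by
    delta_L [m], for an imaging system of positive magnification M."""
    k = 2 * np.pi / lam1                            # wave number
    dk = abs(2 * np.pi / lam2 - 2 * np.pi / lam1)   # change in wave number
    tan_theta = delta_A * k / (2 * M * delta_L * dk)  # stated scaling
    return np.degrees(np.arctan(tan_theta))
```

Because delta_A enters linearly, the micrometer-level random error in the speckle-movement estimate translates directly into the slope error, which favors large plane separations delta_L.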


Experiments and Results

Consider the experimental set-up14 seen in Fig. 1. A laser diode LD (SANYO) illuminates an object with local surface normal n along a direction ss. The temperature of the laser diode can be controlled in order to tune the wavelength. The object is a half cylinder with a diameter of 48 mm, which was cut to have a measurement object of height 15 mm, as shown in Fig. 3. The cylindrical object had a trace of depth 1 mm and a width of 10 mm milled off. The cylinder was manufactured with an accuracy of 10 μm. The object is imaged by an optical system along a direction sp onto a digital camera (Sony XCL 5005) with a resolution of 2456×2058 pixels, a pixel size of 3.45×3.45 μm², a dynamic range of 12 bits, and an output frame rate of 15 Hz. Part of the light from the laser diode is decoupled and redirected toward the detector at an angle of 10 deg with respect to the direction of the object light. From the detector plane, the reference light is seen to originate from a point on the exit pupil plane outside the opening. The off-axis method is therefore utilized, and the complex amplitude field is filtered out. Using this method, the numerical aperture is restricted to NA < λ/(8a), where a is the sampling pitch on the detector. The zero optical path difference plane OP between the object light and reference light is indicated in Fig. 1. FP is the plane in which the optical system is focused during the measurement. Only the section of 11.6×11.6 mm² indicated by a black square in Fig. 3 is measured, which includes both the trace and a cylindrical part. In these measurements, approximately collimated illumination is used, and the sensitivity vector is m = sp − ss ≈ 2ẑ, where ẑ is a unit vector along the optical axis of the imaging. Thus, constant phase is expected over planes perpendicular to the optical axis. The part of the section that includes the cylindrical shape has a surface normal that varies in the horizontal plane.
Therefore, the speckle movement will vary only in the direction of the horizontal plane and change the sign on either side of the central direction.

Fig. 3

White light image of the object.


Two holograms, one with a laser-diode temperature of 15°C corresponding to a wavelength of 647.0 nm and the other with a temperature of 19°C corresponding to 649.4 nm, are acquired with FP coinciding with the surface of the trace. Controlling the wavelength by temperature is straightforward but slightly inaccurate, and the wavelength may vary a few tenths of a nanometer.

The complex amplitudes from the recorded holograms were acquired and used to calculate two more sets of complex amplitudes using numerical re-focusing. One field is set 10 mm behind the object and the other 10 mm in front of the object. We hence have the set {U1−, U2−, U10, U20, U1+, U2+} of six complex amplitudes acquired at two wavelengths (denoted 1/2) and originating from three planes (denoted −/0/+), which will be used to acquire information about the shape of the object. The wrapped phase of Ŭ1*Ŭ2 in the three planes is shown in the upper row of Fig. 4. Moving from left to right, these phase maps relate to a plane behind the object, on the object, and in front of the object, respectively. It is worth mentioning that the position of the focus plane plays a crucial role in regions with a steep slope, while the quality of the fringes is reasonably unaffected by the position of the focus plane in regions where the sensitivity vector is roughly parallel with the surface normal. This is clear from the sudden degradation of the fringe contrast when going from the trace out to the cylindrical part in the upper right part of the images.

Fig. 4

Results obtained from the modified mutual coherence function in different planes. The upper row shows the wrapped phase map, and the lower row shows calculated speckle displacements. The columns correspond to results obtained 10 mm behind the object, on the object, and 10 mm in front of the object, respectively.


The lower row of Fig. 4 shows the corresponding speckle movement obtained by cross-correlating the intensity patterns of the two wavelengths using speckle correlation (digital speckle photography) in the three planes, respectively. At the part of the cylinder where the surface normal is parallel with the optical axis and on the trace part, the speckle movement is always close to zero. However, the speckle movement magnitude shows an increase toward the right when defocus is introduced. This result is in accordance with the theoretical relation given in Eq. (7). At the upper plane part and at the left part of the cylinder, the surface normal is (almost) parallel with the optical axis, which gives a θ close to zero. Hence, A(x,xp,xs) will be close to zero, as is seen in Fig. 4. We also note that the sign of the speckle motion changes as the focus plane moves from one side of the object to the other. In front of the object, the speckles move toward the left, while they move to the right behind the object. As shown by Yamaguchi et al.,15 this is an effect that may be utilized to locate the position of the object surface. The technique thus obtained is very similar to the technique of determining shape from the projection of a random pattern.20
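The core operation of digital speckle photography, locating the cross-correlation peak between corresponding subimages, can be sketched as follows. This is a generic integer-pixel version; the subpixel peak interpolation needed to reach the 0.1–0.2 pixel accuracy reported below is omitted for brevity:

```python
import numpy as np

def speckle_shift(sub1, sub2):
    """Integer-pixel displacement of sub2 relative to sub1, from the peak of
    their FFT-based circular cross-correlation."""
    c = np.fft.ifft2(np.fft.fft2(sub2) * np.conj(np.fft.fft2(sub1)))
    sy, sx = np.unravel_index(np.argmax(np.abs(c)), c.shape)
    ny, nx = c.shape
    # Map the circular correlation lags to signed shifts.
    if sy > ny // 2:
        sy -= ny
    if sx > nx // 2:
        sx -= nx
    return sy, sx
```

Applied to 64×64 subimages of the numerically generated intensity patterns at the two wavelengths, this yields the speckle-movement maps of Fig. 4.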

To study the relation between the speckle movements and the focus distance ΔL, 20 more sets of complex fields were calculated by numerical re-focusing, in the range from 10 mm behind the object to 10 mm in front of the object. Speckle movements along a vertical line at x-position 4.7 mm (line a in Fig. 3) were calculated. The movements at the flat part and at the cylindrical part of the object were plotted versus ΔL (see Fig. 5). Note that the flat part with θ close to zero gives small movements. For the cylinder, the slope is approximately θ=10 deg, and the theoretical speckle movement was calculated from Eq. (7) and plotted as a line. As can be seen in Fig. 5, the experimental results match the theoretical line. The standard deviation of the experimentally measured speckle movements at the cylindrical part was less than 0.7 μm, corresponding to 0.2 pixels. We may compare this result with the theoretical expression for the accuracy,21


where σ is the average speckle size, N is the subimage size in one dimension, and γ the speckle correlation value. In our case, σ ≈ 4 pixels, N was 64, and the average correlation was γ=0.77, which gives a theoretical accuracy of 0.14 pixels in locating the speckle movement.

Fig. 5

Relationship between the speckle movements and the focus distance along a vertical line (line a in Fig. 3), at the cylindrical part of the object (dots), and along the flat part (crosses). The line is the theoretical speckle movement calculated from Eq. (7) for a surface slope of θ=10deg.


If ΔA is multiplied by the scaling parameter k/(2MΔLΔk), it equals the local phase gradient at the object surface. These phase gradients may be integrated, for example using cumulative trapezoidal numerical integration to get the phase of the measurement. As seen in Fig. 6(a), this results in an unwrapped phase map that equals the interferometric phase difference Δϕa up to a constant. Figure 6(b) shows the unwrapped phase obtained from the middle image of Fig. 4. For the unwrapping, the technique by Volkov and Zhu was used.22 As seen, the results are almost identical. These phase maps may then be transformed to shape by simple scaling.
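The integration and scaling steps can be sketched for a 1-D profile as follows. The function name is ours; the trapezoidal summation is equivalent to cumulative trapezoidal integration, and the scaling h = Λϕ/(4π) assumes the on-axis geometry of the paper, where the optical path difference is twice the surface height:

```python
import numpy as np

def gradient_to_height(grad_phi, dx, synthetic_wl):
    """Integrate a 1-D phase-gradient profile (rad/m, uniform sampling dx)
    to a phase profile, then scale to height using the synthetic wavelength.
    The result is defined up to an additive constant (set to zero at x=0)."""
    avg = 0.5 * (grad_phi[1:] + grad_phi[:-1])          # trapezoid midpoints
    phi = np.concatenate(([0.0], np.cumsum(avg) * dx))  # phase, up to a constant
    return synthetic_wl * phi / (4 * np.pi)             # height profile
```

Since integration determines the profile only up to a constant, the result matches the interferometric phase difference up to the same constant, exactly as observed in Fig. 6.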

Fig. 6

Unwrapped phase maps. (a) The phase over the surface calculated from defocused speckle movements, while (b) shows the results from unwrapping the middle image of Fig. 4.


Figure 7 shows the measured shape along the two rows that are defined by b and c in Fig. 3 (dashed at the flat part and solid at the cylindrical part). The theoretical shape of the cylinder is plotted as well (dotted). It is seen that the measured shape corresponds to the theoretical one except for a linear scaling factor. This scaling can be due to an imprecise reading of the wavelength or to illumination that was not precisely collimated. If the wavelength shift is adjusted by as little as 0.5 nm, the measurement matches the theoretical shape perfectly. For the proposed method, it is therefore necessary to have either lasers with well-defined wavelengths or the possibility to measure the wavelengths accurately. To estimate the accuracy of the measurement, we plot the difference between the theoretical shape and the one measured with the adjusted wavelength (see Fig. 8). The standard deviation of this difference is 4.2 μm. In Fig. 9, a 3-D display of the object shape measured from the speckle movements is shown.

Fig. 7

Two measured profiles along the horizontal lines b and c in Fig. 3 are plotted (dashed at the flat part and solid at the cylindrical part). For comparison, the theoretical shape of the cylinder is also plotted (dotted). As the positioning of the cylinder in the set-up was not precisely known, it is difficult to compare the flat part with a theoretical shape.


Fig. 8

Difference between theoretical and measured shapes with the adjusted wavelength shift.


Fig. 9

Three-dimensional (3-D) plot of the measured shape from speckle movements.


The use of measured surface slopes to estimate shape is not unique to the method presented here. In fact, it may be compared with other methods that give the surface slope, such as photometric stereo and deflectometric methods.23–25 These are well known for precise measurement of small local shapes due to their derivative nature. By using a digital holographic recording and numerical propagation, the proposed technique only requires two images and no mechanical movements. As both interferometric data and the speckle movements are obtained from the same recording, these can be combined to achieve even better results.


Conclusions
Holographic contouring is a very precise measurement method, but it is based on the extraction of phase by direct comparison of speckle patterns, and hence is sensitive to speckle decorrelation and speckle movements. In this paper, these speckle movements are utilized to calculate the shape. The theoretical relation between the object surface normal and the speckle movements has been presented and results in a linear relation between surface slope and defocus. It has also been experimentally shown how measurements of speckle movements can be used to calculate the phase distribution and the object shape. By using holographic recordings, the re-focusing can be done numerically and without any mechanical movements, which ideally means that only one recording needs to be acquired. From a measurement on a cylindrical test object, it was shown that the measurement accuracy is on the order of a few micrometers.

Acknowledgments
This research is financially supported by VINNOVA (the Swedish Governmental Agency for Innovation) and was a part of the HOLOPRO project. The authors would also like to thank Dr. Henrik Lycksam, Dr. Erik Olsson, and Dr. Per Gren for their valuable discussions.

References
1. M. Sjödahl, "Robust information extraction in interferometry: overview and applications," in 2010 9th Euro-American Workshop on Information Optics (WIO), IEEE, Helsinki (2010). http://dx.doi.org/10.1109/WIO.2010.5582504

2. P. Bergström et al., "Shape verification using dual-wavelength holographic interferometry," Opt. Eng. 50(10), 101503 (2011). http://dx.doi.org/10.1117/1.3572182

3. P. Bergström, S. Rosendahl, and M. Sjödahl, "Shape verification aimed for manufacturing process control," Opt. Lasers Eng. 49(3), 403–409 (2011). http://dx.doi.org/10.1016/j.optlaseng.2010.11.009

4. E. L. Johansson, L. Benckert, and M. Sjödahl, "Improving the quality of phase maps in phase object digital holographic interferometry by finding the right reconstruction distance," Appl. Opt. 47(1), 1–8 (2008). http://dx.doi.org/10.1364/AO.47.000001

5. M. Sjödahl et al., "Depth-resolved measurement of phase gradients in a transient phase object field using pulsed digital holography," Appl. Opt. 48(34), H31–H39 (2009). http://dx.doi.org/10.1364/AO.48.000H31

6. C. Wagner, W. Osten, and S. Seebacher, "Direct shape measurement by digital wavefront reconstruction and multiwavelength contouring," Opt. Eng. 39(1), 79–85 (2000). http://dx.doi.org/10.1117/1.602338

7. J. R. Fienup, "Phase retrieval algorithms: a comparison," Appl. Opt. 21(15), 2758–2769 (1982). http://dx.doi.org/10.1364/AO.21.002758

8. M. Agour et al., "Shape measurement by means of phase retrieval using a spatial light modulator," in Int. Conf. Advanced Phase Measurement Methods in Optics and Imaging, Monte Verita, Ascona (2010). http://dx.doi.org/10.1063/1.3426125

9. P. F. Almoro et al., "3D shape measurement using deterministic phase retrieval and a partially developed speckle field," Proc. SPIE 8384, 83840Q (2012). http://dx.doi.org/10.1117/12.919223

10. M. Sjödahl, "Calculation of speckle displacement, decorrelation and object-point location in imaging systems," Appl. Opt. 34(34), 7998–8010 (1995). http://dx.doi.org/10.1364/AO.34.007998

11. I. Yamaguchi, "Speckle displacement and decorrelation in the diffraction and image fields for small object deformation," Opt. Acta 28(10), 1359–1376 (1981). http://dx.doi.org/10.1080/713820454

12. I. Yamaguchi, "Shape and deformation measurements by digital speckle correlation and digital holography," in Proc. XIth Int. Congress and Exposition, Society for Experimental Mechanics Inc., Orlando, Florida (2008).

13. D. Khodadad, E. J. Hällstig, and M. Sjödahl, "Shape reconstruction using dual wavelength digital holography and speckle movements," Proc. SPIE 8788, 87880I (2013). http://dx.doi.org/10.1117/12.2020471

14. M. Sjödahl, E. Hällstig, and D. Khodadad, "Multi-spectral speckles: theory and applications," Proc. SPIE 8413, 841306 (2012). http://dx.doi.org/10.1117/12.981665

15. I. Yamaguchi, T. Ida, and M. Yokota, "Measurement of surface shape and position by phase-shifting digital holography," Strain 44(5), 349–356 (2008). http://dx.doi.org/10.1111/str.2008.44.issue-5

16. D. Gabor, "A new microscopic principle," Nature 161, 777–778 (1948). http://dx.doi.org/10.1038/161777a0

17. E. N. Leith and J. Upatnieks, "Reconstructed wavefronts and communication theory," J. Opt. Soc. Am. 52(10), 1123–1128 (1962). http://dx.doi.org/10.1364/JOSA.52.001123

18. E. N. Leith and J. Upatnieks, "Wavefront reconstruction with continuous-tone objects," J. Opt. Soc. Am. 53(12), 1377–1381 (1963). http://dx.doi.org/10.1364/JOSA.53.001377

19. T.-C. Poon, Digital Holography and Three-Dimensional Display: Principles and Applications, Springer, New York (2006).

20. M. A. Sutton, J. J. Orteu, and H. Schreier, Image Correlation for Shape, Motion and Deformation Measurements, Springer, New York (2009).

21. M. Sjödahl, "Accuracy in electronic speckle photography," Appl. Opt. 36(13), 2875–2885 (1997). http://dx.doi.org/10.1364/AO.36.002875

22. V. V. Volkov and Y. Zhu, "Deterministic phase unwrapping in the presence of noise," Opt. Lett. 28(22), 2156–2158 (2003). http://dx.doi.org/10.1364/OL.28.002156

23. C. Wagner and G. Häusler, "Information theoretical optimization for optical range sensors," Appl. Opt. 42(27), 5418–5426 (2003). http://dx.doi.org/10.1364/AO.42.005418

24. J. Seewig et al., "Reconstruction of shape using gradient measuring optical systems," in The 6th Int. Workshop on Advanced Optical Metrology, W. Osten and M. Kujawinska, Eds., Springer, Berlin, Heidelberg (2009).

25. S. Ettl et al., "Fast and robust 3D shape reconstruction from gradient data," in DGaO Proc. (2007).



Davood Khodadad received his BS and MS degrees in bioelectrical engineering from Sahand University of Technology, Tabriz, and Tehran University of Medical Sciences, Tehran, in 2008 and 2011, respectively. He is currently active as a PhD student in the Division of Experimental Mechanics at Luleå University of Technology, Sweden. His research interests include noncontact optical metrology, imaging and image formation, and signal and image processing. His research is currently focused on development of pulsed multispectral digital holography for three-dimensional imaging.


Emil Hällstig received his PhD in physics engineering at Uppsala University, Sweden, in 2004. The work was done at the Swedish Defence Research Agency (FOI), and included active optics and especially nonmechanical laser beam steering for a novel free-space optical link. He has, since 2000, also worked at the company Optronic as optical specialist and project manager, and for the last 10 years has been responsible for the research activities at Optronic. He also has held a position as guest researcher at Luleå University of Technology since 2004, and the research focuses on optical metrology and digital holography.


Mikael Sjödahl received his MSc in mechanical engineering and his PhD in experimental mechanics from the Luleå University of Technology, Sweden, in 1989 and 1995, respectively. He currently holds the chair of experimental mechanics at the Luleå University of Technology and a professorship at University West, Sweden. He has authored or coauthored over 100 papers in international journals and contributed to two books. His interests include fundamental speckle behavior, coherent optical metrology, nondestructive testing, and multidimensional signal processing.

© The Authors. Published by SPIE under a Creative Commons Attribution 3.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Davood Khodadad, Emil J. Hällstig, and Mikael Sjödahl, "Dual-wavelength digital holographic shape measurement using speckle movements and phase gradients," Optical Engineering 52(10), 101912 (28 June 2013). https://doi.org/10.1117/1.OE.52.10.101912
