Measurement accuracy and dependence on external influences of the iPhone X TrueDepth sensor
Andreas Breitbarth, Timothy Schardt, Cosima Kind, Julia Brinkmann, Paul-Gerald Dittrich, and Gunther Notni
Proceedings Volume 11144, Photonics and Education in Measurement Science 2019; 1114407 (17 September 2019). https://doi.org/10.1117/12.2530544
Event: Joint TC1 - TC2 International Symposium on Photonics and Education in Measurement Science 2019, Jena, Germany
Abstract
Depth sensors for three-dimensional object acquisition are widely used and available in many different sizes and weight classes. The measuring method used and the achievable measuring accuracy depend on the task to be performed. The integration of depth sensors into mobile devices such as tablets and smartphones is largely new. The TrueDepth system of the iPhone X demonstrates which measurement accuracies can be achieved with such systems and which fields of application open up beyond consumer entertainment. The investigations show that the TrueDepth system of the iPhone X can be used for measuring tasks with accuracies in the millimeter range.

1. INTRODUCTION

In the past years and decades, industrial 3D measuring systems have reached a state of development at which a multitude of different measuring tasks can be realized precisely and at high speed. However, systems with these quality features are also expensive and, due to their often complex handling, difficult for non-scientists and non-metrologists to use in everyday settings. This changed in November 2010, when Microsoft launched the Kinect system with the “Xbox 360” game console.1 Although the distance-measuring sensors of the Microsoft Kinect were originally designed exclusively for consumer entertainment, they were also used outside the living room. Reasons for this are the attractive price of approximately 100 EUR, the numerous functions regarding depth measurement and, above all, the operability without expert knowledge. The second generation of the Microsoft Kinect came onto the market in 2013 and, unlike its predecessor, works according to the time-of-flight method, one of the well-known methods of 3D metrology.

November 2016 saw the release of the next major innovation: Lenovo’s PHAB2 Pro, the first smartphone to be equipped with Google’s Tango augmented reality platform. Augmented reality includes the ability to place and interact with virtual objects in the real world, or to create three-dimensional models of rooms, for example. Just like the Kinect v2, the PHAB2 Pro works according to the time-of-flight method, which created new implementation possibilities for optical 3D metrology.2 Because the Tango system was integrated into only two devices and the associated apps saw extremely low download numbers, Google discontinued the platform as early as autumn 2017.3

In the fall of 2017, Apple then released the iPhone X, a smartphone that, with the help of the so-called TrueDepth sensor system based on the structured light principle, can likewise provide 3D data without requiring any measurement experience from the operator.

2. IPHONE X TRUEDEPTH SENSOR SYSTEM

The functionality of the camera system (cf. Figure 1) is comparable to that of the first-generation Kinect camera system from Microsoft. A dot projector projects more than 30,000 dots onto a surface in front of the device. The infrared camera then receives the reflection of these light points and can create a model of the surface from the resulting pattern. To ensure accurate detection even in poor lighting conditions, the system is supported by a flood illuminator, which additionally illuminates the area in front of the camera.
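The underlying triangulation geometry is not detailed here; for a rectified projector-camera pair with baseline b and focal length f (assumed symbols, not taken from the source), the distance z follows from the disparity d of each projected dot against its reference position as z = f · b / d, so that near objects produce large dot shifts and far objects small ones.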

Figure 1. Components of the so-called ‘notch’ of the iPhone X.4 The TrueDepth system first uses the flood illuminator to emit infrared light and establish the presence of a face, then the dot projector to project 30,000 dots onto the object surface, and finally the infrared camera to read out the resulting pattern.5

The main function of the TrueDepth camera is facial recognition and thus the ability to unlock the smartphone, verify the user’s identity or make cashless payments. In addition, the system is used by various photo apps to place certain image areas in the foreground. The aim here is to investigate to what extent the camera system can also be used in the field of measurement technology. For this purpose, it must first be clarified how the image information for the depth values can be extracted as raw data.

Through the Apple Developer Program,6 Xcode,7 various frameworks, programming guides and sample codes are available. One of these frameworks is AVFoundation,8 a framework for time-based audiovisual media on iOS, macOS, watchOS and tvOS.

AVFoundation also provides sample code to display the depth data in the live image and to record images together with this data. This sample code is called “AVCamPhotoFilter: Using AVFoundation to Capture photos with image processing”.9 Via Xcode, the source code can be built and run on the smartphone. In this way, a camera app is obtained in which a so-called depth value can be displayed; this value is a measure of how far away a certain object is from the camera. A slider at the bottom of the display sets the extent to which only depth values are coded as gray values or the color values are displayed as well. This app works in conjunction with the DualCam on the back of the iPhone X, but not with the TrueDepth camera on the front. If the front camera is to be used, the source code must be adapted or extended accordingly.
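Captured photos then carry the depth information alongside the color image. The following sketch of how the raw depth map can be read out is our addition based on the public AVFoundation API, not part of the sample code quoted above:

```swift
import AVFoundation

// Sketch (assumption, based on the public AVFoundation API): once depth
// delivery is enabled, a captured photo carries an AVDepthData object whose
// pixel buffer contains the raw depth map.
func depthMap(from photo: AVCapturePhoto) -> CVPixelBuffer? {
    // Convert to 32-bit float depth so the raw values can be read directly.
    guard let depthData = photo.depthData?.converting(
        toDepthDataType: kCVPixelFormatType_DepthFloat32) else { return nil }
    return depthData.depthDataMap
}
```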

The first step is therefore to integrate the TrueDepth camera into the mentioned app. This is done with the device type .builtInTrueDepthCamera in the class CameraViewController. Since the basic functions for displaying a depth map are already available, all that remains is to make the app access the TrueDepth camera instead of the standard front camera. For the experiments performed within the scope of this work, the default device declared as let defaultVideoDevice: AVCaptureDevice? was set so that the app starts with the TrueDepth camera and the option to display depth values. With the app extended in this way, the front camera can also display depth values and take corresponding photos.
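A minimal sketch of this device selection is shown below; it is a reconstruction under the assumptions of the AVCamPhotoFilter sample, not the original app code:

```swift
import AVFoundation

// Minimal sketch: select the front TrueDepth camera instead of the default
// front camera and enable depth data delivery on the photo output.
func configureTrueDepthSession() -> AVCaptureSession? {
    let session = AVCaptureSession()
    session.beginConfiguration()

    // Request the TrueDepth camera on the front of the device.
    guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                               for: .video,
                                               position: .front),
          let input = try? AVCaptureDeviceInput(device: device),
          session.canAddInput(input) else {
        return nil // no TrueDepth camera available
    }
    session.addInput(input)

    let photoOutput = AVCapturePhotoOutput()
    guard session.canAddOutput(photoOutput) else { return nil }
    session.addOutput(photoOutput)

    // Depth delivery must be enabled before capturing photos with depth.
    photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported

    session.commitConfiguration()
    return session
}
```

With this device type in place, the sample’s depth slider works for the front camera exactly as it does for the DualCam on the back.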

3. EXPERIMENTAL SETUP

The aim of the investigations is to determine the dependency between the measuring distance from sensor to object and the raw distance data of the iPhone X in the form of gray values. Based on the specifications of the system, this is realized for measuring distances between 140 and 1,000 millimeters. Influences of the object surface, such as roughness, reflectivity and color, are investigated by using different measurement objects. For the present sensor system, a cardboard box with a matte, beige-colored surface, which is not completely homogeneous in color due to the production process, can be classified as benign with regard to measurability. In contrast, a sheet of paper with a completely white and smooth surface is used. A checkerboard pattern consisting exclusively of white and black squares with an edge length of 20 mm each is used to determine the influence of large texture variations, as it is used, for example, in the calibration of multi-camera systems.10 This checkerboard can also be classified as slightly glossy.

In addition to the various object distances parallel to the sensor plane, the reliability of the distance measurement values is also examined as a function of the viewing angle. For this purpose, a smooth, filled wall with a wax coating serves as a sufficiently large measuring object with suitable flatness. The angle is varied from a parallel arrangement (0°) in 10° steps up to a very flat viewing angle of 80°.

The evaluation takes place for each recording in five areas: four areas in the respective corners of the displayed object or object section and one area in the center of the image or object. The area size corresponds to 20 percent of the image dimensions, or 20 percent of the corresponding object dimensions if the measurement object does not fill the entire image. As an example, this is shown in Figure 2 for a distance image of the cardboard box target, which does not completely fill the field of view. The four corner areas are marked black, green, orange and yellow, the center area gray. In addition, results are determined for the entire object area (large white rectangle).
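A sketch of this evaluation scheme could look as follows; this is hypothetical helper code for illustration (the actual evaluation was carried out in ImageJ, cf. section 4), assuming an 8-bit gray-value image stored row-major:

```swift
// One rectangular area of interest (AOI) in pixel coordinates.
struct AOI { let x0: Int, y0: Int, w: Int, h: Int }

// Mean gray value over one AOI of an 8-bit image stored row by row.
func meanGray(_ pixels: [UInt8], width: Int, height: Int, in a: AOI) -> Double {
    var sum = 0
    for y in a.y0..<(a.y0 + a.h) {
        for x in a.x0..<(a.x0 + a.w) {
            sum += Int(pixels[y * width + x])
        }
    }
    return Double(sum) / Double(a.w * a.h)
}

// The five evaluation areas, each spanning 20 % of width and height,
// i.e. each covering one twenty-fifth of the image.
func evaluationAreas(width: Int, height: Int) -> [String: AOI] {
    let w = width / 5, h = height / 5
    return [
        "top left":     AOI(x0: 0,               y0: 0,              w: w, h: h),
        "top right":    AOI(x0: width - w,       y0: 0,              w: w, h: h),
        "bottom left":  AOI(x0: 0,               y0: height - h,     w: w, h: h),
        "bottom right": AOI(x0: width - w,       y0: height - h,     w: w, h: h),
        "center":       AOI(x0: (width - w) / 2, y0: (height - h) / 2, w: w, h: h),
    ]
}
```

The mean over such an area corresponds to one data point in Figures 3 and 5.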

Figure 2. Areas of interest (AOI), shown as an example for a distance image of the cardboard box target: top left (black), top right (green), bottom left (orange), bottom right (yellow), center (gray). In addition, results are determined for the entire object area (large white rectangle).

4. RESULTS

For the first investigations, possible interferences such as reflections from a glossy surface, strong variations in the surface texture and penetration of light into the object to be measured are to be avoided. For this reason, the cardboard box is used first, aligned parallel to the TrueDepth sensor of the iPhone X.

All gray values visualized and described below reflect the mean value of the corresponding evaluation area. The evaluation itself was carried out with the open-source image processing program ImageJ.11

Figure 3 shows, on the left-hand side, the evaluation of the measurements for the surface of the cardboard box at distances of 140 to 1,000 mm. It can be seen that at a distance of 140 mm the gray value is only 4. This means that the minimum working distance of the sensor has not yet been reached; usable results are only achieved at distances of 150 mm and larger. If the measuring distance falls below this value, no depth values are measured and the camera does not perform as intended. For a working distance of 150 mm, the distance values determined by the iPhone X correspond to an average gray value of 221. The further away the object is from the TrueDepth sensor, the lower the gray values and the darker the image appears. For the maximum examined working distance of 1 meter, the average gray value over the entire image is 39.477.

Figure 3. Gray values as raw data for different measurement distances between the TrueDepth sensor of the iPhone X and the cardboard box (left) and the white paper (right), respectively. The figures display the mean values for the five evaluation areas as well as for the entire image or the entire object area (cf. section 3).

If one considers the variance between the mean values of the five evaluation areas, each of which covers exactly one twenty-fifth of the image, relatively high values of σ² > 1.5 result up to a working distance of 300 mm. Distances greater than 300 mm result in gray value variances of 0.556 > σ² > 0.002. Explanations for this abrupt difference are, on the one hand, the measurement being inherently more reliable at larger distances and, on the other hand, the fact that the cardboard box target completely covers the field of view only up to 280 mm. This means that from a working distance of 300 mm the corner areas of the sensor are only partially included in the evaluation, if at all.
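The σ² values quoted here can be reproduced from the five area means; a small helper, assuming the population (not the sample) variance is meant:

```swift
// Population variance of the five AOI mean gray values (assumption: the
// σ² values in the text refer to the population variance).
func variance(_ means: [Double]) -> Double {
    let mu = means.reduce(0, +) / Double(means.count)
    return means.reduce(0) { $0 + ($1 - mu) * ($1 - mu) } / Double(means.count)
}
```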

Since a minimum working distance of 150 mm was determined for the TrueDepth sensor during the measurement of the cardboard box, the objects are positioned exclusively at working distances of 150 mm and greater in the further course of the investigations. In principle, it could be assumed that a white sheet of paper yields very similar results to the cardboard box. This is not the case, especially at the distance of 150 mm. At this distance it can be seen (cf. Figure 4, left) that the camera shows a black shadow from the bottom to the center of the distance image. This is reflected in the results in Figure 3 on the right for the center evaluation area and the overall image. A high variance of the mean values from the different evaluation areas of σ² > 5 results up to a working distance of 260 mm. Only for larger distances are the individual results and variances comparable with those of the cardboard box.

Figure 4. Measurement problems in the lower and central image parts for distances near the minimum working distance: white paper at 150 mm (left) and checkerboard pattern at 160 mm (right).

The distance investigations for the checkerboard pattern show a similar phenomenon as for the white sheet of paper. For distances between the checkerboard pattern and the sensor of 150 to 170 mm, no measurement distance is determined for the black squares, especially in the lower and central image areas; these squares appear black in the distance image (see Figure 4, right). This problem is comparable with the results of time-of-flight sensors from the 2000s, where different distances were determined depending on the texture.12 There, two different result levels were obtained for a plane with a checkerboard texture: one for the white areas and another for the black ones. From a measuring distance of 180 mm, the surface texture has only a minor influence on the results of the TrueDepth sensor. However, as shown in Figure 5, the variance over the mean gray values of the five evaluation areas falls below the σ² < 5 limit even later than with the white sheet of paper. This is the case from a measurement distance of 400 mm; from 500 mm the σ² value is below 1.

Figure 5. Representation equivalent to Figure 3, here for the checkerboard pattern. The individual gray values again correspond to the mean values of the five evaluation areas as well as to the entire image or the displayed measurement object (cf. section 3).

Starting from the similar but not identical results for the three measurement objects (cardboard box, white sheet of paper and checkerboard pattern), the environmental conditions for the measurement of the cardboard box are varied. Four ambient lighting scenarios are examined:

  • Measuring room illuminated with sunlight

  • Measuring room without sunlight and without room lighting

  • Measuring room exclusively with room lighting from behind

  • Measuring room exclusively with room lighting from the front

The results are presented in Figure 6. The presence of sunlight has the greatest influence on the distance measurement, as the wavelengths used by the TrueDepth system for the measurement are proportionately present in sunlight. The room lighting, whether from the front or from behind the measurement object, has hardly any influence on the depth data. The maximum difference determined between the distance measurements is 5.484 gray values. According to the empirically determined function between gray values and depth values for the cardboard box (cf. Figure 3: y = 42,288 · x^(−1.009)), this corresponds to a maximum measurement error of 24.39 mm. In relation to the nominal distance of 480 mm, this results in a maximum error of 5%.
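Expressed in code, the conversion between gray values and distances via this empirical fit looks as follows; the constants are taken from the fit in Figure 3, and the example values are illustrative:

```swift
import Foundation

// Empirical fit from Figure 3 (cardboard box): gray value y = 42,288 · x^(-1.009),
// with the distance x in millimeters.
func grayValue(atDistance x: Double) -> Double {
    42_288.0 * pow(x, -1.009)
}

// Inverting the fit yields the distance belonging to a measured gray value.
func distance(forGrayValue y: Double) -> Double {
    pow(42_288.0 / y, 1.0 / 1.009)
}

// Illustration: translate a gray-value difference of 5.484 around the
// nominal distance of 480 mm into a distance error in millimeters.
let nominal = 480.0
let yNominal = grayValue(atDistance: nominal)
let errorMM = abs(distance(forGrayValue: yNominal + 5.484) - nominal)
```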

Figure 6. Distance measurements of the cardboard box for different ambient lighting conditions. The target distances were chosen in a range for which reliable measurement results had been achieved with the cardboard box in the previous investigations, e.g. 440 to 480 mm.

All investigations discussed so far were based on a parallel arrangement between the measured object and the TrueDepth sensor of the iPhone X. In the following, the angle dependence of the depth measurements is investigated. The smooth, wax-coated filled wall described in section 3 serves as a sufficiently large measurement object even for viewing angles of up to 80°. The iPhone X is positioned horizontally on a tripod at a base distance of 150 mm and rotated around its own axis; for target/actual comparisons it should be noted that the TrueDepth sensor is located 6.5 cm to the left of the rotation axis. In general, the left side of the device, and thus also of the image (pixel column 0), is closest to the measurement object, while the right side of the device or image (pixel column 3,087) is furthest away. For angles of 60° and greater, the wall no longer covers the entire sensor field of view, so the results were determined exclusively for the object area. As shown in Figure 7, the expected linear progression of the measured values between the left and right image areas is largely maintained. There are only a few outliers, with a maximum error of 5 gray values. No dependency between rotation angle and linearity deviations can be determined.

Figure 7. Depth values (results as gray values) over the image width of the TrueDepth sensor for different viewing angles (0°, 10°, …, 80°) of the flat wall at a base distance of 150 mm. Pixel column 0 is the area closest to the object; pixel column 3,087 is furthest from the wall.

5. CONCLUSIONS

According to the investigations carried out and the results obtained, the TrueDepth sensor of the iPhone X is suitable for distance measurements. The errors are in the millimeter range and amount to a maximum of 5% of the target distance; depending on the surface condition of the object under consideration, the measurement error can be significantly lower. In principle, higher measuring accuracy and lower variance are achieved for larger measuring distances. For benign surfaces (matte, slightly textured) a stable measurement is already possible from 300 mm; for slightly glossy surfaces and/or strong textures this minimum distance increases to up to 500 mm. The viewing angle of the TrueDepth sensor in relation to the object surface has no negative influence on the measurement results; here, too, deviations of at most 5% of the target distances occur. For higher precision, it is generally recommended to shield the measurement setup from daylight, as the active and passive scene illumination interact due to partially identical wavelength ranges.

At the texture-dependent minimum working distance of 150 to 170 mm, the TrueDepth sensor system of the iPhone X enables the recording of object details with a minimum extension of approximately 0.1 mm. In combination with the depth measurement accuracy described above, the sensor is therefore only suitable for macroscopic measurement tasks. A recommended setup is the combination of the TrueDepth sensor for a rough scan of the scene with a precision system for small measuring fields or volumes, where the result of the coarse scan is used to position the precision system. Alternatively, the TrueDepth sensor can be used exclusively to detect the existence of objects and/or their rough localization.

REFERENCES

[1] mirror2image, “How Kinect depth sensor works – stereo triangulation?” (2010). https://mirror2image.wordpress.com/2010/11/30/how-kinect-works-stereo-triangulation/
[2] Breitbarth, A., Nguyen, M., Dittrich, P.-G., and Notni, G., “Messtechnische Evaluierung eines Google-Tango-Systems nach VDI 2634 im Vergleich zur Kinect2,” Photogrammetrie, Laserscanning, Optische 3D-Messtechnik – Beiträge der Oldenburger 3D-Tage 2018, 17, 104–113, Wichmann (Feb. 2018).
[3] Janssen, J.-K., “Googles Augmented Reality: Tango ist tot, es lebe ARCore,” (2017). https://www.heise.de/newsticker/meldung/Googles-Augmented-Reality-Tango-ist-tot-es-lebe-ARCore-3817226.html
[4]
[5] Oestreich, N., “Bloomberg: iPhone-Rückseite zukünftig mit 3D-Sensor,” iphone-ticker.de (2019). https://www.iphone-ticker.de/bloomberg-iphone-rueckseite-zukuenftig-mit-3d-sensor-119116/
[6] Apple Inc., “Apple Developer,” (2018). https://developer.apple.com
[7] Apple Inc., “Xcode – Apple Developer,” (2018). https://developer.apple.com/xcode/
[8] Apple Inc., “AVFoundation – Apple Developer,” (2018). https://developer.apple.com/av-foundation/
[9] Apple Inc., “AVCamFilter: Applying Filters to a Capture Stream,” Apple Developer Documentation (2018). https://developer.apple.com/library/content/samplecode/AVCamPhotoFilter/Introduction/Intro.html
[10] Zhang, Z., “Flexible Camera Calibration By Viewing a Plane From Unknown Orientations,” in Proceedings of the Seventh IEEE International Conference on Computer Vision, 666–673 (1999).
[11] Rasband, W., “ImageJ,” (2019). https://imagej.nih.gov/ij/
[12] Munkelt, C., Trummer, M., Kuehmstedt, P., Denzler, J., and Notni, G., “View Planning for 3D Reconstruction using Time-of-Flight Camera Data as a-priori Information,” 1–6, Springer Berlin Heidelberg (2009).
© (2019) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Andreas Breitbarth, Timothy Schardt, Cosima Kind, Julia Brinkmann, Paul-Gerald Dittrich, and Gunther Notni "Measurement accuracy and dependence on external influences of the iPhone X TrueDepth sensor", Proc. SPIE 11144, Photonics and Education in Measurement Science 2019, 1114407 (17 September 2019); https://doi.org/10.1117/12.2530544
KEYWORDS: Sensors, Cameras, Distance measurement, Light sources and illumination, Imaging systems, 3D metrology, Infrared cameras