## Abstract

This paper presents a novel method for absolute three-dimensional (3D) shape measurement that does not require conventional temporal phase unwrapping. Our proposed method uses a known object (i.e., a ping-pong ball) to provide cues for absolute phase unwrapping. During the measurement, the ping-pong ball is positioned close to the nearest point of the scene to the camera. We first segment the ping-pong ball and spatially unwrap its phase, and then determine the integer multiple of 2*π* to be added such that the recovered shape matches its actual geometry. The nearest point of the ball provides *z _{min}* to generate the minimum phase Φ* _{min}* that is then used to unwrap the phase of the entire scene pixel by pixel. Experiments demonstrated that only three phase-shifted fringe patterns are required to measure absolute shapes of objects moving along the depth *z* direction.

© 2017 Optical Society of America

## 1. Introduction

High-speed three-dimensional (3D) shape measurement is increasingly used due to the ever-growing computational capabilities of personal computers and even mobile devices. The digital fringe projection (DFP) technique is one of the most popular methods for high-resolution and high-speed 3D shape measurement because of its flexible fringe generation [1], and phase-shifting algorithms are extensively used due to their accuracy, speed, and robustness to noise. However, a phase-shifting algorithm typically only provides phase values ranging from −*π* to +*π* [2]. To perform 3D shape measurement, the 2*π* discontinuities have to be removed by employing a phase unwrapping algorithm.

Numerous phase unwrapping algorithms have been developed over the years. In general, conventional phase unwrapping methods can be classified into two categories: spatial phase unwrapping and temporal phase unwrapping. The former unwraps the phase of a point by referring to phase values of other points through optimization, under the assumption that the surface geometry is smooth. The book edited by Ghiglia and Pritt [3] summarized a number of spatial phase unwrapping methods; Su and Chen [4] reviewed a number of robust quality-guided spatial phase unwrapping algorithms; and Zhao et al. [5] compared different strategies for generating a quality map to guide the phase unwrapping path. Despite the robustness improvements of recently developed spatial phase unwrapping algorithms, they are all fundamentally limited to measuring *smooth* surfaces: the object has to be smooth along at least one path such that the surface geometry does not introduce more than *π* of phase change between two successive points. In general, spatial phase unwrapping only provides a relative phase map that depends on the starting point of the unwrapping process. As a result, only relative 3D geometry can be recovered.

Temporal phase unwrapping methods, in contrast, can retrieve the absolute phase by acquiring additional images at different times. A typical temporal phase unwrapping algorithm unwraps the phase of a point without using its neighboring phase values. Two- and multi-wavelength phase-shifting algorithms [6–8] have been developed for laser interferometry, and hybrid binary coding plus phase-shifting methods have been developed for DFP systems [9, 10]. They all work well and are extensively employed. However, the requirement of additional images slows down the overall measurement speed.

The nonconventional absolute phase retrieval methods include adding more camera(s) or projector(s) to capture or project from more than two perspectives such that the geometric constraints of the multi-view system can be used to determine absolute phase for each point through optimization [11–17]; or performing hybrid stereo-vision for coarse measurement and using the wrapped phase constraint for refinement or high-resolution measurement [18]. These methods can work well if the computational framework is properly developed. However, due to the requirement of more than two perspectives, the overall measurement area is reduced: the point has to be seen from all perspectives. Furthermore, the complexity and cost of such a hardware system increase due to the use of additional hardware components.

In short, spatial phase unwrapping methods do not require additional image acquisition or hardware, yet they cannot recover absolute phase; temporal phase unwrapping methods sacrifice measurement speed for absolute phase recovery; and multi-view geometry-based approaches can recover absolute phase without acquiring additional images, yet they increase system cost and complexity. To address the limitations associated with all existing absolute phase recovery methods, An et al. [19] developed a geometric-constraint based phase unwrapping method that only requires a single camera and a single projector without the need to capture additional images. This approach sets a distance *z _{min}* from the nearest point of the object to the camera to generate a minimum phase Φ* _{min}* using the geometric constraints of the DFP system. Φ* _{min}* is then used to unwrap the phase pixel by pixel; details of this phase unwrapping method will be presented in Sec. 2.2. However, this method is constrained to a limited measurement depth range. To maximize the sensing depth range, it is desirable to precisely know the closest depth *z _{min}* from the object to the camera, which makes the method difficult to employ in applications where the object moves along the depth *z* direction during measurement.

This paper presents a novel method for absolute phase measurement that drastically enhances the method developed by An et al. [19] by relaxing the constraint of determining *z _{min}* before measurement. Our proposed method uses a known object (i.e., a ping-pong ball with a 20 mm radius) to provide cues for dynamic *z _{min}* determination. In particular, during the measurement, the ping-pong ball is captured along with the objects to be measured and is positioned close to the nearest point of an object to the camera. We segment the ping-pong ball and spatially unwrap its phase. Then we employ an optimization algorithm to determine the integer multiple of 2*π*'s to be added such that the recovered geometry matches its actual shape. The nearest point of the ball provides *z _{min}* to generate a minimum phase map Φ* _{min}* that can be further used to unwrap the phase pixel by pixel for 3D absolute shape reconstruction. If the ping-pong ball is moving with the object, *z _{min}* can be determined dynamically for moving object measurement. Experiments demonstrated that only three phase-shifted fringe patterns are required to measure the absolute shape of a dynamic object.

## 2. Principles

#### 2.1. Three-step phase-shifting algorithm

Phase-shifting algorithms are extensively used in optical metrology because of their accuracy, speed, and robustness to noise. Over the years, numerous phase-shifting algorithms have been developed, with some being more robust than others [2]. For high-speed applications, a three-step phase-shifting algorithm is typically used since it requires the minimum number of patterns to recover the phase pixel by pixel. For a three-step phase-shifting algorithm with equal phase shifts, the three fringe images can be described as

$$I_1(x, y) = I_t(x, y) + I''(x, y) \cos(\phi - 2\pi/3), \quad (1)$$

$$I_2(x, y) = I_t(x, y) + I''(x, y) \cos(\phi), \quad (2)$$

$$I_3(x, y) = I_t(x, y) + I''(x, y) \cos(\phi + 2\pi/3), \quad (3)$$

where *I _{t}* (*x*, *y*) is the average intensity, *I*″(*x*, *y*) is the intensity modulation, and *ϕ*(*x*, *y*) is the phase to be solved for. Solving Eqs. (1)–(3) simultaneously leads to

$$I_t(x, y) = \frac{I_1 + I_2 + I_3}{3}, \quad (4)$$

$$\phi(x, y) = \tan^{-1}\left[\frac{\sqrt{3}\,(I_1 - I_3)}{2 I_2 - I_1 - I_3}\right]. \quad (5)$$

*I _{t}* (*x*, *y*) is also the texture, or a photograph of the object without fringe stripes. The phase obtained from Eq. (5) ranges from −*π* to *π* with 2*π* discontinuities, and this phase is called the wrapped phase. The process of removing the 2*π* discontinuities and obtaining a continuous phase map is called phase unwrapping: it determines the locations of the 2*π* discontinuities and finds the integer number *k*(*x*, *y*) of 2*π*'s to be added to the wrapped phase *ϕ*(*x*, *y*). Mathematically, the relationship between the wrapped phase and the unwrapped phase Φ(*x*, *y*) can be described as

$$\Phi(x, y) = \phi(x, y) + 2\pi \times k(x, y). \quad (6)$$

Here *k*(*x*, *y*) is an integer number and is often called *fringe order*. If the fringe order *k*(*x*, *y*) can be uniquely determined based on a pre-defined value, then the unwrapped phase Φ(*x*, *y*) is regarded as an *absolute* phase. As discussed in Sec. 1, fringe orders *k*(*x*, *y*) determined by spatial phase unwrapping methods are typically relative to one point on the phase map, and thus spatial methods cannot give an absolute phase. In contrast, temporal phase unwrapping methods determine absolute fringe orders *k*(*x*, *y*) by referring to additional acquired information (e.g., additional images from the same or different perspectives), and that can lead to an absolute phase.
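As a concrete illustration, the wrapped-phase computation of Eqs. (4)–(5) amounts to a few lines of NumPy. The sketch below is minimal and self-checking; the synthetic phase ramp, background level of 128, and modulation of 100 are arbitrary values chosen for the check, not values from this paper.

```python
import numpy as np

def three_step_phase(i1, i2, i3):
    """Average intensity (texture) and wrapped phase from three fringe
    images with equal phase shifts of -2*pi/3, 0, +2*pi/3."""
    texture = (i1 + i2 + i3) / 3.0              # Eq. (4): average intensity
    phi = np.arctan2(np.sqrt(3.0) * (i1 - i3),  # Eq. (5): wrapped phase in [-pi, pi]
                     2.0 * i2 - i1 - i3)
    return texture, phi

# Synthetic check: build three shifted fringe images from a known phase ramp.
x = np.linspace(0.0, 4.0 * np.pi, 512)
i1, i2, i3 = (128.0 + 100.0 * np.cos(x + s)
              for s in (-2.0 * np.pi / 3.0, 0.0, 2.0 * np.pi / 3.0))
texture, phi = three_step_phase(i1, i2, i3)
# phi now equals the ramp x wrapped into [-pi, pi], with 2*pi discontinuities
```

Note that `arctan2` resolves the quadrant automatically, which is why the recovered phase spans the full (−*π*, *π*] range rather than (−*π*/2, *π*/2).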

#### 2.2. Geometric constraint-based phase unwrapping algorithm

An et al. [19] developed an absolute fringe order determination method using the inherent geometric constraints of a standard DFP system (i.e., a single camera and a single projector). For simplicity, let us assume both the camera and the projector are calibrated in the same coordinate system based on a linear pinhole model. The projections from 3D world coordinates to the 2D sensor planes can be described as

$$s^c \left[u^c, v^c, 1\right]^t = \mathbf{P}^c \left[x^w, y^w, z^w, 1\right]^t, \quad (7)$$

$$s^p \left[u^p, v^p, 1\right]^t = \mathbf{P}^p \left[x^w, y^w, z^w, 1\right]^t, \quad (8)$$

where **P** denotes a 3 × 4 projection matrix from world coordinates (*x ^{w}*, *y ^{w}*, *z ^{w}*) to 2D image coordinates (*u*, *v*), superscript ^{p} represents the projector, superscript ^{c} represents the camera, *s* denotes a scale factor, and ^{t} denotes the transpose operation of a matrix. The projection matrices **P**^{c} and **P**^{p} can be estimated by a standard structured light system calibration method [20]. Once **P**^{c} and **P**^{p} are determined, Eqs. (7)–(8) become 6 equations with 7 unknowns (*s ^{c}*, *s ^{p}*, *x ^{w}*, *y ^{w}*, *z ^{w}*, *u ^{p}*, *v ^{p}*) for a camera pixel (*u ^{c}*, *v ^{c}*). Thus only one additional constraint equation is needed to solve for all unknowns uniquely. For 3D shape measurement, the absolute phase Φ(*x*, *y*) can provide that necessary constraint to calculate the (*x ^{w}*, *y ^{w}*, *z ^{w}*) coordinates.
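To make the constraint counting concrete, the sketch below triangulates one world point from a camera pixel plus a single projector coordinate supplied by the absolute phase. The mapping from phase to projector column (*u ^{p}* = Φ·*T*/2*π* for vertical fringes of period *T*) and the pinhole matrices used in the check are illustrative assumptions, not this system's calibration.

```python
import numpy as np

def reconstruct_point(Pc, Pp, uc, vc, up):
    """Solve for (x, y, z) from a camera pixel (uc, vc) and the matching
    projector column up (e.g., up = Phi * T / (2*pi) for vertical fringes).
    Eliminating the scale factors s^c, s^p in Eqs. (7)-(8) leaves three
    linear equations in the three world coordinates."""
    rows = np.array([
        Pc[0] - uc * Pc[2],   # camera u-equation
        Pc[1] - vc * Pc[2],   # camera v-equation
        Pp[0] - up * Pp[2],   # projector column equation (the phase constraint)
    ])
    return np.linalg.solve(rows[:, :3], -rows[:, 3])
```

A quick round trip with synthetic pinhole matrices (project a known point, then reconstruct it from the pixel and projector column) recovers the point to machine precision.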

Alternatively, if we define a virtual plane at *z* = *z ^{w}*, each camera pixel (*u ^{c}*, *v ^{c}*) has a unique corresponding projector pixel (*u ^{p}*, *v ^{p}*) obtained by solving Eqs. (7)–(8). The corresponding phase can then be determined by referring to (*u ^{p}*, *v ^{p}*) on the projector sensor plane. Therefore, for a given plane with constant *z*, an entire phase map of the camera image can be created for that virtual plane. If *z* = *z _{min}*, we call the virtually created phase map the minimum phase map, or Φ* _{min}*. Apparently, Φ* _{min}* is a function of the distance *z _{min}*, the fringe period *T*, and the projection matrices **P**^{c} and **P**^{p}, i.e., Φ* _{min}* = *f*(*z _{min}*; *T*, **P**^{c}, **P**^{p}).
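The construction of Φ* _{min}* can be sketched per camera pixel: intersect the pixel's ray with the virtual plane *z* = *z _{min}*, project the intersection into the projector, and convert the projector column to phase. The 2*π* *u ^{p}*/*T* phase convention (vertical fringes of period *T* projector pixels) is an assumption for illustration; the actual phase-to-column mapping depends on the projected patterns.

```python
import numpy as np

def min_phase_map(Pc, Pp, z_min, T, width, height):
    """Generate the minimum phase map Phi_min for the virtual plane z = z_min.
    Pc, Pp are the 3x4 camera/projector matrices of Eqs. (7)-(8)."""
    phi_min = np.zeros((height, width))
    for v in range(height):
        for u in range(width):
            # Camera rays: (p1 - u*p3).X = 0 and (p2 - v*p3).X = 0; with
            # z fixed at z_min this is a 2x2 linear system in (x, y).
            r1 = Pc[0] - u * Pc[2]
            r2 = Pc[1] - v * Pc[2]
            A = np.array([[r1[0], r1[1]], [r2[0], r2[1]]])
            b = -np.array([r1[2] * z_min + r1[3], r2[2] * z_min + r2[3]])
            x, y = np.linalg.solve(A, b)
            # Project the plane point into the projector; its column u^p
            # maps to an unwrapped (continuous) phase value.
            hp = Pp @ np.array([x, y, z_min, 1.0])
            phi_min[v, u] = 2.0 * np.pi * (hp[0] / hp[2]) / T
    return phi_min
```

Because the map is evaluated on the projector side for an ideal plane, it is noise free and contains no 2*π* discontinuities, which is exactly what makes it usable as an unwrapping reference.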

Figure 1 illustrates the basic idea of using the minimum phase for phase unwrapping. Assume the red dashed window shows the mapped projector region that the camera captures if an ideal plane is positioned at *z* = *z _{min}*. Since Φ* _{min}* is defined in the projector space, it is continuous without requiring any phase unwrapping, as shown in Fig. 1(b). For a plane at a depth *z* > *z _{min}*, the wrapped phase *ϕ* captured by the camera is mapped to the region on the projector marked as the solid blue window. Figure 1(c) shows the cross sections of these phase maps. Clearly, if *ϕ* < Φ* _{min}*, 2*π* should be added to unwrap the phase *ϕ*. An et al. [19] showed that the fringe order *k*(*x*, *y*) can be determined by

$$k(x, y) = \mathrm{ceil}\left[\frac{\Phi_{min}(x, y) - \phi(x, y)}{2\pi}\right]. \quad (10)$$

Here, *ceil*[] is an operator that gives the nearest upper integer value.
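In code, this pixel-wise unwrapping step is a single vectorized expression. The sketch below verifies it on a synthetic absolute phase that stays within 2*π* above Φ* _{min}*, which is the working condition of the method; the particular ramp and random offsets are arbitrary test values.

```python
import numpy as np

def unwrap_with_min_phase(phi, phi_min):
    """Pixel-wise unwrapping against the minimum phase map, Eq. (10):
    k = ceil[(Phi_min - phi) / (2*pi)], then Phi = phi + 2*pi*k."""
    k = np.ceil((phi_min - phi) / (2.0 * np.pi))
    return phi + 2.0 * np.pi * k

# Synthetic check: an absolute phase within [Phi_min, Phi_min + 2*pi) is
# recovered exactly from its wrapped version.
rng = np.random.default_rng(1)
phi_min = np.linspace(0.0, 30.0, 1000)            # artificial minimum phase map
truth = phi_min + 2.0 * np.pi * rng.random(1000)  # absolute phase to recover
wrapped = np.angle(np.exp(1j * truth))            # wrapped into [-pi, pi]
recovered = unwrap_with_min_phase(wrapped, phi_min)
```

No neighboring pixels are consulted, which is why this unwrapping remains valid on discontinuous or isolated surfaces where spatial unwrapping fails.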

#### 2.3. Proposed absolute phase recovery method

The geometric constraint-based phase unwrapping method discussed in Sec. 2.2 has two major advantages [21]: 1) it does not require any additional image acquisition, and thus it is more suitable for high-speed applications; and 2) it is robust to noise because it determines fringe order by referring to an artificially generated ideal absolute phase map Φ* _{min}* that is noise free.

However, this phase unwrapping method has the major limitation of a confined measurement depth range. The maximum depth range is approximately

$$\Delta z \approx \frac{\Delta T_s}{\tan\theta},$$

where *θ* is the angle between the projection direction and the capture direction, and ∆*T _{s}* is the spatial span of one fringe period on the object space. This constraint limits the furthest measurable distance away from the plane at *z* = *z _{min}* for which the approach is applicable. To maximize the depth sensing range, *z _{min}* should be the closest point on the object surface, which is usually not precisely known before measurement. One practice that we use is to estimate *z _{min}* by a rough measurement. This practice works if the relative position between the object and the DFP system does not change. Unfortunately, for dynamically moving objects, applying this geometric constraint-based phase unwrapping method becomes challenging.

_{min}To address this problem, we proposed a novel approach to dynamically determine proper *z _{min}* value by simultaneously capturing a known-size object (e.g., a ping-pong ball for our case) that is moving along with the object.

Figure 2 explains the pipeline of our proposed absolute phase unwrapping method. Three phase-shifted fringe images are captured by the camera; they include the object(s) to be measured along with a known-size object (e.g., a ping-pong ball). The wrapped phase *ϕ*(*x*, *y*) can be obtained by applying Eq. (5), and the texture image *I _{t}* (*x*, *y*) can also be extracted by Eq. (4). We then employ an image segmentation algorithm to detect the ping-pong ball in the texture image and separate it from the rest of the scene. Since the texture image and the wrapped phase map come from the same phase-shifted fringe patterns, the corresponding wrapped phase *ϕ _{b}* (*x*, *y*) for the ping-pong ball can also be extracted. Because the ping-pong ball surface is smooth, we apply a spatial phase unwrapping algorithm [22] to unwrap its phase. Here we can only obtain a relative unwrapped phase ${\mathrm{\Phi}}_{b}^{r}(x,y)$ for the ping-pong ball. This spatially unwrapped phase could have an offset of *k*_{0} × 2*π* (*k*_{0} is an integer for the entire phase map) from the absolute phase ${\mathrm{\Phi}}_{b}(x,y)$, i.e.,

$${\mathrm{\Phi}}_{b}(x, y) = {\mathrm{\Phi}}_{b}^{r}(x, y) + 2\pi \times k_0.$$

To extract the constant number *k*_{0} for the ping-pong ball, we developed an optimization approach. From the projected fringe patterns, we know that *k*_{0} is bounded by 0 ≤ *k*_{0} ≤ *N*, where *N* is the total number of fringe stripes used. We then search this entire domain to determine the actual value of *k*_{0}. For each *k* ∈ [0, *N*], we reconstruct a 3D shape for the ping-pong ball, fit a sphere to the reconstructed point cloud data, and determine the radius of the fitted sphere. The *k* value that gives the radius closest to the actual ping-pong ball radius is the *k*_{0} that we are looking for, i.e., *k*_{0} = *k*.
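This search can be sketched as below. The algebraic sphere fit is standard least squares; `reconstruct` is a placeholder for the system's phase-to-coordinate conversion (faked in the check by scaling a synthetic ball, purely to exercise the search), and the 20 mm radius matches the ball used in this paper.

```python
import numpy as np

def fit_sphere_radius(pts):
    """Least-squares sphere fit: every point p on the sphere satisfies
    2*c.p + (r^2 - |c|^2) = |p|^2, which is linear in the center c and
    the scalar d = r^2 - |c|^2."""
    A = np.hstack([2.0 * pts, np.ones((len(pts), 1))])
    b = np.sum(pts**2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, d = sol[:3], sol[3]
    return np.sqrt(d + center @ center)

def find_fringe_order(reconstruct, n_stripes, ball_radius=20.0):
    """Try every candidate offset k in [0, N]; keep the k whose fitted
    sphere radius is closest to the known ball radius."""
    radii = [fit_sphere_radius(reconstruct(k)) for k in range(n_stripes + 1)]
    return int(np.argmin(np.abs(np.asarray(radii) - ball_radius)))
```

Since *N* is small (the number of fringe stripes), the exhaustive search is cheap; the per-candidate cost is dominated by the reconstruction, not the fit.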

Once *k*_{0} is determined, we can recover the 3D geometry of the sphere and determine the minimum *z* = *z _{min}* for Φ* _{min}* generation. Typically, if the ping-pong ball is slightly closer to the camera than the object, the minimum *z* value on the sphere can be used as *z _{min}* directly. The computed Φ* _{min}* map is then used to unwrap the object phase pixel by pixel using Eq. (10). Finally, the unwrapped phase is used to recover the entire 3D surface geometry.

It should be noted that, in this research, we use a ping-pong ball as an example for the proposed pipeline because its surface geometry is simple and uniform, and because it is very inexpensive and easy to obtain. In practice, one can use any known object in place of the ping-pong ball, and the entire pipeline can still be adapted to determine *k*_{0}. If the object surface is complex, one may measure the object in advance and determine *k*_{0} by solving the following minimization problem:

$$k_0 = \operatorname*{arg\,min}_{k} \; d\left(\mathbf{S}^{o}, \mathbf{S}^{k}\right),$$

where **S**^{o} is the pre-measured surface geometry, **S**^{k} is the reconstructed 3D geometry for a fringe order with an offset of *k*, and *d*(·, ·) measures the distance between the two surfaces. Measuring the distance between two arbitrary surfaces is often complex, involving surface registration and discrete point-to-surface distance computation. Fortunately, open-source software packages such as MeshLab (http://meshlab.sourceforge.net) offer the iterative closest point (ICP) algorithm for surface registration; the root-mean-square (rms) error between the two registered point clouds can then be used to evaluate the *closeness* of two surfaces.
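Once the clouds are registered (e.g., by ICP in MeshLab), the rms closeness measure itself is simple. The brute-force nearest-neighbor sketch below uses plain NumPy with no MeshLab dependency and is fine for small clouds.

```python
import numpy as np

def rms_closeness(ref, test):
    """RMS of nearest-neighbor distances from each point of `test` to the
    registered reference cloud `ref` (both N x 3 arrays). Brute force
    O(N*M) pairwise distances; use a k-d tree for large clouds."""
    d2 = np.sum((test[:, None, :] - ref[None, :, :]) ** 2, axis=2)
    return np.sqrt(np.mean(d2.min(axis=1)))
```

The candidate *k* whose reconstruction yields the smallest rms value against the pre-measured surface is then taken as *k*_{0}.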

## 3. Experiment

We developed a 3D shape measurement system to test the performance of our proposed method. The hardware system includes a single charge-coupled device (CCD) camera (Model: Point Grey Grasshopper3) and a digital light processing (DLP) projector (Model: LightCrafter 4500). The camera uses a 25 mm focal-length lens (Model: Fujinon CF25HA-1). The camera resolution was set to 1280 × 960 pixels, and the projector's resolution is 1140 × 912 pixels. The projector projects binary structured patterns at 100 Hz, and the camera captures simultaneously using external trigger signals generated by an Arduino. The entire system was calibrated by employing the structured light system calibration method developed by Li et al. [23].

We first verified that our optimization algorithm can generate absolute phase by measuring ping-pong balls. Figure 3 shows the measurement results of a single ping-pong ball. Figure 3(a) shows one of the three phase-shifted fringe patterns, and Fig. 3(b) shows the wrapped phase. Since the sphere surface is smooth, we employed a spatial phase unwrapping algorithm [22] to unwrap the phase, and Fig. 3(c) shows the spatially unwrapped phase map.

Since the spatially unwrapped phase is a *relative* phase, the phase map has to be properly shifted by 2*π* × *k*_{0} to recover an absolute 3D shape. We applied our optimization algorithm to the relative phase by adding different numbers of 2*π*'s. Figure 4 shows three 3D frames during the optimization process with different *k* values. As one can see, when *k* changes, the size of the reconstructed sphere changes accordingly. Figures 4(a)–4(c) show the 3D results when *k* = 15, 16, and 17, respectively. Figures 4(d)–4(f) show the cross sections of Figs. 4(a)–4(c). Visually, when *k* = 16, the reconstructed sphere matches the ideal sphere fairly well.

To automatically determine the proper number of 2*π*'s to shift the relative phase values, we fitted a sphere to each reconstructed 3D geometry and found the radius of each fitted sphere. Figure 5 shows the fitted sphere radius with respect to the changes of *k*. This figure shows that the relationship between the estimated radius and *k* is monotonic, and *k* = 16 gives the radius closest to 20 mm. We therefore used *k*_{0} = 16 for absolute phase recovery.

Once *k*_{0} is obtained, the absolute phase can be obtained, and the 3D shape can be further recovered. Figures 6(a) and 6(b) respectively show the unwrapped absolute phase and the reconstructed 3D shape. We then compared the result obtained from our proposed method with that obtained from the conventional temporal phase unwrapping method [10]. Figure 6(c) shows the absolute phase reconstructed by the temporal phase unwrapping method, and Fig. 6(d) shows the corresponding 3D shape; they appear identical.

We then plotted the same cross sections of the unwrapped phase, as shown in Fig. 7(a). From Fig. 7(a), we find that the unwrapped phase maps perfectly overlap with each other. Similarly, we plotted the same cross sections of the reconstructed 3D shapes, and Fig. 7(b) shows the result. Once again, they are identical. This experiment demonstrated that our proposed method can successfully measure a known-size ping-pong ball. Our result is consistent with the conventional absolute shape measurement technique, while requiring far fewer images.

To further verify that our proposed method works well for multiple known-size objects at different depths *z*, we simultaneously measured five ping-pong balls. Figure 8 shows the measurement results. Once again, we compared the results between our method and the conventional temporal phase unwrapping method, and Fig. 8 shows that they are identical. This experiment further demonstrated that our proposed optimization method can indeed recover an absolute phase map of multiple isolated known objects (e.g., ping-pong balls) without capturing additional images.

After proving that our proposed optimization method can recover the absolute shape of the ping-pong ball at different depths and different spatial locations, we can use it to assist absolute phase unwrapping for arbitrary objects. We simultaneously measured one ping-pong ball and two sculptures. The ping-pong ball was positioned slightly closer to the camera than the sculptures. Figure 9 shows the result. Figure 9(a) shows one of the three phase-shifted fringe patterns. From the three phase-shifted fringe patterns, we can obtain the texture image and the wrapped phase map, as shown in Figs. 9(b) and 9(c), respectively. From the texture image, we segment the ping-pong ball and obtain its wrapped phase accordingly, as shown in Figs. 9(d) and 9(e). Figure 9(f) shows the spatially unwrapped phase of the segmented ball. After employing our proposed optimization algorithm, the absolute shape of the ping-pong ball can be obtained, as shown in Fig. 9(g). Finally, we reconstructed the 3D absolute shape, and the result is shown in Fig. 9(h).

The reconstructed 3D geometry of the ping-pong ball gives the *z _{min}* value to generate the minimum phase map Φ* _{min}*, as discussed in Sec. 2.2. After Φ* _{min}* is obtained, the entire phase map can be unwrapped pixel by pixel. Figure 10(a) shows the unwrapped phase map, from which we can recover the 3D geometry of the entire scene. The 3D result is shown in Fig. 10(b). As before, we compared our measurement result with that obtained from the temporal phase unwrapping method [10]. Figure 10(c) shows the unwrapped phase and Fig. 10(d) shows the corresponding 3D reconstruction. Once again, they are identical to the results obtained from our proposed method.

Furthermore, we measured two isolated objects at distinct depths *z*, as shown in Fig. 11. Since the depth span of these two objects is larger than the maximum sensing range that Φ* _{min}* can handle, two ping-pong balls are required, with each being positioned close to one object. Figure 11(a) shows one fringe image of the captured scene. We segment each ping-pong ball together with its associated object, separate them into two sub-images, and process each sub-image following the same procedures as before. The top halves of Figs. 11(b) and 11(c) show the segmented sub-images, and the bottom halves show the recovered 3D shape for each sub-image.

We then combined the two separately recovered 3D shapes into one complete 3D scene. Figure 12 shows the complete measurement result. Again, we measured the same objects using gray coding, and the result is shown in Fig. 12(b). Figure 12(c) shows two cross sections of the measurement results using our proposed method and the gray coding method. Once again, they are identical.

Lastly, we experimentally verified that our proposed method can be used to measure objects moving along the depth *z* direction. In this experiment, we measured a ping-pong ball moving along with a hand. We projected fringe patterns of two different frequencies, which allowed us to recover the absolute phase using the enhanced two-frequency method [24] as a reference; our method only uses the three higher-frequency fringe patterns for 3D shape recovery. Figure 13, as well as the associated videos Visualization 1 and Visualization 2, shows the measurement results. Figure 13(a) shows one frame of the 3D result obtained from our proposed method using the three high-frequency fringe patterns. Figure 13(b) shows the result for the same frame using the enhanced two-frequency phase-shifting method with 6 fringe patterns. This experiment successfully demonstrated that our proposed method can measure a dynamic object moving along the depth *z* direction, enhancing the geometric-constraint based phase unwrapping method developed by An et al. [19]. Since only three fringe patterns are used, the proposed method is well suited for high-speed measurement applications.

It is important to note that, in these two visualizations, one may notice some periodic error in the reconstructed 3D data. The periodic error is a result of motion artifacts caused by the "slow" 2D fringe image acquisition speed (i.e., 100 Hz): it takes approximately 33 ms to capture the three phase-shifted fringe patterns needed to reconstruct one 3D frame. For the slowly moving portions of the hand motion, 33 ms is short enough not to introduce any problems. However, in some portions of the video recording, the hand moves faster and 33 ms is too long. As one can clearly see, the motion-introduced artifacts appear similarly in the results reconstructed from our method and in those from the two-frequency phase-shifting method.

## 4. Summary

This paper has presented a novel method for absolute three-dimensional (3D) shape measurement that uses a known-size object (i.e., a ping-pong ball with a radius of 20 mm) to provide cues for absolute phase unwrapping. The phase of the segmented ping-pong ball is spatially unwrapped to retrieve a relative phase, and the integer multiple of 2*π* to be added for absolute phase recovery is obtained through an optimization process. The nearest point of the reconstructed 3D ball provides *z _{min}* to generate the minimum phase Φ* _{min}* for pixel-by-pixel phase unwrapping of the entire wrapped phase map. Experimental results demonstrated the success of our proposed method in measuring the absolute shape of moving objects using only three phase-shifted fringe patterns.

## Funding Information

National Science Foundation (NSF) (CMMI-1531048); Natural Science Foundation of Zhejiang Province of China (NSFZC) (LY14F020025).

## Acknowledgments

The authors would like to thank other graduate and undergraduate students in our laboratory for their valuable discussions and for the assistance on experiments. This work was not possible without their support and help.

## References and links

**1. **S. Zhang, “Recent progresses on real-time 3-d shape measurement using digital fringe projection techniques,” Opt. Laser Eng. **48**, 149–158 (2010). [CrossRef]

**2. **D. Malacara, ed., *Optical Shop Testing*, 3rd ed. (John Wiley and Sons, 2007). [CrossRef]

**3. **D. C. Ghiglia and M. D. Pritt, eds., *Two-Dimensional Phase Unwrapping: Theory, Algorithms, and Software* (John Wiley and Sons, 1998).

**4. **X. Su and W. Chen, “Reliability-guided phase unwrapping algorithm: a review,” Opt. Laser Eng. **42**, 245–261 (2004). [CrossRef]

**5. **M. Zhao, L. Huang, Q. Zhang, X. Su, A. Asundi, and Q. Kemao, “Quality-guided phase unwrapping technique: comparison of quality maps and guiding strategies,” Appl. Opt. **50**, 6214–6224 (2011). [CrossRef] [PubMed]

**6. **Y.-Y. Cheng and J. C. Wyant, “Two-wavelength phase shifting interferometry,” Appl. Opt. **23**, 4539–4543 (1984). [CrossRef] [PubMed]

**7. **Y.-Y. Cheng and J. C. Wyant, “Multiple-wavelength phase shifting interferometry,” Appl. Opt. **24**, 804–807 (1985). [CrossRef]

**8. **D. P. Towers, J. D. C. Jones, and C. E. Towers, “Optimum frequency selection in multi-frequency interferometry,” Opt. Lett. **28**, 1–3 (2003). [CrossRef]

**9. **G. Sansoni, M. Carocci, and R. Rodella, “Three-dimensional vision based on a combination of gray-code and phase-shift light projection: Analysis and compensation of the systematic errors,” Appl. Opt. **38**, 6565–6573 (1999). [CrossRef]

**10. **S. Zhang, “Flexible 3d shape measurement using projector defocusing: Extended measurement range,” Opt. Lett. **35**, 931–933 (2010).

**11. **K. Zhong, Z. Li, Y. Shi, and C. Wang, “Analysis of solving the point correspondence problem by trifocal tensor for real-time phase measurement profilometry,” Proc. SPIE , **8493**, 849311 (2012). [CrossRef]

**12. **Z. Li, K. Zhong, Y. Li, X. Zhou, and Y. Shi, “Multiview phase shifting: a full-resolution and high-speed 3d measurement framework for arbitrary shape dynamic objects,” Opt. Lett. **38**, 1389–1391 (2013). [CrossRef] [PubMed]

**13. **K. Zhong, Z. Li, Y. Shi, C. Wang, and Y. Lei, “Fast phase measurement profilometry for arbitrary shape objects without phase unwrapping,” Opt. Laser Eng. **51**, 1213–1222 (2013). [CrossRef]

**14. **C. Brauer-Burchardt, P. Kuhmstedt, and G. Notni, “Phase unwrapping using geometric constraints for high-speed fringe projection based 3d measurements,” Proc. SPIE , **8789**, 878906 (2013). [CrossRef]

**15. **R. Ishiyama, S. Sakamoto, J. Tajima, T. Okatani, and K. Deguchi, “Absolute phase measurements using geometric constraints between multiple cameras and projectors,” Appl. Opt. **46**, 3528–3538 (2007). [CrossRef] [PubMed]

**16. **C. Brauer-Burchardt, P. Kuhmstedt, M. Heinze, P. Kuhmstedt, and G. Notni, “Using geometric constraints to solve the point correspondence problem in fringe projection based 3d measuring systems,” Lect. Notes Comput. Sci. **6979**, 265–274 (2011). [CrossRef]

**17. **Y. R. Huddart, J. D. R. Valera, N. J. Weston, and A. J. Moore, “Absolute phase measurement in fringe projection using multiple perspectives,” Opt. Express **21**, 21119–21130 (2013). [CrossRef] [PubMed]

**18. **W. Lohry, V. Chen, and S. Zhang, “Absolute three-dimensional shape measurement using coded fringe patterns without phase unwrapping or projector calibration,” Opt. Express **22**, 1287–1301 (2014). [CrossRef] [PubMed]

**19. **Y. An, J.-S. Hyun, and S. Zhang, “Pixel-wise absolute phase unwrapping using geometric constraints of structured light system,” Opt. Express **24**, 18445–18459 (2016). [CrossRef] [PubMed]

**20. **S. Zhang and P. S. Huang, “Novel method for structured light system calibration,” Opt. Eng. **45**, 083601 (2006). [CrossRef]

**21. **Y. An, Z. Liu, and S. Zhang, "Evaluation of pixel-wise geometric constraint-based phase unwrapping method for low signal-to-noise-ratio (snr) phase," Adv. Opt. Technol. **5**, 423–432 (2016).

**22. **S. Zhang, X. Li, and S.-T. Yau, “Multilevel quality-guided phase unwrapping algorithm for real-time three-dimensional shape reconstruction,” Appl. Opt. **46**, 50–57 (2007). [CrossRef]

**23. **B. Li, N. Karpinsky, and S. Zhang, “Novel calibration method for structured light system with an out-of-focus projector,” Appl. Opt. **53**, 3415–3426 (2014). [CrossRef] [PubMed]

**24. **J.-S. Hyun and S. Zhang, “Enhanced two-frequency phase-shifting method,” Appl. Opt. **55**, 4395–4401 (2016). [CrossRef] [PubMed]