Abstract
Two major methods for 3D reconstruction in fringe projection profilometry, phase-height mapping and stereovision, have their respective drawbacks: the former has low flexibility in practical application due to system restrictions, and the latter requires time-consuming homologous point searching. Given these limitations, we propose a phase-3D mapping method, developed from a back-projection stereovision model, to achieve flexible and highly efficient 3D reconstruction for fringe projection profilometry. We show that all dimensional coordinates (X, Y, and Z) of a measured point, not just the height coordinate (Z), can be mapped from phase directly and independently through corresponding rational functions. To determine the phase-3D mapping coefficients, we designed a flexible two-step calibration strategy. The first step, ray reprojection calibration, determines the stereovision system parameters; the second step, sampling-mapping calibration, fits the mapping coefficients using the calibrated stereovision system parameters. Experimental results demonstrate that the proposed method is suitable for flexible and highly efficient 3D reconstruction, eliminating practical restrictions and dispensing with time-consuming homologous point searching.
© 2017 Optical Society of America
1. Introduction
Fringe projection profilometry (FPP) is a widely used optical three-dimensional (3D) technology offering high speed, high accuracy, and high resolution [1,2]. An FPP system generally adopts a projector-camera setup to project sinusoidal fringe patterns and then capture the distorted fringe images for phase computation and final 3D reconstruction. Existing reconstruction methods in FPP fall into two main classical categories: phase-height mapping (PHM) [3–25] and stereovision (SV) [26–40].
PHM, which has been studied since the 1980s, lends itself to efficient conversion of height-modulated phase into height coordinates [3–22]. Takeda and Mutoh [3] derived a formula for converting phase difference into height relative to a reference plane by means of similar-triangle theory. Liu et al. [9] presented a mapping model representing the intersection of a line of sight with equiphase planes. Du et al. [14] established another model mapping the phase to the relative height of objects through line equations of acquisition and projection. However, PHM has some practical restrictions: for example, the system structure is geometrically constrained so that the optical axis of the camera or the projector is perpendicular to the reference plane, the line connecting the optical centers of the camera and the projector is parallel to the reference plane, etc. PHM generally requires a reference plane to compute the phase difference and relative height, which limits the measuring volume. Moreover, PHM needs a translation stage or gage blocks to obtain precise heights for calibration, which leads to a loss of accuracy for measured points lying outside the calibrated volume. Other studies have reported alternative mapping methods that use an approximate polynomial to represent the mapping relation directly [23–25]. For instance, Vargas et al. [23] described the relationship between the depth coordinate and the absolute phase by a monotonic polynomial. These methods achieve polynomial fitting through camera calibration instead of a translation stage or gage blocks, and no longer need a reference plane. Nevertheless, the approximate polynomial lacks a strict derivation associated with imaging models, so the degree of the polynomial cannot be determined on a principled basis. In particular, lens distortion has not yet been fully taken into account; in fact, the degree of the polynomial is closely related to the lens distortion model.
In contrast, an SV-based FPP system can be set up as a binocular framework with more degrees of freedom, relaxing the restrictions PHM requires. By searching for homologous image points between the camera and the projector, the 3D coordinates of measured points can be reconstructed once the system parameters have been determined. The system parameters can be worked out through a flexible SV calibration. Legarda-Sáenz et al. [30] introduced a rigidity constraint between the camera and the projector to simultaneously estimate the system parameters, improving calibration reliability. Zhang and Huang [31] used a novel red/blue checkerboard instead of the conventional black/white one to make the calibration of the projector identical to that of the camera. In Chen's method [35], a plane target was moved along its normal direction to construct an accurate 3D lattice of benchmarks distributed uniformly within the measurement volume, which made the distribution of calibration error as uniform as possible. Yin et al. [38] presented an FPP calibration with a bundle adjustment strategy to obtain a more accurate result, even when using an imperfectly printed target pattern. SV using a linear model without considering lens distortion is efficient; however, the process of homologous point searching incurs a nonnegligible time cost.
In this paper, we propose a novel phase mapping method that makes the best use of the advantages and bypasses the disadvantages of PHM and SV. We derive a phase-3D mapping (P3DM) model in strict accordance with the back-projection SV model. Based on P3DM, all dimensional coordinates (X, Y, and Z) of a measured point are associated only with the phase, so they can be reconstructed from the phase directly and independently, avoiding the time-consuming coordinate transformation and homologous point searching required by the SV method. To determine the mapping coefficients of P3DM, we designed a flexible two-step calibration strategy. The first step is ray reprojection calibration, which determines the system parameters of the back-projection SV model; the second step is sampling-mapping calibration, which fits the mapping coefficients of the P3DM model using the calibrated system parameters. As a result, P3DM achieves flexible and highly efficient 3D reconstruction without practical restrictions.
2. Imaging model
The FPP system typically consists of a digital camera and a digital-micromirror-device (DMD) projector (Fig. 1). Since orthogonal absolute phases provide correspondence between the camera and DMD image points, the projector can be treated as performing an inverse imaging process [31], so the FPP system can be modeled as a binocular system. To facilitate elaboration of the method, in this section we briefly describe the related imaging models.
2.1. Back-projection camera model
The imaging process of a camera is a perspective projection. An object point, expressed in a world coordinate system (WCS) or, equivalently, in a camera coordinate system (CCS), is projected onto the image plane through the projection center. The process can be modeled as
where the tilde denotes a homogeneous coordinate; the image point, expressed in an image coordinate system, is the projection of the object point on the image plane; the rotation matrix (corresponding to three rotation angles) and the translation vector represent the transformation from the WCS to the CCS; the camera parameter matrix comprises the equivalent focal lengths along the image coordinate axes, the image coordinates of the principal point, and the skew factor of the image axes; and the remaining factor is a scale factor.

In practice, aberrations of the imaging lens inevitably exist, so lens distortion should be considered in the camera model. The lens distortion can be modeled as
where the distorted image point is obtained from the ideal one through a distortion correction term associated with radial and decentering distortion; the correction depends on the distance from the distorted image point to the principal point and on the radial and decentering distortion parameters, which are collected into a lens distortion vector. Generally, three radial terms and two decentering terms are sufficient. Thus the complete camera model involving lens distortion can be represented by combining Eqs. (1) and (2). This is a back-projection camera model: it projects an image point to an object point up to a scale factor and thus accords with the process of 3D reconstruction.

2.2. Back-projection SV model
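The lens distortion model with three radial and two decentering terms adopted here is the common Brown model. The following is a minimal sketch of applying such a correction; the function name and argument layout are illustrative, not the paper's notation.

```python
import numpy as np

def distort(u, v, u0, v0, k, p):
    """Apply a Brown-style lens distortion with three radial terms
    k = (k1, k2, k3) and two decentering terms p = (p1, p2).
    (u0, v0) is the principal point; coordinates are in image units."""
    x, y = u - u0, v - v0
    r2 = x * x + y * y                       # squared distance to principal point
    radial = k[0] * r2 + k[1] * r2**2 + k[2] * r2**3
    du = x * radial + 2.0 * p[0] * x * y + p[1] * (r2 + 2.0 * x * x)
    dv = y * radial + p[0] * (r2 + 2.0 * y * y) + 2.0 * p[1] * x * y
    return u + du, v + dv
```

With all distortion parameters zero the correction vanishes, and the correction grows polynomially (up to 7th degree in the centered coordinates) as a point moves away from the principal point, which is what fixes the polynomial degrees used later in the derivation.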
The projector can be treated as an inverse camera and can thus adopt the above camera model, with the subscript c changed to the subscript p. When the camera and the projector are fixed with respect to each other, structural parameters (a rotation and a translation) can be introduced to represent the rigid transformation from the CCS to a projector coordinate system (PCS):
Therefore, the back-projection SV model can be represented as
3. Phase-3D mapping
In this section, we prove that there exist mappings from phase to all dimensional coordinates (X, Y, and Z) of a measured point, in strict accordance with the back-projection SV model. This means that an image point with a given phase can be mapped to its corresponding 3D spatial point directly, without any time-consuming coordinate transformations or homologous point searching. The derivation of P3DM follows.
Suppose vertical fringe patterns are projected in FPP. On the DMD image plane, the phase is proportional to the horizontal image coordinate. Moreover, the absolute phases of homologous image points on the camera and DMD image planes are equal. Thus there is a linear mapping from the phase to the horizontal DMD image coordinate such that
where the superscript L represents a linear mapping relationship.

The camera image point defines an epipolar line on the DMD image plane according to the epipolar geometry constraint, as shown in Fig. 2. Because of lens distortion, the actual (i.e., distorted) epipolar line is a curve. The curve is bounded, so it can be approximated by a polynomial curve by virtue of the Weierstrass approximation theorem. Since the homologous image point lies on this curve, its second image coordinate can be expressed as a polynomial function of the first such that
where the superscript P represents a polynomial mapping relationship.

The distorted image point on the DMD image plane can be transformed into the PCS. From Eq. (5), the transformation can be represented as
Equation (8) contains linear mappings from the image coordinates to the transformed ones such that

The distorted image point is then corrected to an undistorted image point. According to the lens distortion model of Eq. (2), the coordinate corrections are polynomial mappings from the distorted coordinates to the undistorted ones such that
Finally, the two rays back-projected from the camera and DMD image points intersect at a 3D spatial point. According to the back-projection SV model of Eq. (5), the ray intersection can be formulated as
Considering the epipolar geometry constraint, expressed through the antisymmetric matrix associated with the structural translation vector, Eq. (11) has three degrees of freedom. So the 3D coordinates of the spatial point can be worked out exactly in terms of the elements of the calibrated matrices and the image coordinates. For a fixed FPP system and a specific camera image point, these parameters and the camera image coordinates are determined. Thus the spatial coordinates X, Y, and Z are functions of the DMD image coordinates only, and Eq. (12) can be rewritten as Eq. (13), whose coefficients are constants.

By combining Eqs. (6), (7), (9), (10), and (13), we derive the mappings from phase to 3D coordinates, respectively, to be
where the coefficients are the mapping coefficients. This is the P3DM model developed from the back-projection SV model for FPP. The 3D reconstruction is related only to the phase, and all dimensional coordinates (X, Y, and Z) can be mapped from a phase directly and independently. Therefore, P3DM can achieve highly efficient 3D reconstruction.

In addition, the degree N of the polynomial is determined by the lens distortion and the distorted epipolar line. Since we adopt the lens distortion model with three radial terms and two decentering terms, the function in Eq. (10) is a 7th-degree polynomial; supposing the distorted epipolar line in Eq. (7) is sufficiently approximated by a 5th-degree polynomial curve, the degree N of P3DM is 35. A polynomial of high degree may be susceptible to disturbance when the independent variable is large, and it increases computational complexity and storage requirements. Accordingly, in practical applications, the degree of the polynomial can be adjusted as a trade-off between accuracy and efficiency.
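The structure of the mapping, and the degree bookkeeping above, can be sketched numerically. Each coordinate is a rational function of the phase, and composing a 7th-degree polynomial (the distortion correction of Eq. (10)) with a 5th-degree one (the epipolar curve of Eq. (7)) yields degree 35. The helper `p3dm_map` below is illustrative, not the paper's notation.

```python
import numpy as np
from numpy.polynomial import Polynomial

def p3dm_map(phi, num, den):
    """Evaluate one coordinate of a phase-3D mapping in the rational form
    C(phi) = P(phi) / Q(phi); num and den are highest-degree-first, as
    accepted by numpy.polyval."""
    return np.polyval(num, phi) / np.polyval(den, phi)

# Degree bookkeeping: composing a 7th-degree polynomial with a
# 5th-degree polynomial gives degree 7 * 5 = 35.
distortion = Polynomial([0.0] * 7 + [1.0])   # stands in for Eq. (10): x**7
epipolar = Polynomial([0.0] * 5 + [1.0])     # stands in for Eq. (7):  x**5
assert distortion(epipolar).degree() == 35
```

The assertion confirms the stated degree; in practice lower-degree truncations of the same rational form trade a little accuracy for speed, as the text notes.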
4. P3DM method
Based on the derivation of P3DM, we propose the P3DM method, which includes a flexible two-step calibration and highly efficient 3D reconstruction. The overall flow chart is shown in Fig. 3.
4.1. P3DM calibration
The goal of P3DM calibration is to determine the mapping coefficients enabling the FPP system to reconstruct 3D surface profiles of objects from the modulated phase. In fact, each camera image point can be calibrated with its own independent set of mapping coefficients, so that each dimension of each measured point can be reconstructed independently. To determine the mapping coefficients, we designed a two-step calibration strategy: ray reprojection calibration and sampling-mapping calibration.
Ray reprojection calibration
Ray reprojection calibration determines the system parameters of the back-projection SV model, involving the parameter matrices, the lens distortion vectors, and the structural parameters of the camera and the projector. For concision and clarity, we group them into camera, projector, and structural parameter sets.
Traditional camera calibration minimizes the image reprojection error, i.e., the image distance between a projected image point and a measured one [30,38]. It suits the forward-projection camera model, which projects an object point to an image point. Because the SV model used to derive P3DM is based on the back-projection camera model, the traditional calibration strategy cannot be applied here. Instead, we consider ray reprojection, which back-projects an image point into a spatial ray through the projection center. Ideally, the back-projected spatial ray passes through the object point related to the image point. Therefore, ray reprojection calibration can be performed by minimizing the ray reprojection error, i.e., the spatial distance from an object point to its back-projected spatial ray.
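The ray reprojection error described above is a point-to-ray distance in 3D. A minimal sketch, assuming the ray is given by its origin (the projection center) and a direction vector:

```python
import numpy as np

def ray_reprojection_error(point, origin, direction):
    """Spatial distance from a benchmark `point` to the ray back-projected
    from an image point, where `origin` is the projection center and
    `direction` the ray direction (need not be unit length)."""
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)                 # unit ray direction
    v = np.asarray(point, dtype=float) - np.asarray(origin, dtype=float)
    # Remove the component of v along the ray; what is left is the
    # perpendicular offset, whose norm is the point-to-ray distance.
    return np.linalg.norm(v - np.dot(v, d) * d)
```

Summing this quantity over all benchmarks and orientations gives the objective that the calibration minimizes.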
A planar target carrying M benchmarks with known world coordinates is placed arbitrarily in N different orientations. In each orientation, one image of the planar target under uniform illumination is employed to locate the image positions of the benchmarks, and further images under orthogonal fringe projection are utilized to compute crossed absolute phase maps. Homologous image points on the DMD image plane are then acquired from the phase maps. The ray reprojection calibration, minimizing the sum of ray reprojection errors, can be represented as
where the minimizer is the calibrated system parameter set and each summand is a ray reprojection error. In general, the epipolar constraint is also taken into account when calibrating a binocular system; in this situation, the calibration procedure is modified as

where the additional term is the distance from a homologous point to its corresponding epipolar line. An appropriate optimization algorithm, such as Levenberg–Marquardt, can then be employed to work out the system parameters.

Sampling-mapping calibration
Sampling-mapping calibration determines the mapping coefficients of Eq. (14) using the calibrated system parameters. This procedure requires a sufficient number of points with known 3D coordinates and absolute phases for coefficient fitting. In the P3DM method, these data are obtained by means of the calibrated system parameters, without the aid of a translation stage or gage blocks.
The flexible sampling-mapping calibration is summarized as follows:
- Step 1, back-project a camera image point into a spatial ray using the calibrated system parameters;
- Step 2, sample a series of points along the spatial ray within the measurement volume, project these sampled points onto the DMD image plane using the calibrated system parameters, and work out the absolute phases of the resulting image points to form phase-3D pairs;
- Step 3, fit the mapping coefficients of Eq. (14) to these phase-3D pairs;
- Step 4, repeat the first three steps for every camera image point, and finally assemble the mapping coefficients into a P3DM look-up table (LUT).
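The per-pixel coefficient fit of Step 3 reduces to an ordinary least-squares polynomial fit once the phase-3D pairs are sampled. A minimal sketch with a synthetic mapping (the function name and the test cubic are illustrative):

```python
import numpy as np

def fit_mapping(phases, coords, degree):
    """Least-squares fit of a polynomial mapping from the sampled absolute
    phases to one coordinate (X, Y, or Z) of the sampled points.
    Returns coefficients highest-degree-first, as numpy.polyfit does."""
    return np.polyfit(phases, coords, degree)

# Synthetic check: sample a known cubic mapping on 101 points per ray
# (the sampling count used in the experiments) and recover it.
phases = np.linspace(1.0, 10.0, 101)
coords = 2.0 * phases**3 - 0.5 * phases + 3.0
coeffs = fit_mapping(phases, coords, 3)
```

Repeating this fit for each camera pixel and each of the three coordinates yields the LUT of Step 4.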
4.2. P3DM reconstruction
In a P3DM-based FPP system, 3D reconstruction can be performed very efficiently. After P3DM calibration, the highly efficient P3DM reconstruction includes the following three steps:
- Step 1, project a group of single-direction fringe patterns onto the surface of the measured objects, and capture the fringe images to retrieve an absolute phase map;
- Step 2, retrieve the mapping coefficients corresponding to each image point from the LUT;
- Step 3, substitute the phase and the mapping coefficients into Eq. (14) to reconstruct the corresponding X-, Y-, and Z-coordinates.
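The three steps above amount to a per-pixel table lookup followed by three independent polynomial evaluations. A minimal sketch, with the LUT represented as a dictionary from pixel coordinates to three coefficient vectors (an illustrative layout, not the paper's data structure):

```python
import numpy as np

def reconstruct(phase_map, lut):
    """Map an absolute phase map to XYZ coordinates, pixel by pixel.
    `lut` maps an image coordinate (r, c) to three coefficient vectors
    (highest-degree-first), one per spatial coordinate."""
    h, w = phase_map.shape
    xyz = np.empty((h, w, 3))
    for (r, c), phi in np.ndenumerate(phase_map):
        cx, cy, cz = lut[(r, c)]
        xyz[r, c] = (np.polyval(cx, phi),
                     np.polyval(cy, phi),
                     np.polyval(cz, phi))
    return xyz

phase_map = np.array([[2.0]])
lut = {(0, 0): ([1.0, 0.0], [2.0, 0.0], [0.0, 5.0])}
xyz = reconstruct(phase_map, lut)
```

Because each pixel and each coordinate is independent, this loop parallelizes trivially, which is what makes the GPU implementation reported later straightforward.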
5. Experiments and analysis
In this section, experiments and analysis concerning the validity and performance of the proposed P3DM method are presented. We set up an FPP system consisting of a camera (DH MER-130-30UM, 1280×1024) with a 16 mm TV lens (PANTAX) and a projector (Dell M110, 1280×800). The experiments included P3DM calibration and P3DM reconstruction.
During P3DM calibration, the ray reprojection calibration was implemented first. A planar target with 99 circular benchmarks served as the calibration reference. To identify homologous benchmarks between the camera and DMD image planes, crossed phase maps retrieved from orthogonal fringe patterns were utilized. In our experiment, the planar target was placed at 7 positions in a measuring volume of 400 mm × 400 mm × 300 mm. Figure 4 shows the target at one of the calibrated positions: the crossed phase maps in Figs. 4(a) and 4(b), and the benchmarks located on the camera and DMD image planes in Fig. 4(c). We chose the strategy of Eq. (16) to calibrate the binocular system by minimizing the sum of the ray reprojection error and the epipolar distance error.
The final calibrated parameters are listed in Table 1. Using these parameters, the ray reprojection at a specific position is visualized in Fig. 5(a). Each pair of rays back-projected from homologous image points passes very close to the corresponding benchmark on the target. To analyze the calibration precision, we reconstructed the 3D coordinates of these benchmarks using the calibrated system parameters and compared the result with the known dimensions to obtain the SV reconstruction error. Error distributions of the reconstructed 3D coordinates over the image coordinates at this position are shown in Fig. 5(b). The errors are randomly distributed, because the systematic error introduced by lens distortion has been largely eliminated.
Subsequently, sampling-mapping calibration was implemented by means of the calibrated system parameters. In the experiment, we back-projected a ray from each camera image point and sampled 101 points on each back-projected ray. Using these sampled data, the mapping coefficients could in principle be calculated through an optimization procedure; however, we found that the optimization result was sensitive to the initial value and tended to converge to local minima. We therefore considered the polynomial form on the right-hand side of Eq. (14). In our experiment, the constant terms were constrained to be zero, in which case Eq. (14) reduces to polynomial forms such that
The polynomial fit can be implemented simply, without an initial value. We used a 4th-degree polynomial for the X- and Y-coordinates and a 5th-degree polynomial for the Z-coordinate. A mapping coefficient set related to the image point (240, 112) is listed in Table 2, and the corresponding fit-error curves are shown in Fig. 6. Table 3 lists the maximum (MAX) and root-mean-square (RMS) values of the fit errors for different camera image points. The fit errors were less than 0.1 micrometer, which is quite small, indicating that the polynomial degrees selected in our experiment are sufficiently accurate for field application.

After the P3DM calibration was finished, P3DM reconstruction could be performed with the FPP system. We used a standard sphere to test the precision of the P3DM reconstruction, as shown in Fig. 7(a). The standard sphere was reconstructed by the P3DM method and the SV method, respectively; Figs. 7(b) and 7(c) show the respective 3D reconstruction results. Moreover, the MAX and RMS values of the point distances between the two 3D models in different dimensions are listed in Table 4. The point distances are quite small in all dimensions; therefore, the P3DM method has the same precision level as the SV method. Even so, it should be noted that the P3DM method, containing polynomial operations, may smooth the reconstructed geometry to a small degree.
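One way to realize the constrained fit used above (constant terms fixed to zero, solved directly so that no initial value is needed) is a linear least-squares solve against a Vandermonde matrix whose constant column is dropped; the helper below is a sketch of that reading, not the authors' implementation.

```python
import numpy as np

def fit_no_constant(phases, coords, degree):
    """Least-squares polynomial fit with the constant term constrained to
    zero. Returns coefficients for phi**1 .. phi**degree (lowest first).
    Solved in closed form, so no initial value is required."""
    # Vandermonde matrix [phi, phi**2, ..., phi**degree]: the column of
    # ones (the constant term) is omitted.
    A = np.vander(phases, degree + 1, increasing=True)[:, 1:]
    sol, *_ = np.linalg.lstsq(A, coords, rcond=None)
    return sol

phases = np.linspace(1.0, 10.0, 101)
coords = 3.0 * phases + 0.25 * phases**2       # synthetic zero-constant mapping
coeffs = fit_no_constant(phases, coords, 2)
```

The same solve is repeated per pixel and per coordinate; being non-iterative, it avoids the local-minima sensitivity observed with the general optimization.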
Although the precisions of P3DM and SV are identical, their efficiencies differ significantly. In P3DM, all dimensional coordinates are mapped from phase directly and independently, so the 3D reconstruction efficiency of P3DM is higher than that of SV. To demonstrate this point, we compared the two methods in terms of the time cost of 3D reconstruction in three different scenes. The 3D reconstruction results are shown in Fig. 8, and the corresponding data are listed in Table 5. The time cost of SV reconstruction is much larger than that of P3DM reconstruction, and the gap widens as the number of measured points increases.
In summary, our experiments indicated that the P3DM method has the same precision as, but significantly higher reconstruction efficiency than, the SV method. By exploiting the highly efficient P3DM, FPP can implement high-speed 3D imaging. In the following experiment, such a high-speed FPP system was set up, consisting of a Basler camera (ACA640-120GM, 640×480) with a 12 mm TV lens (PANTAX) and a TI DLP projector (LightCrafter 3000, 608×684), as shown in Fig. 9. The time cost of 3D reconstruction was reduced to 3 ms with the aid of GPU computation. Consequently, high-speed 3D imaging and rendering of moving objects were performed with this FPP system. Figure 10 shows the 3D imaging of a moving plaster model rotated on a turntable (Visualization 1).
6. Conclusion
In this paper, we proposed a P3DM method based on the back-projection SV model for FPP, in which all dimensional coordinates (X, Y, and Z) can be mapped from phase directly and independently. Correspondingly, a flexible two-step calibration strategy, comprising ray reprojection calibration and sampling-mapping calibration, was designed to determine the mapping coefficients of P3DM. P3DM achieves the same precision as SV with significantly higher 3D reconstruction efficiency, verifying that P3DM is suitable for flexible and highly efficient 3D reconstruction in practical applications.
Funding
National Natural Science Foundation of China (NSFC) (61377017); Scientific and Technological Project of the Shenzhen Government (JCYJ20140828163633999).
References and Links
1. S. S. Gorthi and P. Rastogi, “Fringe projection techniques: Whither we are?” Opt. Lasers Eng. 48(2), 133–140 (2010). [CrossRef]
2. S. Zhang, “Recent progresses on real-time 3D shape measurement using digital fringe projection techniques,” Opt. Lasers Eng. 48(2), 149–158 (2010). [CrossRef]
3. M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-D object shapes,” Appl. Opt. 22(24), 3977–3982 (1983). [CrossRef] [PubMed]
4. V. Srinivasan, H. C. Liu, and M. Halioua, “Automated phase-measuring profilometry of 3-D diffuse objects,” Appl. Opt. 23(18), 3105–3108 (1984). [CrossRef] [PubMed]
5. G. Sansoni, L. Biancardi, U. Minoni, and F. Docchio, “A novel, adaptive system for 3-D optical profilometry using a liquid crystal light projector,” IEEE Trans. Instrum. Meas. 43(4), 558–566 (1994). [CrossRef]
6. W. S. Zhou and X. Y. Su, “A direct mapping algorithm for phase-measuring profilometry,” J. Mod. Opt. 41(1), 89–94 (1994). [CrossRef]
7. A. Asundi and Z. Wensen, “Unified calibration technique and its applications in optical triangular profilometry,” Appl. Opt. 38(16), 3556–3561 (1999). [CrossRef] [PubMed]
8. R. Sitnik, M. Kujavinska, and J. Woznicki, “Digital fringe projection system for large-volume 360-deg shape measurement,” Opt. Eng. 41(2), 443–449 (2002). [CrossRef]
9. H. Liu, W.-H. Su, K. Reichard, and S. Yin, “Calibration-based phase-shifting projected fringe profilometry for accurate absolute 3D surface profile measurement,” Opt. Commun. 216(1–3), 65–80 (2003). [CrossRef]
10. X. Peng, Z. Yang, and H. Niu, “Multi-resolution reconstruction of 3-D image with modified temporal unwrapping algorithm,” Opt. Commun. 224(1–3), 35–44 (2003). [CrossRef]
11. Z. Zhang, D. Zhang, and X. Peng, “Performance analysis of a 3D full-field sensor based on fringe projection,” Opt. Lasers Eng. 42(3), 341–353 (2004). [CrossRef]
12. B. A. Rajoub, D. R. Burton, and M. J. Lalor, “A new phase-to-height model for measuring object shape using collimated projections of structured light,” J. Opt. A, Pure Appl. Opt. 7(6), S368–S375 (2005). [CrossRef]
13. P. Jia, J. Kofman, and C. English, “Comparison of linear and nonlinear calibration methods for phase-measuring profilometry,” Opt. Eng. 46(4), 043601 (2007). [CrossRef]
14. H. Du and Z. Wang, “Three-dimensional shape measurement with an arbitrarily arranged fringe projection profilometry system,” Opt. Lett. 32(16), 2438–2440 (2007). [CrossRef] [PubMed]
15. A. Maurel, P. Cobelli, V. Pagneux, and P. Petitjeans, “Experimental and theoretical inspection of the phase-to-height relation in Fourier transform profilometry,” Appl. Opt. 48(2), 380–392 (2009). [CrossRef] [PubMed]
16. L. Huang, P. S. Chua, and A. Asundi, “Least-squares calibration method for fringe projection profilometry considering camera lens distortion,” Appl. Opt. 49(9), 1539–1548 (2010). [CrossRef] [PubMed]
17. Z. Zhang, H. Ma, S. Zhang, T. Guo, C. E. Towers, and D. P. Towers, “Simple calibration of a phase-based 3D imaging system based on uneven fringe projection,” Opt. Lett. 36(5), 627–629 (2011). [CrossRef] [PubMed]
18. Y. Xiao, Y. Cao, and Y. Wu, “Improved algorithm for phase-to-height mapping in phase measuring profilometry,” Appl. Opt. 51(8), 1149–1155 (2012). [CrossRef] [PubMed]
19. I. Léandry, C. Brèque, and V. Vallea, “Calibration of a structured-light projection system: Development to large dimension objects,” Opt. Lasers Eng. 50(3), 373–379 (2012). [CrossRef]
20. Z. Zhang, S. Huang, S. Meng, F. Gao, and X. Jiang, “A simple, flexible and automatic 3D calibration method for a phase calculation-based fringe projection imaging system,” Opt. Express 21(10), 12218–12227 (2013). [CrossRef] [PubMed]
21. Z. Huang, J. Xi, Y. Yu, Q. Guo, and L. Song, “Improved geometrical model of fringe projection profilometry,” Opt. Express 22(26), 32220–32232 (2014). [CrossRef] [PubMed]
22. N. Karpinsky, M. Hoke, V. Chen, and S. Zhang, “High-resolution, real-time three-dimensional shape measurement on graphics processing unit,” Opt. Eng. 53(2), 024105 (2014). [CrossRef]
23. J. Vargas, J. A. Quiroga, and M. J. Terron-Lopez, “Flexible calibration procedure for fringe projection profilometry,” Opt. Eng. 46(2), 023601 (2007). [CrossRef]
24. A. Li, X. Peng, Y. Yin, X. Liu, Q. Zhao, K. Körner, and W. Osten, “Fringe projection based quantitative 3D microscopy,” Optik (Stuttg.) 124(21), 5052–5056 (2013). [CrossRef]
25. J. Huang and Q. Wu, “A new reconstruction method based on fringe projection of three-dimensional measuring system,” Opt. Lasers Eng. 52, 115–122 (2014). [CrossRef]
26. V. Kirschner, W. Schreiber, R. M. Kowarschik, and G. Notni, “Self-calibration shape-measuring system based on fringe projection,” Proc. SPIE 3102, 5–13 (1997). [CrossRef]
27. C. Brenner, J. Boehm, and J. Guehring, “Photogrammetric calibration and accuracy evaluation of a cross-pattern stripe projector,” Proc. SPIE 3641, 164–172 (1998). [CrossRef]
28. T. Bothe, W. Osten, A. Gesierich, and W. P. O. Jueptner, “Compact 3D camera,” Proc. SPIE 4778, 48–59 (2002). [CrossRef]
29. J. Li, L. G. Hassebrook, and C. Guan, “Optimized two-frequency phase-measuring-profilometry light-sensor temporal-noise sensitivity,” J. Opt. Soc. Am. A 20(1), 106–115 (2003). [CrossRef] [PubMed]
30. R. Legarda-Sáenz, T. Bothe, and W. P. Jüptner, “Accurate procedure for the calibration of a structured light system,” Opt. Eng. 43(2), 464–471 (2004). [CrossRef]
31. S. Zhang and P. S. Huang, “Novel method for structured light system calibration,” Opt. Eng. 45(8), 083601 (2006). [CrossRef]
32. S. Zhang, D. Royer, and S.-T. Yau, “GPU-assisted high-resolution, real-time 3-D shape measurement,” Opt. Express 14(20), 9120–9129 (2006). [CrossRef] [PubMed]
33. P. Kuhmstedt, C. Munckelt, M. Heinze, C. Brauer-Burchardt, and G. Notni, “3D shape measurement with phase correlation based fringe projection,” Proc. SPIE 6616, 66160B (2007). [CrossRef]
34. Z. Li, Y. Shi, C. Wang, and Y. Wang, “Accurate calibration method for a structured light system,” Opt. Eng. 47(5), 053604 (2008). [CrossRef]
35. X. Chen, J. Xi, Y. Jin, and J. Sun, “Accurate calibration for a camera-projector measurement system based on structured light projection,” Opt. Lasers Eng. 47(3–4), 310–319 (2009). [CrossRef]
36. K. Liu, Y. Wang, D. L. Lau, Q. Hao, and L. G. Hassebrook, “Dual-frequency pattern scheme for high-speed 3-D shape measurement,” Opt. Express 18(5), 5229–5244 (2010). [CrossRef] [PubMed]
37. X. Peng, X. Liu, Y. Yin, and A. Li, “Optical measurement network for large-scale and shell-like objects,” Opt. Lett. 36(2), 157–159 (2011). [CrossRef] [PubMed]
38. Y. Yin, X. Peng, A. Li, X. Liu, and B. Z. Gao, “Calibration of fringe projection profilometry with bundle adjustment strategy,” Opt. Lett. 37(4), 542–544 (2012). [CrossRef] [PubMed]
39. X. Liu, X. Peng, H. Chen, D. He, and B. Z. Gao, “Strategy for automatic and complete three-dimensional optical digitization,” Opt. Lett. 37(15), 3126–3128 (2012). [CrossRef] [PubMed]
40. R. Chen, J. Xu, H. Chen, J. Su, Z. Zhang, and K. Chen, “Accurate calibration method for camera and projector in fringe patterns measurement system,” Appl. Opt. 55(16), 4293–4300 (2016). [CrossRef] [PubMed]