
High-efficiency 3D reconstruction with a uniaxial MEMS-based fringe projection profilometry

Open Access

Abstract

Micro-Electro-Mechanical System (MEMS) scanning is increasingly popular in 3D surface measurement owing to its compact structure and high frame rate. In this paper, we achieve real-time fringe-structured 3D reconstruction using a uniaxial MEMS-based projector. To overcome the limitations of the uniaxial MEMS-based projector, namely its lensless structure and unidirectional fringe projection, a novel isophase plane model is proposed in which the laser line from the MEMS-based projector is regarded as an isophase plane. Our model directly establishes the mapping relationship between phase and spatial 3D coordinates through the intersection of the camera back-projection ray and the isophase plane. Furthermore, a flexible calibration strategy for obtaining the 3D mapping coefficients is introduced with a specially designed planar target. Experiments demonstrate that our method can achieve high-accuracy and real-time 3D reconstruction.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Optical three-dimensional (3D) technology has been widely used for aerospace, biomedical, and many other industrial measurements [1–3]. Fringe projection profilometry (FPP) is a current research hotspot that benefits from high speed, high accuracy, and high resolution [4–11]. An FPP system typically adopts the projector-camera configuration: the projector, based on a digital micromirror device (DMD), casts sinusoidal fringe patterns, and the camera captures the distorted fringe images for phase computation and 3D reconstruction. Advances in image sensors and digital projection technology have further promoted its development in high-speed, real-time applications [12–14].

In order to achieve high-speed 3D surface measurement, two aspects have to be considered. The first is to increase the speed of the hardware, including the camera and projector, while the second is to improve the efficiency of phase extraction and 3D reconstruction. Many studies focus on DMD-based projector defocusing [15–17], which generates sinusoidal fringe patterns from binary ones so that the projection speed can be increased sharply. To improve efficiency, novel techniques aim to retrieve the phase with fewer fringe patterns [14,18–25] and to reduce the computational complexity of the 3D reconstruction [26]. In practical applications, these two approaches are combined to achieve high-accuracy and high-speed 3D reconstruction. Zhang et al. [18,27,28] applied Fourier transform profilometry [29] to high-speed 3D measurement, so that a single fringe pattern can recover the phase. Zhang et al. [30,31] made a series of contributions to projector defocusing technology. Wang et al. [19,32,33] employed bi-frequency and multi-frequency fringe codes to achieve dynamic 3D reconstruction. Wu et al. [14,25,34] concentrated on using Gray code to realize robust and efficient phase unwrapping. Zuo et al. [20,23,24,35] worked on boosting the efficiency of phase-shifting profilometry using stereo phase unwrapping. For a high-speed 3D surface measurement system, a high-speed fringe projector with sufficient depth of field is important. MEMS-based projection technology has developed rapidly owing to its small size, high frame rate, and lack of focusing optics [36,37]. In particular, a compact MEMS-based projector that uses a uniaxial mirror to scan the full field at thousands of frames per second (fps) has great application potential for dynamic 3D measurement.

For MEMS-based FPP system calibration, two main limitations, the lensless structure and the unidirectional fringe projection, must be addressed. In conventional DMD-based FPP system calibration, the DMD-based projector can be regarded as an "anti-camera" (i.e., an inverse camera) [38]. Orthogonal fringe patterns can therefore be projected to establish the correspondence between the DMD and the charge-coupled device (CCD) or complementary metal-oxide-semiconductor (CMOS) sensor of the camera, and the system is calibrated with the stereo-vision principle [26,38–44]. However, a MEMS-based projector with a uniaxial mirror cannot project orthogonal fringe patterns, nor can it simply be treated as an anti-camera because of its lensless structure. Calibration methods based on stereo-vision theory therefore do not apply to the MEMS-based FPP system. Instead, the other class of DMD-based FPP calibration methods, which build a mapping relationship between phase or pixel coordinates and 3D coordinates, should be studied for the FPP system with a uniaxial MEMS-based projector.

Early research mainly addressed DMD-based projectors and converted the height-modulated phase into height coordinates [45–52]. This approach has some practical restrictions; for example, the system structure is geometrically constrained and requires a translation stage or gauge blocks to obtain precise heights for calibration. To relax these restrictions and achieve flexible calibration, scholars have made various improvements [53–62]. Da et al. [53] analyzed the relationship between the fringe pattern on the projection plane of a DMD-based projector and the fringe in 3D space, and proposed a flexible mapping method in combination with the camera. Huang et al. [55] considered the lens distortion of the DMD-based projector and proposed cubic curve-fitting equations to describe the mapping from pixel coordinates and phase to 3D coordinates. Zhang [62] used unidirectional fringe patterns for a DMD-based projector to establish a polynomial mapping between phase and 3D coordinates. Although there are few reports on FPP systems with a uniaxial MEMS-based projector, some of the phase-to-3D-coordinate mapping methods [53,55,62] proposed for DMD-based projectors are also applicable; since the MEMS-based projector is lensless, these methods impose fewer restrictions on it. Yang et al. [61] proposed a seven-parameter curved light surface model (CLSM) to describe the light plane, in which calibration is completed by fitting the parameter equation of each light stripe with the CLSM. This method requires the camera and projector to be parallel to ensure that the model parameters can be solved. In addition, during 3D reconstruction the phase of the object needs to be interpolated to find the corresponding curved surface parameters, which reduces the computational efficiency.

In this paper, we propose a high-efficiency 3D reconstruction method for an FPP system with a uniaxial MEMS-based projector. Considering that the uniaxial MEMS-based projector does not use lenses for imaging and can only project single-direction fringe patterns, we regard each line in a fringe generated by the linear laser as an isophase plane. The intersection of the camera back-projection ray and the isophase plane directly establishes the mapping relationship between phase and spatial coordinates through rigorous mathematical derivation. In addition, a flexible calibration method with no mechanical movement is proposed to obtain the mapping coefficients simply by placing a specially designed planar target arbitrarily at different positions in the measurement space. With these calibrated coefficients, 3D coordinates can be obtained by low-complexity polynomial calculation. In our experiments, a MEMS-based FPP system with a high-speed camera is constructed to realize high-accuracy and real-time 3D reconstruction.

2. Principle

2.1 Camera imaging model

For the camera in the MEMS-based FPP system, the conventional pinhole model is adopted to describe the imaging process. The process is a perspective projection and can be mathematically described as

$$\begin{cases} \mathbf{X_c}=\mathbf{R}\mathbf{X_w}+\mathbf{T} \\ \lambda\mathbf{\tilde{x}_c}=[\mathbf{I}|\mathbf{0}]\mathbf{\tilde{X}_c} \\ \mathbf{\tilde{m}_c}=\mathbf{K_c}\mathbf{\tilde{x}_c} \end{cases} \quad,\;\mathbf{K_c}= \begin{bmatrix} f_x & \alpha & C_x\\ 0 & f_y & C_y\\ 0 & 0 & 1 \end{bmatrix}$$
where $\tilde {\cdot }$ denotes a homogeneous coordinate; $\mathbf {X_w}=(X_w,Y_w,Z_w)^{\mathrm{T}}$ is the world coordinates of an object point; $\mathbf {X_c}=(X_c,Y_c,Z_c)^{\mathrm {T}}$ is the camera coordinates of the same point; $\mathbf {R}$ is a rotation matrix and $\mathbf {T}$ is a translation vector, representing the transformation from the world coordinate system (WCS) to the camera coordinate system (CCS); $\mathbf {x_c}=(x_c,y_c)^{\mathrm {T}}=(X_c/Z_c,Y_c/Z_c)^{\mathrm {T}}$ is the projection of $\mathbf {X_c}$ on the image plane (the image point is denoted $\mathbf{m_c}=(u_c,v_c)^{\mathrm {T}}$ in the image coordinate system); $\lambda$ is a scale factor; $\mathbf {K_c}$ contains the intrinsic parameters of the camera, including the focal lengths $(f_x,f_y)^{\mathrm {T}}$, the principal point $(C_x,C_y)^{\mathrm {T}}$ and the skew factor of the image axes $\alpha$.

Due to lens distortion, the actual image deviates from the ideal image, so the distortion needs to be taken into account. It can be modeled as $\mathbf {x_c}'=\mathbf {x_c}+\Delta (\mathbf {x_c})$, with

$$\Delta(\mathbf{x_c})= \begin{cases} x_c(k_1r^2+k_2r^4+k_3r^6)+[2p_1x_cy_c+p_2(r^2+2x_c^2)](1+p_3r^2+\cdots) \\ y_c(k_1r^2+k_2r^4+k_3r^6)+[p_1(r^2+2y_c^2)+2p_2x_cy_c](1+p_3r^2+\cdots)\\ \end{cases}$$
where $\mathbf {x_c}'=(x_c',y_c')^{\mathrm {T}}$ denotes a distorted image point, $\Delta (\mathbf {x_c})$ is the distortion term, $r=\sqrt {x_c^2+y_c^2}$ is the distance from the undistorted image point to the principal point, and $(k_1,k_2,k_3,\ldots )$ and $(p_1,p_2,p_3,\ldots )$ are the radial and decentering distortion parameters, respectively. Three radial and two decentering distortion coefficients usually meet the accuracy requirements, so the lens distortion can be expressed by $\mathbf {k_c}=(k_1,k_2,p_1,p_2,k_3)^{\mathrm {T}}$.
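For illustration, the following minimal sketch (ours, not the authors' code; all names are assumptions and the higher-order factor $(1+p_3r^2+\cdots)$ is dropped) projects a world point through the pinhole model of Eq. (1) and the distortion model of Eq. (2):

```python
import numpy as np

def project_point(Xw, R, T, Kc, kc):
    """Project a world point Xw (3,) to distorted pixel coordinates (u, v)."""
    k1, k2, p1, p2, k3 = kc
    Xc = R @ Xw + T                            # WCS -> CCS, Eq. (1)
    xc, yc = Xc[0] / Xc[2], Xc[1] / Xc[2]      # perspective division
    r2 = xc**2 + yc**2
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    xd = xc * radial + 2 * p1 * xc * yc + p2 * (r2 + 2 * xc**2)   # Eq. (2)
    yd = yc * radial + p1 * (r2 + 2 * yc**2) + 2 * p2 * xc * yc
    u, v, _ = Kc @ np.array([xd, yd, 1.0])     # apply intrinsics K_c
    return u, v
```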

2.2 MEMS-based phase to 3D mapping

An FPP system with a uniaxial MEMS-based projector is shown in Fig. 1. In general, a uniaxial MEMS-based projector consists of a laser diode, a Powell lens and a MEMS mirror. The control system, composed of an FPGA (Field Programmable Gate Array) and an ARM (Advanced RISC Machine) chip, sends sinusoidal drive signals that modulate the laser diode so that its emitted intensity varies sinusoidally. The Powell lens spreads the beam into a laser line, which is incident perpendicularly on the center of the MEMS mirror. Simultaneously, the control system sends square-wave delay signals to synchronously drive the MEMS mirror to scan at a set frequency, so that the incident laser line is swept into a sinusoidal fringe pattern in space.

Fig. 1. An FPP system with a uniaxial MEMS-based projector.

In this paper, based on the uniqueness of the uniaxial MEMS-based projector, a novel isophase plane model is proposed. Treating each line in a fringe generated by the linear laser as an isophase plane, we calculate a 3D point from the intersection of the camera back-projection ray and the isophase plane and derive the MEMS-based phase-to-3D mapping.

As shown in Fig. 2, we take the center of the MEMS mirror as the origin and the scanning direction of the MEMS mirror as the $X_p$ axis to establish the projector coordinate system (PCS). The coded structured light is generated by the uniaxial MEMS-based projector, and each light stripe can be regarded as an isophase plane, i.e., a phase determines an isophase plane. Obviously, each isophase plane passes through the $Y_p$ axis.

Fig. 2. Schematic diagram of MEMS-based phase to 3D mapping.

A 3D point $\mathbf {X}$, denoted $\mathbf {X_p}$ in the PCS, lies on the ray $\mathbf {O_{p}X_{p}}$, which in turn lies on the isophase plane $I_s$. Assuming a virtual plane $I_p$ at $Z_p=1$, the ray $\mathbf {O_{p}X_{p}}$ intersects $I_p$ at the point $\mathbf {x_p}=(x_p,y_p,1)^{\mathrm {T}}$. The phase $\varphi _p$ is proportional to the isophase plane position $x_p$. Moreover, the phase of a point on the isophase plane is equal to the phase at the corresponding point on the undistorted image plane of the camera, i.e., $\varphi _c=\varphi _p$. Therefore, there is a linear mapping from $\varphi _c$ to $x_p$ such that

$$x_p=\frac{\varphi_c}{2\pi}T_w$$
where $T_w$ represents stripe width.

For the 3D point $\mathbf {X}$, denoted $\mathbf {X_c}$ in the CCS, its ideal image point on the image plane is $\mathbf {m_c}$ and its actual (distorted) image point is $\mathbf {m^{'}_c}$; they correspond to $\mathbf {x_c}$ and $\mathbf {x^{'}_c}$, respectively, on the normalized plane $I_n$. Due to lens distortion, correcting the distorted point $\mathbf {x^{'}_c}$ to the ideal point $\mathbf {x_c}$ can be described by a polynomial mapping. Thus, the undistorted phase $\varphi _c$ can be expressed as a polynomial function of the distorted phase $\phi _c$ such that

$$\varphi_c=\sum_{n=0}^NA_n\phi_c^n$$
where $A_n$ is a mapping coefficient and $N$ is the polynomial order. In practice, we found that a 3rd-degree polynomial describes the mapping well.

The camera back-projection ray $\mathbf {O_{c}X_{c}}$ can be modeled as

$$s\begin{bmatrix} x_c\\ y_c\\ 1 \end{bmatrix}= \begin{bmatrix} X_c\\ Y_c\\ Z_c \end{bmatrix}$$

The plane equation can be formulated as

$$AX+BY+CZ+D=0$$

For the plane $I_s$ passing through 3D point $\mathbf {X_p}$, according to Eq. (6), the coefficients $(A,B,C,D)^{\mathrm {T}}$ can be described as $\mathbf {\Pi _p}=(1,0,-x_p,0)^{\mathrm {T}}$ in the PCS. According to the structural parameters between the camera and the projector, the plane $I_s$ in the CCS, namely $\mathbf {\Pi _c}$, can be represented as

$$\mathbf{\Pi_c}=\begin{bmatrix} \mathbf{R_s} & \mathbf{T_s}\\ \mathbf{0}^{\mathrm{T}} & 1\\ \end{bmatrix}^{\mathrm{-T}}\mathbf{\Pi_p}= \begin{bmatrix} r_{11}-r_{13}x_p\\ r_{21}-r_{23}x_p\\ r_{31}-r_{33}x_p\\ -(t_1r_{11}+t_2r_{21}+t_3r_{31})+x_p(t_1r_{13}+t_2r_{23}+t_3r_{33})\\ \end{bmatrix}$$
where $\mathbf {R_s}$ and $\mathbf {T_s}$ are the rotation matrix and translation vector, respectively, describing the rigid-body transformation between the CCS and the PCS; $r_{ij}$ and $t_i$ are the elements of $\mathbf {R_s}$ and $\mathbf {T_s}$, respectively.
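As a minimal sketch of Eq. (7) (assuming, as in the equation, that $\mathbf{R_s}$ and $\mathbf{T_s}$ map projector-frame points into the camera frame; variable names are illustrative), a plane transforms with the inverse-transpose of the homogeneous rigid-body matrix:

```python
import numpy as np

def plane_pcs_to_ccs(plane_p, Rs, Ts):
    """Transform plane coefficients (A, B, C, D) from the PCS to the CCS."""
    H = np.eye(4)
    H[:3, :3] = Rs                           # point transform: X_c = Rs @ X_p + Ts
    H[:3, 3] = Ts
    return np.linalg.inv(H).T @ plane_p      # plane transform: Pi_c = H^{-T} Pi_p

# e.g. the isophase plane Pi_p = (1, 0, -x_p, 0) described above:
# Pi_c = plane_pcs_to_ccs(np.array([1.0, 0.0, -x_p, 0.0]), Rs, Ts)
```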

By combining Eqs. (5) and (7), 3D coordinates of $\mathbf {X_c}$ can be obtained exactly such that

$$\begin{cases} X_c=\frac{(t_1r_{13}+t_2r_{23}+t_3r_{33})x_cx_p-(t_1r_{11}+t_2r_{21}+t_3r_{31})x_c}{(r_{13}x_c+r_{23}y_c+r_{33})x_p-(r_{11}x_c+r_{21}y_c+r_{31})} \\ Y_c=\frac{(t_1r_{13}+t_2r_{23}+t_3r_{33})y_cx_p-(t_1r_{11}+t_2r_{21}+t_3r_{31})y_c}{(r_{13}x_c+r_{23}y_c+r_{33})x_p-(r_{11}x_c+r_{21}y_c+r_{31})} \\ Z_c=\frac{(t_1r_{13}+t_2r_{23}+t_3r_{33})x_p-(t_1r_{11}+t_2r_{21}+t_3r_{31})}{(r_{13}x_c+r_{23}y_c+r_{33})x_p-(r_{11}x_c+r_{21}y_c+r_{31})}\\ \end{cases}$$

For a fixed system and a specific image point, the parameters $\mathbf {R_s}$ and $\mathbf {T_s}$ and the image coordinates $x_c$ and $y_c$ are determined. Consequently, the spatial coordinates $X_c$, $Y_c$ and $Z_c$ are functions of $x_p$. Eq. (8) can be rewritten as

$$X_c=\frac{1}{a_0x_p+a_1}+a_X,Y_c=\frac{1}{b_0x_p+b_1}+b_Y,Z_c=\frac{1}{c_0x_p+c_1}+c_Z$$
where $\{a_0,a_1,a_X;b_0,b_1,b_Y;c_0,c_1,c_Z\}$ are linear mapping coefficients.

From Eqs. (3), (4) and (9), it can be derived that there is a reciprocal polynomial mapping between $\mathbf {X_c}$ and $\phi _c$, and it can be expressed as

$$X_c=\frac{1}{\sum\limits_{n=0}^Na_n\phi_c^n}+a_X,Y_c=\frac{1}{\sum\limits_{n=0}^Nb_n\phi_c^n}+b_Y,Z_c=\frac{1}{\sum\limits_{n=0}^Nc_n\phi_c^n}+c_Z$$
where $\{a_n,a_X;b_n,b_Y;c_n,c_Z\}$ are mapping coefficients that can be solved by the least-squares method. Therefore, the 3D coordinates are related only to the phase. As long as the mapping coefficients are calibrated in advance, 3D reconstruction can be realized quickly.
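The paper only states that the coefficients of Eq. (10) are solved by least squares; the sketch below is one possible realization (an assumption on our part, using SciPy nonlinear least squares and illustrative names) that fits the reciprocal polynomial for one coordinate at one pixel.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_reciprocal_poly(phi, coord, order=3):
    """Fit coord = 1 / polyval(coeffs, phi) + offset at one pixel (Eq. (10))."""
    def residuals(p):
        coeffs, offset = p[:-1], p[-1]
        return 1.0 / np.polyval(coeffs, phi) + offset - coord

    p0 = np.zeros(order + 2)                  # [c_N, ..., c_0, offset]
    p0[-2] = 1.0 / (np.ptp(coord) + 1e-9)     # crude initial guess
    p0[-1] = coord.min()
    return least_squares(residuals, p0).x

# phi: absolute phases at this pixel over the target poses
# z:   corresponding Z_c of the ray/plane intersections
# coeffs_z = fit_reciprocal_poly(phi, z)      # c_3, ..., c_0 and offset c_Z
```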

3. System calibration and 3D reconstruction

As shown in Fig. 3, planar targets are placed arbitrarily at different positions $\mathbf {\Pi _i}(i=1,2,\ldots,n)$, ensuring that the target positions cover the measurement space. The ray $l_i$ intersects the planar target $\mathbf {\Pi _i}$ at the 3D point $\mathbf {X_i}(i=1,2,\ldots,n)$, which also lies on the light plane formed by the $Y_p$ axis and the points $\mathbf {X_i}$. Therefore, the 3D points and the corresponding phases can be obtained simultaneously without calibrating the structural parameters between the projector and the camera.

Fig. 3. Calibration of FPP system with a uniaxial MEMS-based projector.

The parameters of each planar target in the CCS can be expressed as

$$\begin{cases} A=r_{13},B=r_{23},C=r_{33} \\ D={-}(r_{13}t_1+r_{23}t_2+r_{33}t_3) \end{cases}$$
where $r_{ij}$ and $t_i$ are elements of the extrinsic parameters between the planar target and camera.

The ray $l_i$ passing through the camera's optical center $\mathbf {O_c}$ and the point $\mathbf {x_i}$ can be represented by Eq. (5), so the 3D points $\mathbf {X_i}$ can be calculated from Eqs. (5) and (11). The absolute phase distributions of the different planar targets are obtained by solving a series of images with fringe patterns. Thus, the 3D coordinates $\mathbf {X_i}$ and the phases $\phi _i$ are obtained. Using Eq. (10), the mapping coefficients between $\mathbf {X_i}$ and $\phi _i$ can be calculated. Finally, the mapping coefficient look-up table (LUT) is established and the mapping coefficient calibration is completed.
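A minimal sketch of the intersection step (assuming the target plane is given as $AX+BY+CZ+D=0$ in camera coordinates): each undistorted image point defines the back-projection ray of Eq. (5), and the scale $s$ follows from substituting the ray into the plane equation.

```python
import numpy as np

def ray_plane_intersection(xc, yc, plane):
    """Intersect the back-projection ray s*(xc, yc, 1) with plane (A, B, C, D)."""
    A, B, C, D = plane
    s = -D / (A * xc + B * yc + C)            # from A*s*xc + B*s*yc + C*s + D = 0
    return np.array([s * xc, s * yc, s])      # 3D point X_i in the CCS
```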

3D reconstruction can then be performed very efficiently. First, single-direction fringe patterns are projected onto the surface of the measured object and the absolute phase distribution of the object is calculated; then the mapping coefficients are read from the LUT at each pixel position for the phase corresponding to the object; finally, the 3D coordinates of the object are calculated by Eq. (10). The overall flow chart of the calibration strategy and reconstruction method is shown in Fig. 4.
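A minimal sketch of this per-pixel step (array names are illustrative assumptions): with the coefficient LUTs calibrated in advance, Eq. (10) reduces to a few polynomial evaluations per pixel of the absolute phase map.

```python
import numpy as np

def reconstruct(phase, lut_a, lut_b, lut_c, off_x, off_y, off_z):
    """phase: HxW absolute phase map; lut_*: HxWx(N+1) coefficients (highest
    order first); off_*: HxW offset maps a_X, b_Y, c_Z. Returns X, Y, Z maps."""
    N = lut_a.shape[-1] - 1
    powers = phase[..., None] ** np.arange(N, -1, -1)   # phi^N, ..., phi^0
    X = 1.0 / np.sum(lut_a * powers, axis=-1) + off_x
    Y = 1.0 / np.sum(lut_b * powers, axis=-1) + off_y
    Z = 1.0 / np.sum(lut_c * powers, axis=-1) + off_z
    return X, Y, Z
```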

Fig. 4. The overall flow chart.

4. Experiments and analysis

4.1 Calibration results

In order to verify the effectiveness and practicability of the proposed method, we developed an FPP system with a uniaxial MEMS-based projector, consisting of an industrial camera (DEHENG MER-130-30UM) with a 12 mm lens and a uniaxial MEMS-based projector (Ainstec BN6MF5851). The camera has a resolution of $1280\times 1024$ pixels. The projector consists of a laser diode with a wavelength of 854 nm, a Powell lens and a MEMS mirror with a resonant frequency of 4000 Hz, which scans over an angle of 55 degrees.

We used a planar target with black circular markers on a white background as the calibration reference, shown in Fig. 5. This target design not only helps us determine the rigid-body transformation between the CCS and the WCS, but also allows us to obtain high-quality phases. In our experiment, to cover the measuring volume of $300\,mm\times 400\,mm\times 150\,mm$, the planar target was placed at 12 positions. The target at each position was used for both camera calibration [63] and mapping coefficient calibration. Single-direction fringe patterns were projected onto the planar target. Camera calibration was completed from the planar target images, and the absolute phase distributions of the planar target were calculated from a series of images with fringe patterns. The rigid-body transformation between the WCS and the CCS was obtained by camera calibration, and the plane parametric equation of each planar target in the camera coordinate system was calculated. We then calculated the intersection points of the light ray passing through each pixel with the 12 planar targets, and set a threshold to determine whether the number of intersection points was sufficient to solve the mapping coefficients. Finally, the mapping coefficients were calculated by Eq. (10) (i.e., a reciprocal polynomial) and the LUT was established. For comparison, we also calibrated the mapping coefficients using Zhang's method [62] (i.e., a polynomial).

Fig. 5. Schematic diagram of planar target.

The calibrated parameters are listed in Table 1. To estimate the polynomial fitting error, the error of a point in a given direction is defined as the deviation of the 3D coordinate calculated by the polynomial from the intersection point obtained from the back-projection ray and the planar target at different positions. Figures 6(a) and 6(b) show the Z error distributions for different orders of the reciprocal polynomial and the polynomial, respectively. The mean errors for different orders of the two models are listed in Table 2. From Fig. 6 and Table 2, the fitting results of the two methods are similar when the fitting order is 3, 4, or 5. For smaller or larger orders, the fitting error of the polynomial jumps considerably, while the jump of the reciprocal polynomial is relatively small. This indicates that our model is more suitable for the FPP system with a MEMS-based projector. Both methods can use the 3rd order to obtain high accuracy with few coefficients, and in scenes that require less precision, a 2nd-order reciprocal polynomial can still achieve 3D reconstruction with our method. In addition, the planar target is moved along the Z direction when calibrating the mapping coefficients; therefore, the intersection points of the camera back-projection ray with the planar target at different positions vary more in the Z direction than in the X or Y directions, which makes the Z error larger than the X or Y errors in Table 2. The 3rd-order reciprocal polynomial fitting curves for pixel (582,844) are shown in Fig. 7.

Fig. 6. The error distribution of different orders. (a) The error distribution of reciprocal polynomial; (b) The error distribution of polynomial.

Fig. 7. The 3rd-degree reciprocal polynomial fitting result at pixel (582,844). (a) Reciprocal polynomial mapping curve of the X coordinate; (b) reciprocal polynomial mapping curve of the Y coordinate; (c) reciprocal polynomial mapping curve of the Z coordinate.

Table 1. Parameters of camera calibration

Table 2. Mean errors of different orders

4.2 Reconstruction

After calibration, 3D reconstruction can be performed efficiently and with high precision. To evaluate the performance of our method, we reconstructed a standard plane. The point cloud of the reconstructed plane is used to fit an ideal plane of the form of Eq. (6), whose coefficients $\{A,B,C,D\}$ are calculated by the least-squares method. The distance from each point to the ideal plane, Eq. (12), is taken as the error of that point.

$$dis=\frac{|AX+BY+CZ+D|}{\sqrt{A^2+B^2+C^2}}$$

Using Eqs. (12) and (13), the standard deviation of the reconstructed point cloud from the ideal plane can be computed.

$$\sigma=\sqrt{\frac{\sum\limits_{i=1}^n dis_i^2}{n}}$$
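A minimal sketch of this evaluation (using an SVD-based least-squares plane fit; the paper does not specify the exact fitting routine, so this is one possible realization):

```python
import numpy as np

def plane_fit_std(points):
    """points: Nx3 reconstructed point cloud; returns the STD of Eq. (13)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    n = vt[-1]                                # unit normal (A, B, C)
    D = -n @ centroid
    dis = np.abs(points @ n + D)              # Eq. (12); the denominator is 1
    return np.sqrt(np.mean(dis**2))           # Eq. (13)
```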

The reconstructed standard plane and the error distribution are shown in Fig. 8. As shown in Fig. 8(d), the standard deviation (STD) of the reconstructed plane is 0.063 mm.

Fig. 8. The result of reconstructed standard plane and error distribution. (a) The measured standard plane; (b) The measured standard plane with stripe; (c) The reconstructed plane with our method; (d) The error distribution with our method, STD: 0.063 mm.

To further evaluate the accuracy of our method, we reconstructed a set of standard spheres. The point cloud of each reconstructed sphere is used to fit an ideal spherical surface by Eq. (14).

$$(x-x_0)^2+(y-y_0)^2+(z-z_0)^2=R^2$$
where $\{x_0,y_0,z_0\}$ and $R$ are the center and radius of the fitted sphere, respectively, which can be calculated by the least-squares method. The accuracy can be evaluated by the deviations of the distances between the fitted sphere centers. A sketch of this fit is given below.
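Expanding Eq. (14) gives $x^2+y^2+z^2 = 2x_0x+2y_0y+2z_0z+(R^2-x_0^2-y_0^2-z_0^2)$, which is linear in the center and a constant, so the fit can be done by linear least squares. A minimal sketch (our formulation, not necessarily the authors' exact routine):

```python
import numpy as np

def fit_sphere(points):
    """points: Nx3 point cloud of one sphere; returns (center, radius)."""
    A = np.hstack([2 * points, np.ones((len(points), 1))])
    b = np.sum(points**2, axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, c = sol[:3], sol[3]
    return center, np.sqrt(c + center @ center)
```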

The reconstructed standard spheres and the error distribution are shown in Fig. 9. Ten spherical surfaces are fitted, and every pair of them forms a combination, so a total of 45 center-to-center distances are calculated. Figure 9(d) shows the error distribution of all combinations. The maximum and mean deviations of the fitted center distances are 0.1150 mm and 0.0426 mm, respectively. From Fig. 9(d), we can observe that the deviations of the center distances reconstructed by our method are essentially below 0.1 mm. We also calculated the fitting deviation of the standard sphere in the red box in Fig. 9(a); the error distribution of this fitted spherical surface is shown in Fig. 9(e), with an STD of 0.0413 mm.

Fig. 9. The result of reconstructed standard spheres and error distribution. (a) The measured standard spheres; (b) The measured standard spheres with stripe; (c) The reconstructed spheres with our method; (d) The error distribution; (e) The error distribution of a fitting spherical surface, STD: 0.0413 mm.

Finally, we reconstructed a complex 3D-printed model, as shown in Fig. 10. The reconstruction result using our method is shown in Fig. 10(c); details of the object, such as the beard and hair, are well restored.

Fig. 10. The result of reconstructed model. (a) The measured 3D-printed model; (b) The measured 3D-printed model with stripe; (c) The reconstructed model.

The above experiments illustrate that the proposed method can obtain high-accuracy 3D models. To verify the performance of our method for high-speed applications, we set up a high-speed FPP system consisting of a Phantom camera (VEO E-310L) with a 24 mm lens and a uniaxial MEMS-based projector (Ainstec BN6MF5851), as shown in Fig. 11. The image resolution of the camera is $1024\times 768$ pixels. The theoretical maximum refresh rate of the projector is more than 4000 fps. In this work, to ensure the stability of the measurement, the scanning rate of the projector is limited to 1000 fps and the camera's acquisition speed is 1000 fps. The computer is equipped with 72 CPU cores (Intel Xeon Gold 5220 @ 2.20 GHz) and a graphics card (NVIDIA TITAN RTX) with 4608 CUDA cores. We adopted a four-step phase-shifting technique with a fringe period of 64 pixels and a 7-bit Gray-code strategy [64] to determine the unwrapped phase, which means that 11 patterns are required to recover the 3D shape. With parallel processing on the 72 CPU cores, the phase solution time and 3D reconstruction time are 10 ms and 1 ms, respectively; with GPU acceleration, they are 3 ms and 1 ms, respectively. The final achieved reconstruction rate is about 90 fps (i.e., real time).
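As a minimal sketch of the wrapped-phase step of the four-step phase-shifting algorithm (assuming fringe images $I_k$ with phase shifts of $k\pi/2$; the Gray-code unwrapping of [64] is omitted here):

```python
import numpy as np

def wrapped_phase(I0, I1, I2, I3):
    """Four fringe images (float arrays) with phase shifts 0, pi/2, pi, 3pi/2."""
    return np.arctan2(I3 - I1, I0 - I2)       # wrapped phase in (-pi, pi]
```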

Fig. 11. The high-speed FPP system.

We measured two dynamic scenes, including a dynamic scene of collapsing wooden blocks (Visualization 1) and a dynamic scene of a swinging ping-pong ball (Visualization 2), as shown in Fig. 12 and Fig. 13, respectively. Figure 12(a) shows the scene of the collapsing wooden blocks at certain moments while Fig. 12(b) shows the corresponding 3D scenes. Figure 13(a) and Fig. 13(b) show the scene of the swinging ping-pong ball at certain moments and the corresponding 3D scenes, respectively.

Fig. 12. Measurement of collapsing wooden blocks. (a) The collapsing wooden blocks; (b) Corresponding 3D scenes.

Fig. 13. Measurement of a swinging ping-pong ball. (a) The swinging ping-pong ball; (b) Corresponding 3D scenes.

5. Conclusions

In this paper, we proposed a high-efficiency 3D reconstruction method for an FPP system with a uniaxial MEMS-based projector, which is lensless and can only project single-direction fringe patterns. Treating each line in a fringe produced by the linear laser as an isophase plane, we mathematically derived that the intersection of the camera back-projection ray and the isophase plane directly establishes the mapping relationship between phase and spatial 3D coordinates. In addition, a flexible calibration method was introduced and a novel planar target was designed accordingly; the mapping coefficients can be easily calibrated by placing the planar target arbitrarily at different positions in the measurement space. Our method takes full advantage of the high frame rate of the MEMS projector and successfully realizes real-time, accurate 3D reconstruction. Experiments verified the effectiveness of our method.

Funding

National Natural Science Foundation of China (61875137); The Fundamental Research Project of Shenzhen Municipality (JCYJ20190808153201654); The Sino-German Cooperation Group (GZ 1391); The key Laboratory of Intelligent Optical Metrology and Sensing of Shenzhen (ZDSYS20200107103001793).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. F. Chen, G. M. Brown, and M. Song, “Overview of three-dimensional shape measurement using optical methods,” Opt. Eng. 39, 1 (1999). [CrossRef]  

2. S. S. Gorthi and P. Rastogi, “Fringe projection techniques: Whither we are?” Opt. Lasers Eng. 48(2), 133–140 (2010). [CrossRef]  

3. A. G. Marrugo, F. Gao, and S. Zhang, “State-of-the-art active optical techniques for three-dimensional surface metrology: a review [invited],” J. Opt. Soc. Am. A 37(9), B60–B77 (2020). [CrossRef]  

4. S. Zhang, “Recent progresses on real-time 3d shape measurement using digital fringe projection techniques,” Opt. Lasers Eng. 48(2), 149–158 (2010). [CrossRef]  

5. Z. Cai, X. Liu, X. Peng, Y. Yin, A. Li, J. Wu, and B. Z. Gao, “Structured light field 3d imaging,” Opt. Express 24(18), 20324–20334 (2016). [CrossRef]  

6. C. Zuo, S. Feng, L. Huang, T. Tao, W. Yin, and Q. Chen, “Phase shifting algorithms for fringe projection profilometry: A review,” Opt. Lasers Eng. 109, 23–59 (2018). [CrossRef]  

7. Z. Cai, X. Liu, X. Peng, and B. Z. Gao, “Ray calibration and phase mapping for structured-light-field 3d reconstruction,” Opt. Express 26(6), 7598 (2018). [CrossRef]  

8. L. Zhang, Q. Chen, C. Zuo, and S. Feng, “Real-time high dynamic range 3d measurement using fringe projection,” Opt. Express 28(17), 24363 (2020). [CrossRef]  

9. J. Han, H. Yu, D. Zheng, J. Fu, and C. Zuo, “Deep learning-based fringe modulation-enhancing method for accurate fringe projection profilometry,” Opt. Express 28(15), 21692–21703 (2020). [CrossRef]  

10. Z. Wang, Y. Yang, X. Liu, Y. Miao, Q. Hou, Y. Yin, Z. Cai, Q. Tang, and X. Peng, “Light-field-assisted phase unwrapping of fringe projection profilometry,” IEEE Access 9, 49890–49900 (2021). [CrossRef]  

11. Y. Yang, Q. Hou, Y. Li, Z. Cai, X. Liu, J. Xi, and X. Peng, “Phase error compensation based on tree-net using deep learning,” Opt. Lasers Eng. 143, 106628 (2021). [CrossRef]  

12. S. Heist, P. Lutzke, I. Schmidt, P. Dietrich, P. Kühmstedt, A. Tünnermann, and G. Notni, “High-speed three-dimensional shape measurement using gobo projection,” Opt. Lasers Eng. 87, 90–96 (2016). [CrossRef]  

13. S. Zhang, “High-speed 3d shape measurement with structured light methods: A review,” Opt. Lasers Eng. 106, 119–131 (2018). [CrossRef]  

14. Z. Wu, W. Guo, Y. Li, Y. Liu, and Q. Zhang, “High-speed and high-efficiency three-dimensional shape measurement based on gray-coded light,” Photonics Research (2020).

15. S. Lei and S. Zhang, “Flexible 3-d shape measurement using projector defocusing,” Opt. Lett. 34(20), 3080–3082 (2009). [CrossRef]  

16. S. Zhang, “Flexible 3d shape measurement using projector defocusing: extended measurement range,” Opt. Lett. 35(7), 934–936 (2010). [CrossRef]  

17. Y. Wang and S. Zhang, “Three-dimensional shape measurement with binary dithered patterns,” Appl. Opt. 51(27), 6631–6636 (2012). [CrossRef]  

18. Q. Zhang and X. Su, “High-speed optical measurement for the drumhead vibration,” Opt. Express 13(8), 3110–3116 (2005). [CrossRef]  

19. L. Kai, Y. Wang, D. L. Lau, Q. Hao, and L. G. Hassebrook, “Dual-frequency pattern scheme for high-speed 3-d shape measurement,” Opt. Express 18(5), 5229–5244 (2010). [CrossRef]  

20. C. Zuo, Q. Chen, G. Gu, S. Feng, and F. Feng, “High-speed three-dimensional profilometry for multiple objects with complex shapes,” Opt. Express 20(17), 19493–19510 (2012). [CrossRef]  

21. C. Zuo, Q. Chen, G. Gu, S. Feng, F. Feng, R. Li, and G. Shen, “High-speed three-dimensional shape measurement for dynamic scenes using bi-frequency tripolar pulse-width-modulation fringe projection,” Opt. Lasers Eng. 51(8), 953–960 (2013). [CrossRef]  

22. K. Song, S. Hu, X. Wen, and Y. Yan, “Fast 3d shape measurement using fourier transform profilometry without phase unwrapping,” Opt. Lasers Eng. 84, 74–81 (2016). [CrossRef]  

23. T. Tao, Q. Chen, J. Da, S. Feng, Y. Hu, and C. Zuo, “Real-time 3-d shape measurement with composite phase-shifting fringes and multi-view system,” Opt. Express 24(18), 20253–20269 (2016). [CrossRef]  

24. T. Tao, Q. Chen, S. Feng, J. Qian, Y. Hu, L. Huang, and C. Zuo, “High-speed real-time 3d shape measurement based on adaptive depth constraint,” Opt. Express 26(17), 22440 (2018). [CrossRef]  

25. Z. Wu, C. Zuo, W. Guo, T. Tao, and Q. Zhang, “High-speed three-dimensional shape measurement based on cyclic complementary gray-code light,” Opt. Express 27(2), 1283 (2019). [CrossRef]  

26. Z. Cai, X. Liu, A. Li, Q. Tang, X. Peng, and B. Z. Gao, “Phase-3d mapping method developed from back-projection stereovision model for fringe projection profilometry,” Opt. Express 25(2), 1262–1277 (2017). [CrossRef]  

27. Q. C. Zhang and X. Y. Su, “An optical measurement of vortex shape at a free surface,” Opt. Laser Technol. 34(2), 107–113 (2002). [CrossRef]  

28. Q. Zhang, X. Su, Y. Cao, Y. Li, L. Xiang, and W. Chen, “Optical 3-d shape and deformation measurement of rotating blades using stroboscopic structured illumination,” Opt. Eng. 44(11), 113601 (2005). [CrossRef]  

29. M. Takeda and K. Mutoh, “Fourier transform profilometry for the automatic measurement of 3-d object shapes,” Appl. Opt. 22(24), 3977 (1983). [CrossRef]  

30. W. Lohry and S. Zhang, “High-speed absolute three-dimensional shape measurement using three binary dithered patterns,” Opt. Express 22(22), 26752–26762 (2014). [CrossRef]  

31. J.-S. Hyun, G. T.-C. Chiu, and S. Zhang, “High-speed and high-accuracy 3d surface measurement using a mechanical projector,” Opt. Express 26(2), 1474–1487 (2018). [CrossRef]  

32. Y. Wang and S. Zhang, “Superfast multifrequency phase-shifting technique with optimal pulse width modulation,” Opt. Express 19(6), 5149–5155 (2011). [CrossRef]  

33. Y. Wang, J. I. Laughner, I. R. Efimov, and S. Zhang, “3d absolute shape measurement of live rabbit hearts with a superfast two-frequency phase-shifting technique,” Opt. Express 21(5), 5822–5832 (2013). [CrossRef]  

34. Z. Wu, W. Guo, and Q. Zhang, “High-speed three-dimensional shape measurement based on shifting gray-code light,” Opt. Express 27(16), 22631–22644 (2019). [CrossRef]  

35. T. Tao, Q. Chen, S. Feng, Y. Hu, M. Zhang, and C. Zuo, “High-precision real-time 3d shape measurement based on a quad-camera system,” J. Opt. 20(1), 014009 (2018). [CrossRef]  

36. H. Urey, S. T. S. Holmström, and U. Baran, "Mems laser scanners: a review," J. Microelectromech. Syst. 23(2), 259–275 (2014). [CrossRef]

37. G. Hu, X. Zhou, G. Zhang, C. Zhang, D. Li, and G. Wang, “Multiple laser stripe scanning profilometry based on microelectromechanical systems scanning mirror projection,” Micromachines 10(1), 57 (2019). [CrossRef]  

38. S. Zhang and P. Huang, “Novel method for structured light system calibration,” Opt. Eng. 45(8), 083601 (2006). [CrossRef]  

39. R. Legarda-Sáenz, "Accurate procedure for the calibration of a structured light system," Opt. Eng. 43(2), 464 (2004). [CrossRef]

40. X. Chen, J. Xi, J. Ye, and S. Jin, “Accurate calibration for a camera–projector measurement system based on structured light projection,” Opt. Lasers Eng. 47(3-4), 310–319 (2009). [CrossRef]  

41. Y. Yin, X. Peng, A. Li, X. Liu, and B. Z. Gao, “Calibration of fringe projection profilometry with bundle adjustment strategy,” Opt. Lett. 37(4), 542–544 (2012). [CrossRef]  

42. Z. Huang, J. Xi, Y. Yu, and Q. Guo, “Accurate projector calibration based on a new point-to-point mapping relationship between the camera and projector images,” Appl. Opt. 54(3), 347–356 (2015). [CrossRef]  

43. Chen Rui, Xu Jing, Jianhua Heping, Zonghua Su, and Ken Zhang, “Accurate calibration method for camera and projector in fringe patterns measurement system,” Appl. Opt. 55(16), 4293–4300 (2016). [CrossRef]  

44. X. Liu, Z. Cai, Y. Yin, H. Jiang, D. He, W. He, Z. Zhang, and X. Peng, “Calibration of fringe projection profilometry using an inaccurate 2d reference target,” Opt. Lasers Eng. 89, 131–137 (2017). [CrossRef]  

45. V. Srinivasan, H. C. Liu, and M. Halioua, “Automated phase-measuring profilometry of 3-d diffuse objects,” Appl. Opt. 23(18), 3105 (1984). [CrossRef]  

46. D. Huynh, R. Owens, and P. Hartmann, “Calibrating a structured light stripe system: A novel approach,” Int. J. Comput. Vis. 33(1), 73–86 (1999). [CrossRef]  

47. H. Liu, W. H. Su, K. Reichard, and S. Yin, “Calibration-based phase-shifting projected fringe profilometry for accurate absolute 3d surface profile measurement,” Opt. Commun. 216(1-3), 65–80 (2003). [CrossRef]  

48. Z. Zhang, D. Zhang, and X. Peng, “Performance analysis of a 3d full-field sensor based on fringe projection,” Opt. Lasers Eng. 42(3), 341–353 (2004). [CrossRef]  

49. J. Vargas, J. A. Quiroga, and M. Terron-Lopez, “Flexible calibration procedure for fringe projection profilometry,” Optical Engineering (2007).

50. H. Du and Z. Wang, “Three-dimensional shape measurement with an arbitrarily arranged fringe projection profilometry system,” Opt. Lett. 32(16), 2438–2440 (2007). [CrossRef]  

51. Y. Xiao, Y. Cao, and Y. Wu, “Improved algorithm for phase-to-height mapping in phase measuring profilometry,” Appl. Opt. 51(8), 1149 (2012). [CrossRef]  

52. Z. Huang, J. Xi, Y. Yu, Q. Guo, and L. Song, “Improved geometrical model of fringe projection profilometry,” Opt. Express 22(26), 32220 (2014). [CrossRef]  

53. F. Da and S. Gai, “Flexible three-dimensional measurement technique based on a digital light processing projector,” Appl. Opt. 47(3), 377–385 (2008). [CrossRef]  

54. Z. Wei, L. Cao, and G. Zhang, “A novel 1d target-based calibration method with unknown orientation for structured light vision sensor,” Opt. Laser Technol. 42(4), 570–574 (2010). [CrossRef]  

55. J. Huang and Q. Wu, “A new reconstruction method based on fringe projection of three-dimensional measuring system,” Opt. Lasers Eng. 52, 115–122 (2014). [CrossRef]  

56. Q. Sun, Y. Hou, Q. Tan, and G. Li, “A flexible calibration method using the planar target with a square pattern for line structured light vision system,” PLoS One 9(9), e106911 (2014). [CrossRef]  

57. L. Yu, W. Zhang, W. Li, C. Pan, and H. Xia, “Simplification of high order polynomial calibration model for fringe projection profilometry,” Meas. Sci. Technol. 27(10), 105202 (2016). [CrossRef]  

58. X. Li, Z. Zhang, and C. Yang, “Reconstruction method for fringe projection profilometry based on light beams,” Appl. Opt. 55(34), 9895 (2016). [CrossRef]  

59. V. Suresh, J. Holton, and B. Li, “Structured light system calibration with unidirectional fringe patterns,” Opt. Lasers Eng. 106, 86–93 (2018). [CrossRef]  

60. X. Lu, Q. Wu, and H. Huang, “Calibration based on ray-tracing for multi-line structured light projection system,” Opt. Express 27(24), 35884 (2019). [CrossRef]  

61. D. Yang, D. Qiao, and C. Xia, “Curved light surface model for calibration of a structured light 3d modeling system based on striped patterns,” Opt. Express 28(22), 33240–33253 (2020). [CrossRef]  

62. S. Zhang, “Flexible and high-accuracy method for uni-directional structured light system calibration,” Opt. Lasers Eng. 143, 106637 (2021). [CrossRef]  

63. Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Machine Intell. 22(11), 1330–1334 (2000). [CrossRef]  

64. Q. Zhang, X. Su, L. Xiang, and X. Sun, “3-d shape measurement based on complementary gray-code light,” Opt. Lasers Eng. 50(4), 574–579 (2012). [CrossRef]  

Supplementary Material (2)

Visualization 1: Reconstruction of a dynamic scene of collapsing wooden blocks, played at 1.0x speed.
Visualization 2: Reconstruction of a dynamic scene of a swinging ping-pong ball, played at 0.5x speed.
