Abstract
360-degree (360°) digitalization of three-dimensional (3D) solids using a projected light-strip is a well-established technique in academic and commercial profilometers. These profilometers project a light-strip over the solid being digitized while the solid is rotated a full revolution (360°). A computer program then extracts the centroid of this light-strip, and by triangulation one obtains the shape of the solid. Here, instead of using intensity-based light-strip centroid estimation, we propose to use Fourier phase-demodulation for 360° solid digitalization. The advantage of Fourier demodulation over strip-centroid estimation is that the accuracy of phase-demodulation increases linearly with the fringe density, while the light-strip centroid-estimation errors are independent of it. We propose first to construct a carrier-frequency fringe-pattern by adding the individual, closely spaced light-strip images recorded while the solid is being rotated. Next, this high-density fringe-pattern is phase-demodulated using the standard Fourier technique. To test the feasibility of this Fourier demodulation approach, we have digitized two solids with increasing topographic complexity: a Rubik's cube and a plastic model of a human skull. According to our results, phase demodulation based on the Fourier technique is less noisy than triangulation based on light-strip centroid estimation. Moreover, Fourier demodulation also provides the amplitude of the analytic signal, which is valuable information for the visualization of surface details.
© 2016 Optical Society of America
1. Introduction
Fringe projection profilometry of three-dimensional (3D) solids using Fourier phase-demodulation has been well known since the classical paper by Takeda et al. in 1982 [1]. Although this phase-measuring technique is effective for 3D digitalization, it cannot digitize the 360-degree (360°) view of the solid. In order to do this, one needs to position the solid over a turntable to access it from all 360° of perspective. As far as we know, a 360° profilometer was first published in 1985 by Halioua et al. [2], who used linear fringe-pattern projection coupled with a 3-step phase-shifting algorithm and a turntable to obtain the 360° profilometry of a human mannequin head. Six years later, in 1991, Cheng et al. [3] reported a 360° digitalization approach using an artificial neural network, projecting a laser-strip over the solid lying on a 360° rotary stage. In 1994, Asundi et al. [4] published a fast 360° technique based on stroboscopic light-strip projection and a digital drum camera, rotating the solid at 2500 rpm. Chang et al. [5] also used a light-strip from a laser diode and automatically reconstructed a solid over 360°, using a neural network to estimate the centroids of the light-strips. Gomes et al. [6] used a projected full-image linear grating to study the human trunk, looking for spinal deformities; they used Fourier profilometry for this purpose, since light-strip 360° profilometers are less accurate than Fourier phase demodulation of high-density carrier fringe-patterns [1–3,6]. Later on, Song et al. [7] used full-image linear fringes and phase-shifting interferometry of a rotating object for 360° profilometry. Asundi et al. [8] also implemented a very fast 360° profilometer using time-delay-integration imaging for solid digitalization. The state of the art in general 3D profilometry was reviewed in 2001 by Su and Chen [9], but they included just a single paper on 360° profilometry. One year later Zhang et al. 
[10] used 360° profilometry for flow analysis in mechanical engines. In 2005, Munoz-Rodriguez et al. used triangulation for 3D object reconstruction, projecting a light-strip and using Hu moments [11]. In 2008, Trujillo-Schiaffino et al. [12] used 3D profilometry based on light-strip projection and centroid triangulation to digitize a smooth rotation-symmetric object. In 2013, Shi et al. used 360° fringe-projection profilometry applied to fluorescence molecular tomography [13]. Some researchers have used shearing interferometry to project high-quality linear fringes for 360° profilometry, with good results [14]. In 2015, Long et al. [16] used light-strips to synthesize a carrier fringe-pattern image for (non-360°) 3D profilometry. Bianco et al. [17,18] also used a carrier-frequency fringe-pattern synthesis similar to the one used herein, but applied to optical interferometric microscopy instead of 360° profilometry. Finally, we recently posted the theoretical principles (without experiments) of the herein proposed 360° technique in the arXiv repository [19]. This is a brief, self-contained view of the main ideas behind 360° profilometry to this date. The relation of our proposal to previous 360° profilometers is discussed in more detail in Appendix A at the end of this paper.
We have tested our light-strip Fourier phase-demodulation technique by digitizing two solids more complex than those previously reported [2–15]: a Rubik’s cube and a plastic human skull. The plastic skull is the more difficult one because it is a discontinuous, non-convex 3D surface with deep holes (the eye basins) and high-frequency surface details. By digitizing these two solids we demonstrate that our 360° Fourier phase-demodulation algorithm is more robust, less noisy and reveals more surface details than previous 360° profilometers [2–14]. A Rubik’s cube was previously digitized using 4-step phase-shifting 360° profilometry by the same authors [15], obtaining about the same results as shown here but requiring 4 full revolutions of the solid (4 × 360°). In other words, our previously reported technique generates fringe patterns without carrier and needs four times more data and time to obtain about the same experimental results [15]. Also note that we are not discussing camera-projector calibration issues or using non-standard phase-demodulating and/or unwrapping algorithms, because we do not require any new strategies on these topics [2–15]. Finally, we want to remark that previous 360° profilometers (except Ref. [15]) have mainly concentrated on digitizing very smooth, quasi-cylindrical objects [2–14]. This is because smooth, quasi-cylindrical 3D surfaces have fewer details and are far easier to digitize.
1.1 Well-known fringe and light-strip projection 360° profilometers
Here we review the 360° profilometers based on linear-fringe and light-strip projection reported to this date [2–15]. Figure 1 shows the three configurations for light projection over a solid sphere lying on a 360° rotary stage. For clarity we picture a sphere, but our 360° technique applies to far more complicated solids.
In Fig. 1(a), a fringe projector in horizontal configuration is aimed towards the solid, having sensitivity proportional to the angle between the projection and imaging axes [2–4,8]. Note that the lower part of the object remains occluded during the 360° revolution; in our opinion, this is a significant drawback. In Fig. 1(b), the projected fringes are vertical and the digitizing camera is located on the plane with an azimuthal angle-sensitivity. The setup in Fig. 1(b) allows projecting open fringes over the entire object. Roughly speaking, there were two alternatives for 360° digitization of 3D solids: full-image Fourier phase-demodulation from several azimuthal angles [11–13], or a single vertical pixel-column using phase-stepping techniques as in [15]. When using the Fourier method, a full-frame open-fringes pattern was captured and just the vertical pixel-column for each angle was kept (Fig. 1(a)); afterwards these pixel-columns were stitched together to construct a fringe-image of the object in cylindrical coordinates [11–13]. In reference [15], we demonstrated a lower-complexity alternative where one needs to record just the central camera pixel-column of the fringe-pattern for each azimuthal step (Fig. 1(b)). However, this approach generates fringe-images without carrier [15], so it requires phase-stepping techniques; therefore the digitized solid has to be rotated 4 full revolutions [15]. The experimental setup shown in Fig. 1(c) is the one used in this paper for our 360° profilometer, and it will be discussed next.
2. Theoretical background
The light-strip of Fig. 1(c), as imaged over the CCD camera plane, may be modeled by the following Gaussian irradiance:

I_n(x, z) = a(x, z) + b(x, z) exp{−[x − g ρ(z, φ_n)]² / σ²},  φ_n = n Δφ,  (1)

where ρ(z, φ) is the solid being digitized in cylindrical coordinates, g is the phase-to-radius sensitivity of the profilometer, and σ is the width of the projected strip. The function a(x, z) is the ambient light, which is normally turned off (a = 0); the function b(x, z) is the strip-light intensity. The light-strip is phase-modulated by g ρ(z, φ_n) for each rotation step Δφ = 2π/N [19]. The window indicator function that selects each individual strip may be expressed as (see Fig. 2)

w(x) = 1 for |x| ≤ L/2;  w(x) = 0 otherwise,  (2)

L being the width of the retained strip sub-image. Having N light-strip images separated by Δφ rotation steps, they are summed, each one displaced laterally by d pixels, to synthesize the following fringe-pattern (see Fig. 2):

I(x, z) = Σ_{n=0}^{N−1} w(x − n d) I_n(x − n d, z),  (3)

where we have used φ = (Δφ/d) x to pass from the (x, z) coordinate system to (φ, z) in Eq. (3). As Fig. 2 shows, this sum of displaced Gaussian strips forms a synthesized carrier-frequency fringe-pattern that can be modeled approximately as

I(x, z) ≈ a(x, z) + b(x, z) cos{u0 [x − g ρ(φ, z)]},  (4)

where a(x, z) and b(x, z) are the background and contrast functions of the carrier fringes, respectively. The modulated phase u0 g ρ(φ, z) is proportional to the radius of the object and to the linear-carrier frequency

u0 = 2π/d,  (5)

which depends on the Gaussian-strip separation d in x (see Fig. 2). We have found experimentally that an optimal Gaussian separation reduces the higher harmonics of I(x, z). The searched phase in Eq. (4) can be demodulated using the Fourier technique [1]. Rewriting Eq. (4) in exponential notation one obtains

I(x, z) = a + (b/2) exp{i u0 [x − g ρ]} + (b/2) exp{−i u0 [x − g ρ]};  (6)

taking the Fourier transform of this signal along x, one obtains the following spectrum:

F(u, v) = A(u, v) + C(u − u0, v) + C*(u + u0, v),  (7)

where (u, v) is the Fourier spectral space of the image I(x, z). The carrier u0 is chosen to have enough separation among the three spectral lobes in Eq. (7). After displacing C(u − u0, v) to the origin and using the inverse Fourier transform, one obtains the analytic signal [20]

s(x, z) = (1/2) b(x, z) exp{−i u0 g ρ(φ, z)};  (8)

the searched wrapped phase can be computed as the angle of this analytic signal:

W{u0 g ρ(φ, z)} = −angle[s(x, z)].  (9)

The recovered amplitude |s(x, z)| = b(x, z)/2 can be used to generate a binary mask M(x, z) to exclude the self-occluding shadows and/or regions of poor fringe contrast:

M(x, z) = 1 if |s(x, z)| ≥ ε;  M(x, z) = 0 otherwise,  (10)

ε being a small amplitude threshold. Finally, we want to remark that any light-strip 360-degree profilometer has a well-known limitation when discontinuous solids are digitized. Figure 3 shows this limitation schematically for a dented solid, where the blue zones represent the shadows that the surface discontinuities cast on the 3D surface itself. At these shadows the light-strip amplitude drops to zero and the solid is undefined there. The only way of having no shadows would be for the camera and the line-projector to share the same optical axis (see Fig. 3), which would drop the profilometer's sensitivity to zero (g = 0).

3. Experimental results
To test our proposed light-strip Fourier-demodulation 360° profilometer, we have digitized two solids with increasing topographic complexity: a Rubik's cube and a plastic model of a human skull (see Fig. 4). Our experimental setup is described in Fig. 1(c). The azimuthal rotation was controlled with 400 steps per revolution, so Δφ = 2π/400 radians (0.9°); as light-strip projector we used a digital light projector, and the CCD camera has VGA resolution (640 × 480 pixels). The separation of the light-strips used to generate the carrier fringe-pattern was d = 10 pixels, thus u0 = 2π/10 ≈ 0.63 radians per pixel. Decreasing the rotation step and increasing the camera z-resolution, one may obtain very high fringe densities per square millimeter. Having high-density fringes, one obtains very high signal-to-noise ratios for the demodulated phase and an accurate 360° digital rendering of the solid.
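The processing chain just described, strip summation (Eq. (3)) followed by Fourier demodulation, can be sketched numerically. The following Python/NumPy sketch uses a synthetic smooth test radius and illustrative parameters (N = 100 steps, d = 10 px, σ = 3 px, sensitivity g = 4), not the experimental values quoted above; it is a minimal illustration of the technique, not the processing code used in the paper.

```python
import numpy as np

# Minimal sketch: synthesize carrier fringes by summing displaced Gaussian
# strips, then demodulate with the standard Fourier (Takeda) technique.
# All parameters below are illustrative assumptions.
N, d, sigma, g = 100, 10, 3.0, 4.0   # steps/rev, strip separation (px), strip width, sensitivity
H, W = 100, N * d                    # image height (z) and synthesized width (x)
z = np.arange(H)
x = np.arange(W)[None, :]
phi = 2 * np.pi * np.arange(N) / N   # azimuthal angle of each rotation step

def rho(z, phi):                     # hypothetical smooth solid radius (pixel units)
    return 20.0 + 3.0 * np.sin(2 * np.pi * z / 100.0) * np.cos(2 * phi)

# Eq. (3)-style synthesis: the n-th strip is displaced n*d pixels before summing
I = np.zeros((H, W))
for n in range(N):
    centers = n * d + g * rho(z, phi[n])            # strip center for every row
    I += np.exp(-(x - centers[:, None]) ** 2 / sigma ** 2)

# Fourier demodulation along x: keep only the +u0 spectral lobe
u0 = 1.0 / d                                        # carrier, in cycles per pixel
F = np.fft.fft(I, axis=1)
u = np.fft.fftfreq(W)
F[:, ~((u > 0.5 * u0) & (u < 1.5 * u0))] = 0.0      # band-pass around +u0
s = np.fft.ifft(F, axis=1)                          # analytic signal
wrapped = np.angle(s)                               # wrapped phase (carrier + radius term)
amplitude = np.abs(s)                               # fringe contrast, usable as mask/texture
```

On the row z = 0 the test radius is constant, so the recovered phase slope along x equals the carrier 2π/d; rows where ρ varies carry the modulation term on top of the carrier.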
As mentioned, the first step in our technique is to collect N individual images of the light-strip as imaged over the CCD camera. In Fig. 5 we show several phase-modulated light-strips as imaged over the CCD camera-plane with the ambient-light turned-off.
Figure 6 shows the carrier-frequency fringe-patterns (see Eq. (3)) synthesized by the displaced sum of 500 light-strip images; that is, 100 more light-strip images than needed to record one full revolution. These additional light-strip images allow us to cope with distortion close to the left and right boundaries of the estimated phase-map. As a consequence, after the modulating phase is estimated, we keep only the “central” 360 degrees.
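With 500 recorded steps and a 10-pixel strip separation, trimming the phase-map back to the central 400 steps is a simple column slice. A sketch (assuming, for illustration, that the 100 extra steps are split evenly between the two boundaries):

```python
import numpy as np

d, N_rev, N_total = 10, 400, 500            # strip separation (px), steps/rev, recorded steps
phase = np.zeros((480, N_total * d))        # stand-in demodulated phase-map (VGA height)

margin = (N_total - N_rev) * d // 2         # 50 extra steps -> 500 px on each boundary
central = phase[:, margin:margin + N_rev * d]   # keep the "central" 360 degrees
```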
Figure 7 shows the spectrum of the fringe carrier signal for the Rubik’s cube. The spectrum corresponding to the plastic skull is not shown because it looks very similar.
The desired spectrum in Fig. 7 is circled in red. Note that the distorting harmonics are well separated, so they do not interfere with the desired first-order lobe. The harmonic phase-distortion is minimized because the summed light-strips look almost sinusoidal in Fig. 6.
Figure 8 shows the demodulated wrapped phases. Fine surface details are visible because the wrapped phases look like topographic elevation curves. The phase sensitivity is about 6λ for the Rubik’s cube (that is, the modulated phase wraps six times, Fig. 8(a)) and 7λ for the human skull (seven phase wrappings, Fig. 8(b)).
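Since the phase wraps several times (6λ-7λ), it must be unwrapped before rendering. Because the carrier keeps the sample-to-sample phase step below π, standard Itoh-style 1-D unwrapping suffices. A sketch with a synthetic wrapped signal (the 6-wrap figure mimics the Rubik's-cube case; the modulation term is an arbitrary stand-in):

```python
import numpy as np

# synthetic phase with ~6 wraps per revolution plus a smooth modulation term
phi = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
true_phase = 6.0 * phi + 0.5 * np.sin(3.0 * phi)

wrapped = np.angle(np.exp(1j * true_phase))   # fold into (-pi, pi]
unwrapped = np.unwrap(wrapped)                # Itoh-style unwrapping along one row
```

np.unwrap recovers the continuous phase (up to an additive multiple of 2π) because every sample-to-sample phase step stays below π.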
Figure 9 shows the unwrapped phases of the Rubik’s cube and the plastic skull, with the magnitude of the skull’s analytic signal mapped as texture over the skull’s phase.
Figure 10 shows three Cartesian-rendering perspectives of each of the two 3D digitized solids, with the amplitude of their fringes as texture to appreciate fine surface details. To obtain these Cartesian-coordinate solids, we have used the standard cylindrical-to-Cartesian coordinate transformation, x = ρ cos(φ) and y = ρ sin(φ). As expected, the fringe contrast drops to zero at regions where self-occluding shadows are cast (for example, the eye basins of the human-skull model).
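The cylindrical-to-Cartesian mapping is straightforward to vectorize. A sketch, using a stand-in radius map ρ(z, φ) in place of the demodulated one (the phase-to-millimeter scale factor is assumed to be already applied):

```python
import numpy as np

n_z, n_phi = 100, 360
phi = np.linspace(0.0, 2.0 * np.pi, n_phi, endpoint=False)[None, :]
z = np.linspace(0.0, 50.0, n_z)[:, None]
rho = 20.0 + 2.0 * np.cos(3.0 * phi) + 0.0 * z   # stand-in radius map, shape (n_z, n_phi)

# standard cylindrical-to-Cartesian transformation
x = rho * np.cos(phi)
y = rho * np.sin(phi)
points = np.stack([x, y, np.broadcast_to(z, x.shape)], axis=-1)  # (n_z, n_phi, 3) vertices
```

The resulting vertex array can be handed directly to a mesh renderer, with the analytic-signal amplitude supplying the per-vertex texture.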
Finally, we compare our Fourier phase-demodulation against light-strip centroid estimation. For this comparison we used the same light-strip data images in both cases. The result is shown in Fig. 11 for three level curves (at z1, z2 and z3) of the Rubik’s cube.
From Fig. 11 we see that, for the same light-strip data, light-strip centroid estimation renders the 3D surface noisier than Fourier phase-demodulation. This is expected because we can generate extremely high-density fringes, whose phase signal-to-noise ratio easily surpasses that of light-strip centroid estimation. Moreover, triangulation digitalization based on centroid estimation does not provide the analytic amplitude of the fringes, which is valuable for displaying and designing 3D computer models. In other words, light-strip centroid estimation gives us only the object’s shape, without texture.
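A centroid estimator of the kind compared against can be sketched as follows; the window size, noise level and strip parameters are illustrative assumptions. Each column's centroid is estimated independently of its neighbors, which is why its error does not decrease when more strips are packed into the image:

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.arange(200, dtype=float)
true_center = 87.3
strip = np.exp(-(x - true_center) ** 2 / 9.0)        # ideal Gaussian light-strip profile
noisy = strip + 0.05 * rng.standard_normal(x.size)   # additive camera noise

# intensity-weighted centroid over a window around the brightest pixel
k = int(np.argmax(noisy))
win = slice(max(k - 10, 0), min(k + 11, x.size))
w = np.clip(noisy[win], 0.0, None)                   # discard negative noise samples
centroid = float(np.sum(x[win] * w) / np.sum(w))     # close to true_center, noise-limited
```

Repeating this estimate column by column gives the triangulated level curves of Fig. 11; its accuracy is set by the camera noise alone, not by the strip density.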
6. Conclusions
We have presented a Fourier phase-demodulation approach for 360° profilometry using light-strip projection. This new approach combines the Fourier phase-demodulation method with standard light-strip 360° profilometry. The herein presented light-strip profilometer is capable of digitizing solids represented by single-valued, continuous and bounded surfaces. As with any other light-strip profilometer, this 360° profilometer can also digitize discontinuous 3D surfaces, minimizing self-occluding shadows.
To assess our proposed 360° profilometer we have chosen a Rubik’s cube and a plastic human skull. These solids are challenging; the plastic skull is a far more difficult object to digitize due to its sharp discontinuities, deep holes (i.e. the eye basins) and many high-frequency surface details. As seen from the experiments, we are capable of accurately digitizing these two objects, except for the regions where self-generated shadows are cast.
The Fourier phase-demodulation technique is capable of digitizing much finer surface details than light-strip centroid estimation for the same raw light-strip data. Increasing the rotation-step resolution and the CCD camera z-resolution, one can attain higher fringe densities and therefore higher phase-demodulation sensitivity (Eqs. (4)-(5)). By increasing the density of this fringe-pattern we can automatically obtain much finer surface details. In contrast, the sensitivity of light-strip centroid estimation does not increase with the strip density, because the light-strip centroids are estimated independently of one another.
Appendix A: Experimental fringes generated by the three published 360° profilometers
To make even clearer the difference between our herein proposed 360° profilometer and those previously published, here we briefly compare and analyze the fringe-pattern images generated by these three different approaches. Three experimental images formed at the CCD camera plane (Fig. 1) are presented in Fig. 12. Note that the input data is quite different for each of these three experiments.
Figure 13 shows the fringe construction process followed by the three 360° techniques. All the fringe patterns in Fig. 13 are rotationally symmetric because the solid sphere is symmetric around a vertical line crossing its center. These three fringe-pattern constructions have the following characteristics:
- a) Fig. 13(a) shows the Halioua et al. [2] technique in which the fringe projector is above the CCD camera (see Fig. 1(a)). The advantage of this setup is that just one carrier fringe pattern is needed [2]. The disadvantage is that the lower part of the 3D solid cannot be recovered due to self-occluding shadows.
- b) Fig. 13(b) shows the 360° fringes obtained from the technique described by Servin et al. [15]. This technique has been used with good results. A video of it, including additional considerations not covered in this paper (such as co-phasing), was uploaded to YouTube (https://www.youtube.com/watch?v=lwx7U1ErlgM).
- c) Fig. 13(c) shows the experimental fringe-pattern generated by the herein proposed 360° profilometer. As can be seen, the fringe pattern generated by assembling N light-strips has phase-modulated carrier-frequency fringes.
As far as we know, strip-projection 360° profilometry coupled to Fourier phase-demodulation is a new contribution of this paper. We also want to remark that in the method described in [15] by the same authors (see Fig. 13(b)), the spatial linear-carrier is lost, so 4 phase-shifted fringe-patterns (4 full revolutions) are needed to demodulate the modulating phase. Here, only one revolution is required to produce the open-fringes pattern of Fig. 6 or Fig. 13(c).
Acknowledgments
The authors would like to acknowledge the financial support by the Mexican National Council for Science and Technology (CONACYT), grant 157044.
References and links
1. M. Takeda, H. Ina, and S. Kobayashi, “Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry,” J. Opt. Soc. Am. 72(1), 156–160 (1982). [CrossRef]
2. M. Halioua, R. S. Krishnamurthy, H. C. Liu, and F. P. Chiang, “Automated 360° profilometry of 3-D diffuse objects,” Appl. Opt. 24(14), 2193–2196 (1985). [CrossRef] [PubMed]
3. X. X. Cheng, X. Y. Su, and L. R. Guo, “Automated measurement method for 360° profilometry of 3-D diffuse objects,” Appl. Opt. 30(10), 1274–1278 (1991). [CrossRef] [PubMed]
4. A. Asundi, C. S. Chan, and M. R. Sajan, “360 deg profilometry: new techniques for display and acquisition,” Opt. Eng. 33(8), 2760–2769 (1994). [CrossRef]
5. M. Chang and W. C. Tai, “360-deg profile noncontact measurement using a neural network,” Opt. Eng. 34(12), 3572–3576 (1995). [CrossRef]
6. A. S. Gomes, L. A. Serra, A. S. Lage, and A. Gomes, “Automated 360 degree profilometry of human trunk for spinal deformity analysis,” in Proceedings of Three Dimensional Analysis of Spinal Deformities, M. D’Amico, A. Merolli, and G. C. Santambrogio, eds. (IOS, 1995), pp. 423–429.
7. Y. Song, H. Zhao, W. Chen, and Y. Tan, “360 degree 3D profilometry,” Proc. SPIE 3204, 204–208 (1997). [CrossRef]
8. A. Asundi and W. Zhou, “Mapping algorithm for 360-deg profilometry with time delayed integration imaging,” Opt. Eng. 38(2), 339–344 (1999). [CrossRef]
9. X. Su and W. Chen, “Fourier transform profilometry: a review,” Opt. Lasers Eng. 35(5), 263–284 (2001). [CrossRef]
10. X. Zhang, P. Sun, and H. Wang, “A new 360 rotation profilometry and its application in engine design,” Proc. SPIE 4537, 265–268 (2002). [CrossRef]
11. J. A. Muñoz-Rodríguez, A. Asundi, and R. Rodriguez-Vera, “Recognition of a light line pattern by Hu moments for 3-D reconstruction of a rotated object,” Opt. Laser Technol. 37(2), 131–138 (2005). [CrossRef]
12. G. Trujillo-Schiaffino, N. Portillo-Amavisca, D. P. Salas-Peimbert, L. Molina-de la Rosa, S. Almazan-Cuellar, and L. F. Corral-Martinez, “Three-dimensional profilometry of solid objects in rotation,” AIP Proc. 992, 924–928 (2008).
13. B. Shi, B. Zhang, F. Liu, J. Luo, and J. Bai, “360° Fourier transform profilometry in surface reconstruction for fluorescence molecular tomography,” IEEE J. Biomed. Health Inform. 17(3), 681–689 (2013). [CrossRef] [PubMed]
14. Y. Zhang and G. Bu, “Automatic 360-deg profilometry of a 3D object using a shearing interferometer and virtual grating,” Proc. SPIE 2899, 162–169 (1996). [CrossRef]
15. M. Servin, G. Garnica, J. C. Estrada, and J. M. Padilla, “High-resolution low-noise 360-degree digital solid reconstruction using phase-stepping profilometry,” Opt. Express 22(9), 10914–10922 (2014). [CrossRef] [PubMed]
16. Y. Long, S. Wang, W. Wu, X. Yang, G. Jeon, and K. Liu, “Decoding line structured light patterns by using Fourier analysis,” Opt. Eng. 54(7), 073109 (2015). [CrossRef]
17. V. Bianco, M. Paturzo, and P. Ferraro, “Spatio-temporal scanning modality for synthesizing interferograms and digital holograms,” Opt. Express 22(19), 22328–22339 (2014). [CrossRef] [PubMed]
18. V. Bianco, M. Paturzo, V. Marchesano, I. Gallotta, E. Di Schiavi, and P. Ferraro, “Optofluidic holographic microscopy with custom field of view (FoV) using a linear array detector,” Lab Chip 15(9), 2117–2124 (2015). [CrossRef] [PubMed]
19. M. Servin, M. Padilla, and G. Garnica, “Fourier phase-demodulation applied to light-strip 360-degrees profilometry of 3D solids; theoretical principles,” http://arxiv.org/abs/1510.04587.
20. M. Servin, J. A. Quiroga, and J. M. Padilla, Fringe Pattern Analysis for Optical Metrology: Theory, Algorithms and Applications (Wiley-VCH, 2014).