Optica Publishing Group

360-degrees profilometry using strip-light projection coupled to Fourier phase-demodulation

Open Access

Abstract

360-degree (360°) digitalization of three-dimensional (3D) solids using a projected light-strip is a well-established technique in academic and commercial profilometers. These profilometers project a light-strip over the digitized solid while the solid is rotated a full revolution or 360 degrees. A computer program then typically extracts the centroid of this light-strip, and by triangulation one obtains the shape of the solid. Here, instead of using intensity-based light-strip centroid estimation, we propose to use Fourier phase-demodulation for 360° solid digitalization. The advantage of Fourier demodulation over strip-centroid estimation is that the accuracy of phase-demodulation increases linearly with the fringe density, while the light-strip centroid-estimation errors are independent of the strip density. We propose first to construct a carrier-frequency fringe-pattern by closely adding the individual light-strip images recorded while the solid is rotated. Next, this high-density fringe-pattern is phase-demodulated using the standard Fourier technique. To test the feasibility of this Fourier-demodulation approach, we have digitized two solids with increasing topographic complexity: a Rubik's cube and a plastic model of a human skull. According to our results, phase-demodulation based on the Fourier technique is less noisy than triangulation based on light-strip centroid estimation. Moreover, Fourier demodulation also provides the amplitude of the analytic signal, which is valuable information for the visualization of surface details.

© 2016 Optical Society of America

1. Introduction

Fringe projection profilometry of three-dimensional (3D) solids using Fourier phase-demodulation has been well known since the classical paper by Takeda et al. in 1982 [1]. Although this phase-measuring technique is effective for 3D digitalization, it cannot digitize the 360-degree (360°) view of the solid. To do this, one needs to place the solid on a turntable to access it from all 360° perspectives. As far as we know, a 360° profilometer was first published in 1985 by Halioua et al. [2], who used linear fringe-pattern projection coupled with a 3-step phase shifter and a turntable to obtain the 360° profilometry of a human mannequin head [2]. Six years later, in 1991, Cheng et al. [3] reported a 360° digitalization approach using an artificial neural network, projecting a laser-strip over the solid lying on a 360° rotary stage. In 1994, Asundi et al. [4] published a fast 360° technique based on stroboscopic light-strip projection and a digital drum camera rotating the solid at 2500 rpm. Chang et al. [5] also used a light-strip from a laser diode and automatically reconstructed a 360° solid using a neural network to estimate the centroid of the light-strips. Gomes et al. [6] used a projected full-image linear grating to study the human trunk looking for spinal deformities; they used Fourier profilometry for this purpose since light-strip 360° profilometers are less accurate than Fourier phase-demodulation of high-density carrier fringe-patterns [1–3,6]. Later on, Song et al. [7] used full-image linear fringes and phase-shifting interferometry of a rotating object for 360° profilometry. Asundi et al. [8] also implemented a very fast 360° profilometer using time-delay-integration imaging for solid digitalization. The state of the art on general 3D profilometry was reviewed in 2001 by Su and Chen [9], but they included just a single paper on 360° profilometry. One year later, Zhang et al. [10] used 360° profilometry for flow analysis in mechanical engines. In 2005, Munoz-Rodriguez et al. [11] used triangulation for 3D object reconstruction by projecting a light-strip and using Hu moments. In 2008, Trujillo-Schiaffino et al. [12] used 3D profilometry based on light-strip projection and centroid triangulation to digitize a smooth, rotation-symmetric object. In 2013, Shi et al. [13] applied 360° fringe-projection profilometry to fluorescent molecular tomography. Some researchers have used shearing interferometry to project high-quality linear fringes for 360° profilometry with good results [14]. In 2015, Long et al. [16] used light-strips to synthesize a carrier fringe-pattern image for (non-360°) 3D profilometry. Bianco et al. [17,18] also used a carrier-frequency fringe-pattern synthesis similar to the one used here, but applied it to optical interferometric microscopy instead of 360° profilometry. Finally, we recently posted the theoretical principles (without experiments) of the herein proposed 360° technique in the arXiv repository [19]. This is a brief, self-contained view of the main ideas behind 360° profilometry to this date. The relation of our proposal to previous 360° profilometers is discussed in more detail in Appendix A at the end of this paper.

We have tested our light-strip Fourier phase-demodulation technique by digitizing two solids more complex than those previously reported [2–15]: a Rubik’s cube and a plastic human skull. The plastic skull is more difficult because it is a discontinuous, non-convex 3D surface with deep holes (the eye basins) and high-frequency surface details. By digitizing these two solids we demonstrate that our 360° Fourier phase-demodulation algorithm is more robust, less noisy and reveals more surface details than previous 360° profilometers [2–14]. A Rubik’s cube was previously digitized using 4-step phase-shifting 360° profilometry by the same authors [15], obtaining about the same results as shown here but requiring 4 full revolutions of the solid (4×360°). In other words, our previously reported technique generates fringe-patterns without carrier and needs four times more data and time to obtain about the same experimental results [15]. Also note that we are not discussing camera-projector calibration issues or using non-standard phase-demodulating and/or unwrapping algorithms, because we do not require any new strategies on these topics [2–15]. Finally, we want to remark that previous 360° profilometers (except Ref. [15]) have mainly concentrated on digitizing very smooth, quasi-cylindrical objects [2–14]. This is because smooth, quasi-cylindrical 3D surfaces have fewer details and are far easier to digitize.

1.1 Well-known fringe and light-strip projection 360° profilometers

Here we review the 360° profilometers based on linear-fringe and light-strip projection reported to this date [2–15]. Figure 1 shows the three configurations for light projection over a solid sphere lying on a 360° rotary stage. For clarity we picture a sphere, but our 360° technique applies to far more complicated solids.

Fig. 1 The 3 experimental set-ups for single-camera 360° profilometry previously reported [2–15]. The shadows cast by the light-projectors are in blue. Panels (a) and (b) use fringe-pattern projection, while panel (c) uses light-strip projection. The viewing point in this figure is not the camera’s perspective.

In Fig. 1(a), a fringe projector in horizontal configuration is aimed towards the solid, having sensitivity proportional to tan(θ0) [2–4,8]. Note that the lower part of the object remains occluded during the 360° revolution; in our opinion, this is a significant drawback. In Fig. 1(b), the projected fringes are vertical and the digitizing camera is located on the (x,y) plane with an azimuthal angle-sensitivity φ0. The setup in Fig. 1(b) allows projecting open fringes over the entire object. Roughly speaking, there were two alternatives for 360° digitalization of 3D solids: full-image Fourier phase-demodulation from several azimuthal angles [11–13], or a single vertical pixel-column using phase-stepping techniques as in [15]. When using the Fourier method, a full-frame open-fringes pattern was captured and just the vertical pixel-column for each angle was kept (Fig. 1(a)); afterwards these pixel-columns were stitched together to construct a fringe image of the object in cylindrical coordinates [11–13]. In reference [15], we demonstrated a lower-complexity alternative where one needs to record just the central camera pixel-column of the fringe-pattern for each azimuthal step (Fig. 1(b)). However, this approach generates non-carrier fringe images [15], so it requires phase-stepping techniques, and therefore the digitized solid has to be rotated through 4 full revolutions [15]. The experimental setup shown in Fig. 1(c) is the one used in this paper for our 360° profilometer, and it will be discussed next.

2. Theoretical background

The light-strip (in Fig. 1(c)) as imaged over the CCD camera plane (x,z) may be modeled by the following Gaussian irradiance:

$$G(y,z;\varphi)=\left\{a_1(y,z;\varphi)+b_1(y,z;\varphi)\,e^{-\frac{[y-\rho(\varphi,z)\sin\theta_0]^2}{2\sigma^2}}\right\}\Pi(y);\qquad \rho\in[0,R],\ \varphi\in[0,2\pi), \tag{1}$$

where ρ(φ,z) is the solid being digitized in cylindrical coordinates. The function a1(y,z;φ) is the ambient light, which is normally turned off (a1(y,z;φ) = 0); the function b1(y,z;φ) is the light-strip intensity. The light-strip is phase-modulated by ρ(φ,z)sinθ0 for each rotation step Δφ [19]. The window indicator function Π(y) may be expressed as (see Fig. 2),

$$\Pi(y)=\begin{cases}1 & \text{if } y\in\left[\,0,\ \max[y(\varphi,z)]\,\right]\\[2pt] 0 & \text{otherwise}\end{cases};\qquad \varphi\in[0,2\pi],\ z\in[-L,L]. \tag{2}$$

Having N light-strip images separated by rotation steps Δφ, they are summed to synthesize the following fringe-pattern (see Fig. 2),

$$I(\varphi,z)=\sum_{n=0}^{N-1} G\!\left[\,y(\varphi-n\Delta\varphi,z),\,z\,\right];\qquad \Delta\varphi=\frac{2\pi}{N}, \tag{3}$$

where we have used y(φ,z) = ρ(φ,z)sin(φ0) to pass from the coordinate system (y,z;φ) to (φ,z) in Eq. (3). As Fig. 2 shows, this sum of displaced Gaussian strips forms a synthesized carrier-frequency fringe-pattern that can be modeled approximately as,

$$I(\varphi,z)\approx a(\varphi,z)+b(\varphi,z)\cos[\omega_0\varphi+g\rho(\varphi,z)];\qquad g=\omega_0\sin(\varphi_0), \tag{4}$$

where a(φ,z) and b(φ,z) are the background and contrast functions of the carrier fringes, respectively. The modulated phase gρ(φ,z) is proportional to the radius of the object ρ(φ,z) and the linear-carrier frequency ω0,

$$\omega_0=\frac{2\pi\,(\text{number of strips in the }\varphi\text{ direction})}{\text{number of pixels in the }\varphi\text{ direction}}\ \left(\frac{\text{radians}}{\text{pixel}}\right), \tag{5}$$

where ω0 depends on the Gaussian-strip separation in I(φ,z) (see Fig. 2). We have found experimentally that an optimal Gaussian-strip separation reduces the higher harmonics of I(φ,z).
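The strip-summation of Eq. (3) is straightforward to simulate numerically. The sketch below is our own illustration, not the authors' code: the function name, the made-up radius map `rho`, and the default parameters are assumptions. It sums N Gaussian strips whose centers are displaced by the phase term gρ, so the sum reproduces the carrier fringe model of Eq. (4):

```python
import numpy as np

def synthesize_fringes(rho, omega0=2 * np.pi / 10, g=1.0, sigma=2.0):
    """Sketch of Eq. (3): sum N displaced Gaussian strips into a carrier
    fringe-pattern. rho is an (n_phi, n_z) radius map in cylindrical
    coordinates (hypothetical data); omega0 is in radians/pixel along phi."""
    n_phi, n_z = rho.shape
    phi = np.arange(n_phi)[:, None]                  # azimuthal pixel coordinate
    n_strips = int(np.round(omega0 * n_phi / (2 * np.pi)))
    I = np.zeros((n_phi, n_z))
    for n in range(n_strips):
        # strip n is centered where omega0*phi + g*rho = 2*pi*n, i.e. the
        # carrier position displaced by the phase g*rho (cf. Eq. (4))
        center = (2 * np.pi * n - g * rho) / omega0
        I += np.exp(-(phi - center) ** 2 / (2 * sigma ** 2))
    return I
```

For a constant radius (a cylinder) the result is a strictly periodic fringe-pattern with one strip every 2π/ω0 pixels; a varying ρ bends the fringes exactly as in Fig. 2.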

Fig. 2 This figure schematically shows how a carrier-frequency fringe-pattern is constructed or synthesized from N individual Gaussian light-strip images bounded by the window function Π(y).

The searched phase gρ(φ,z) in Eq. (4) can be demodulated using the Fourier technique [1]. Rewriting I(φ,z) in Eq. (4) using exponential notation one obtains,

$$I(\varphi,z)=a(\varphi,z)+\frac{b(\varphi,z)}{2}\,e^{\,i[\omega_0\varphi+g\rho(\varphi,z)]}+\frac{b(\varphi,z)}{2}\,e^{-i[\omega_0\varphi+g\rho(\varphi,z)]}. \tag{6}$$

Taking the Fourier transform of this signal, one obtains the following spectrum:

$$F[I(\varphi,z)]=A(\omega_\varphi,\omega_z)+C(\omega_\varphi-\omega_0,\omega_z)+C^{*}(\omega_\varphi+\omega_0,\omega_z), \tag{7}$$

where (ωφ,ωz) ∈ [−π,π]×[−π,π] is the Fourier spectral space of the fringe image I(φ,z). The carrier ω0 is chosen to have enough separation among the spectral lobes in Eq. (7). After displacing the lobe C(ωφ−ω0,ωz) to the origin, and applying the inverse Fourier transform, one obtains [20],

$$F^{-1}[C(\omega_\varphi,\omega_z)]=\frac{b(\varphi,z)}{2}\,e^{\,i g\rho(\varphi,z)}. \tag{8}$$

The searched wrapped phase can be computed as the angle of this analytic signal:

$$[g\rho(\varphi,z)]\bmod 2\pi=\operatorname{angle}\left\{\frac{b(\varphi,z)}{2}\,e^{\,i g\rho(\varphi,z)}\right\}. \tag{9}$$

The recovered amplitude b(φ,z) can be used to generate a binary function Mask(φ,z) to exclude the self-occluding shadows and/or regions of poor fringe contrast:

$$\mathrm{Mask}(\varphi,z)=\begin{cases}1 & \text{if } |b(\varphi,z)|\ge\varepsilon\\[2pt] 0 & \text{otherwise.}\end{cases} \tag{10}$$
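The demodulation chain of Eqs. (6)-(10) can be sketched with NumPy FFTs. This is a hypothetical minimal implementation (the function name, the band-pass half-width, and the threshold ε are our assumptions); for simplicity it filters only along the φ axis, whereas the paper uses a circular 2-D pass-band (Fig. 7):

```python
import numpy as np

def fourier_demodulate(I, omega0, halfwidth, eps=0.05):
    """Standard Fourier-technique demodulation (Eqs. (6)-(10)), per z-column.
    I: (n_phi, n_z) carrier fringe-pattern; omega0 in radians/pixel along phi."""
    n_phi = I.shape[0]
    spec = np.fft.fft(I, axis=0)
    freqs = 2 * np.pi * np.fft.fftfreq(n_phi)     # spectral coordinate omega_phi
    # keep only the lobe C(omega_phi - omega0, omega_z), Eq. (7)
    keep = np.abs(freqs - omega0) < halfwidth
    spec[~keep, :] = 0
    analytic = np.fft.ifft(spec, axis=0)          # (b/2) * exp(i[omega0*phi + g*rho])
    phi = np.arange(n_phi)[:, None]
    analytic *= np.exp(-1j * omega0 * phi)        # remove carrier -> (b/2) e^{i g rho}
    wrapped = np.angle(analytic)                  # Eq. (9): [g*rho] mod 2*pi
    b = 2 * np.abs(analytic)                      # recovered fringe amplitude
    mask = (b >= eps).astype(float)               # Eq. (10)
    return wrapped, b, mask
```

The recovered amplitude `b` is precisely the analytic-signal magnitude used later in the paper as rendering texture and as the shadow mask.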
Finally, we want to remark that any light-strip 360-degree profilometer has a well-known limitation when discontinuous solids are digitized. Figure 3 schematically shows this limitation for a dented solid, where the blue zones represent shadows cast by the discontinuities of the 3D surface. At these shadows the light-strip amplitude drops to zero and the solid ρ = ρ(φ,z) is undefined. The only way to avoid shadows entirely would be for the camera and the line projector to share the same optical axis (see Fig. 3), which would drop the profilometer's sensitivity to zero, i.e. φ0 = 0.

Fig. 3 Cross-section at z0 of a dented solid ρ = ρ(φ,z0) having 6 discontinuities. At the specific rotation angle φ shown, the light-strip projector casts a self-occluding shadow (in blue) over the discontinuity as seen from the CCD camera. The blue zones in the estimated signal ρ = ρ(φ,z0) denote zero intensity (shadows), where the recovered object is undefined.

3. Experimental results

To test our proposed light-strip Fourier-demodulation 360° profilometer, we have digitized two solids with increasing topographic complexity: a Rubik's cube and a plastic model of a human skull (see Fig. 4). Our experimental setup is described in Fig. 1(c). The azimuthal rotation was controlled with 400 steps per revolution, so Δφ = 0.9°; as light-strip projector we used a 1024×768 digital light projector, and the CCD camera has VGA resolution (640×480). The separation of the light-strips used to generate I(φ,z) was 10 pixels, thus ω0 = 2π/10 radians per pixel. By decreasing the rotation step and increasing the camera z-resolution, one may obtain very high fringe densities per square millimeter. Having high-density fringes, one obtains a very high signal-to-noise ratio for the demodulated phase and an accurate 360° digital rendering of the solid ρ = ρ(φ,z).

Fig. 4 Photographs of (a) the Rubik’s cube, and (b) a human-skull model used in our experiment. In both cases the ambient light is turned on to see the object; however, this ambient light is normally turned off. If the ambient light cannot be turned off, the energy of the background signal increases and the fringe contrast decreases, resulting in noisier phase measurements [20].

As mentioned, the first step in our technique is to collect N individual images of the light-strip as imaged over the CCD camera. In Fig. 5 we show several phase-modulated light-strips as imaged over the CCD camera-plane with the ambient-light turned-off.

Fig. 5 Digitized phase-modulated strips of the 2 solids under study. These come from 5 consecutive azimuthal rotation angles. Panel (a) shows light-strips over the Rubik’s cube. Panel (b) shows the light-strips over the plastic skull.

Figure 6 shows the carrier-frequency fringe-patterns (see Eq. (3)) synthesized by the displaced sum of 500 light-strip images; that is, 100 more light-strip images than needed to record one full revolution. These additional light-strip images allow us to cope with distortion close to the left and right boundaries of the estimated phase-map. As a consequence, after the modulating phase is estimated we keep only the “central” 360 degrees.

Fig. 6 Carrier-frequency fringe-patterns (Eq. (3)) synthesized by adding individual light-strips for each rotation-step (see Fig. 5). Note that we have rotated the object more than a full revolution; this excess rotation keeps us far away from the left and right boundaries, where the light-strip is bent by the 3D surface and the Fourier demodulation does not work properly.

Figure 7 shows the spectrum of the fringe carrier signal F[I(φ,z)] for the Rubik’s cube. The spectrum corresponding to the plastic skull is not shown because it looks very similar.

Fig. 7 Frequency spectrum of the digitally constructed carrier-frequency fringes of the Rubik’s cube. The circle in red is the radius of the pass-band filter used to obtain the desired analytic signal. The spectral harmonics are due to the use of Gaussian-strip profiles instead of a sinusoidal profile.

The desired spectrum C(ωφ−ω0,ωz) in Fig. 7 is circled in red. Note that the distorting harmonics are well separated, so they do not interfere with C(ωφ−ω0,ωz). The harmonic phase-distortion is minimized because the added light-strips look almost sinusoidal in Fig. 6.

Figure 8 shows the demodulated wrapped phases [gρ(φ,z)] mod 2π. Fine surface details are visible because the wrapped phases look like topographic elevation curves. The phase sensitivity is about 6λ for the Rubik’s cube (that is, the modulated phase is wrapped six times, Fig. 8(a)) and 7λ for the human skull (seven phase wrappings, Fig. 8(b)).

Fig. 8 Wrapped-phases of (a) the Rubik’s cube, and (b) the plastic model of the human-skull. These phases were obtained after filtering the analytic-signal (Fig. 7) and masking-out self-occluding shadow regions. Notice that the modulating phase in the Rubik’s cube is wrapped 6 times, while the skull phase is wrapped 7 times.

Figure 9 shows the unwrapped phases of the Rubik’s cube and the plastic skull, and the magnitude of the analytic signal of the skull rendered as texture over the skull phase.
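For smooth open-fringe phase maps such as those in Fig. 8, the wrapped-to-unwrapped step can be done with simple 1-D unwrapping along φ. The sketch below is a minimal assumption of ours (the paper states it uses standard unwrapping, without naming the algorithm); noisy or masked regions would need a robust 2-D unwrapper instead:

```python
import numpy as np

def unwrap_phase_rows(wrapped):
    """1-D unwrapping along the phi axis: remove the 2*pi jumps of the
    wrapped phase [g*rho] mod 2*pi, column by column in z."""
    return np.unwrap(wrapped, axis=0)
```

After unwrapping, dividing by the sensitivity g = ω0 sin(φ0) of Eq. (4) recovers the radius map ρ(φ,z).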

Fig. 9 Unwrapped phases gρ(φ,z) in cylindrical coordinates, masking out low-amplitude fringe regions. From top to bottom: the cube with its analytic magnitude as texture, the skull phase without texture, and the skull phase with its analytic magnitude as texture.

Figure 10 shows three Cartesian-rendering perspectives of each of the 3D digitized solids, with the amplitude of their fringes as texture to appreciate fine surface details. To obtain these Cartesian-coordinate solids, we used the standard cylindrical-to-Cartesian coordinate transformation x = ρcos(φ), y = ρsin(φ), and z = z. As expected, the fringe contrast drops to zero at regions where self-occluding shadows are cast (for example, the eye basins of the human skull model).
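The coordinate mapping above is a one-liner; for completeness, a sketch (the function name is ours):

```python
import numpy as np

def cylindrical_to_cartesian(rho, phi, z):
    """Standard cylindrical-to-Cartesian mapping used for the 3-D renderings:
    x = rho*cos(phi), y = rho*sin(phi), z = z."""
    x = rho * np.cos(phi)
    y = rho * np.sin(phi)
    return x, y, z
```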

Fig. 10 Three-dimensional Cartesian renderings using the analytic magnitude as texture. At top, three perspectives of the Rubik’s cube. At bottom, the plastic skull showing self-occluding shadows inside the eye-basins where the light-strip is occluded from the camera view.

Finally, we compare our Fourier phase-demodulation against light-strip centroid estimation. For this comparison we used the same light-strip data images in both cases. The result is shown in Fig. 11 for three level-curves (at z1, z2 and z3) of the Rubik’s cube.

Fig. 11 Comparison of Fourier phase-demodulation against light-strip centroid estimation for the Rubik’s cube digitalization. Panel (a) shows the Rubik’s cube with three z-cuts at z1, z2 and z3. These cuts are color-coded as level-curves for (b) line-centroid estimation, and (c) Fourier phase-demodulation. It is clear that there is less noise using Fourier phase-demodulation.

From Fig. 11 we see that light-strip centroid estimation renders the 3D surface noisier than Fourier phase-demodulation for the same light-strip data. This is expected because we can generate extremely high-density fringes whose demodulated-phase signal-to-noise ratio easily surpasses that of light-strip centroid estimation. Moreover, triangulation based on centroid estimation does not provide the analytic amplitude of the fringes, which is valuable for displaying and designing 3D computer models. In other words, light-strip centroid estimation gives us only the object’s shape, without texture.
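For reference, the classical per-column centroid estimator being compared against can be sketched as follows (a hypothetical minimal version of ours; practical implementations threshold out the background first):

```python
import numpy as np

def strip_centroid(column):
    """Intensity-weighted centroid of one camera column containing the
    light-strip: the classical per-column estimator used for triangulation."""
    y = np.arange(column.size)
    return np.sum(y * column) / np.sum(column)
```

Each column's centroid is estimated independently of its neighbors, which is precisely why its accuracy cannot benefit from increasing the strip density, unlike the Fourier phase estimate.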

4. Conclusions

We have presented a Fourier phase-demodulation approach for 360° profilometry using light-strip projection. This new approach combines the Fourier phase-demodulation method with standard light-strip 360° profilometry. The herein presented light-strip profilometer is capable of digitizing solids represented by single-valued, continuous and bounded surfaces ρ = ρ(φ,z). Like any other light-strip profilometer, this 360° profilometer can also digitize discontinuous 3D surfaces while minimizing self-occluding shadows.

To assess our proposed 360° profilometer we have chosen a Rubik’s cube and a plastic human skull. These solids are challenging; the plastic skull is a far more difficult object to digitize due to its sharp discontinuities, deep holes (i.e., the eye basins) and many high-frequency surface details. As seen from the experiments, we are capable of accurately digitizing these two objects, except for the regions where self-generated shadows are cast.

The Fourier phase-demodulation technique is capable of digitizing much finer surface details than light-strip centroid estimation for the same raw light-strip data. By increasing the rotation-step resolution and the CCD camera z-resolution, one can attain higher fringe densities and therefore higher phase-demodulation sensitivity (Eqs. (4)-(5)). With this high-density fringe-pattern, much finer surface details are obtained automatically and effortlessly. In contrast, the sensitivity of light-strip centroid estimation does not increase with the strip density because the light-strip centroids are estimated independently.

Appendix A Experimental fringes generated by the 3 published 360° profilometers.

To make even clearer the difference between our herein proposed 360° profilometer and those previously published, here we briefly compare and analyze the fringe-pattern images generated by these 3 different approaches. Three experimental images formed at the CCD camera plane (Fig. 1) are presented in Fig. 12. Note that the input data is quite different for each of these 3 experiments.

Fig. 12 A solid sphere as imaged over the CCD camera using the 360° setups in Fig. 1. In panel 12(a), the fringe projector is located above the sphere as in Fig. 1(a) and we keep just the central pixel-column (marked in black). Panel 12(b) shows one out of 360 images from which we keep just the central pixel-column (see [15] and Fig. 1(b)). Panel 12(c) shows the strip projection of Fig. 1(c); 360 of these small frames (inside the black rectangle) are needed. The ambient light is turned on to show the digitized sphere.

Figure 13 shows the fringe construction process followed by the three 360° techniques. All the fringe patterns in Fig. 13 are rotationally symmetric because the solid sphere is symmetric around a vertical line crossing its center. These three fringe-pattern constructions have the following characteristics:

Fig. 13 Panel (a) shows the carrier fringes constructed for the first time in [2]; the lower part of the sphere is occluded by self-shadowing [2]. Panel (b) shows the 360° fringe-pattern (without carrier) constructed following Servin et al. [15]; note that there is no carrier, so one must collect 4 phase-shifted fringe-patterns, requiring 4 full revolutions of the solid [15]. Panel (c) shows the 360° fringe-pattern constructed using our herein proposed technique, which requires a single fringe-pattern image.

  • a) Fig. 13(a) shows the Halioua et al. [2] technique in which the fringe projector is above the CCD camera (see Fig. 1(a)). The advantage of this setup is that just one carrier fringe pattern is needed [2]. The disadvantage is that the lower part of the 3D solid cannot be recovered due to self-occluding shadows.
  • b) Fig. 13(b) shows the 360° fringes obtained from the technique described by Servin et al. [15]. This technique has been used with good results. A video of it, including additional considerations not covered in this paper (such as co-phasing), was uploaded to YouTube (https://www.youtube.com/watch?v=lwx7U1ErlgM).
  • c) Fig. 13(c) shows the experimental fringe-pattern generated by the herein proposed 360° profilometer. As can be seen, the fringe pattern generated by assembling N light-strips has phase-modulated carrier-frequency fringes.

As far as we know, strip-projection 360° profilometry coupled to Fourier phase-demodulation is a new contribution of this paper. We also want to remark that in the method described in [15] by the same authors (see Fig. 13(b)), the spatial linear carrier is lost, so 4 phase-shifted fringe-patterns (4 full revolutions) are needed to demodulate the modulating phase. Here, only one revolution is required to produce the open-fringes pattern I(φ,z) in Fig. 6 or Fig. 13(c).

Acknowledgments

The authors would like to acknowledge the financial support by the Mexican National Council for Science and Technology (CONACYT), grant 157044.

References and links

1. M. Takeda, H. Ina, and S. Kobayashi, “Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry,” J. Opt. Soc. Am. 72(1), 156–160 (1982).

2. M. Halioua, R. S. Krishnamurthy, H. C. Liu, and F. P. Chiang, “Automated 360° profilometry of 3-D diffuse objects,” Appl. Opt. 24(14), 2193–2196 (1985).

3. X. X. Cheng, X. Y. Su, and L. R. Guo, “Automated measurement method for 360° profilometry of 3-D diffuse objects,” Appl. Opt. 30(10), 1274–1278 (1991).

4. A. Asundi, C. S. Chan, and M. R. Sajan, “360 deg profilometry: new techniques for display and acquisition,” Opt. Eng. 33(8), 2760–2769 (1994).

5. M. Chang and W. C. Tai, “360-deg profile noncontact measurement using a neural network,” Opt. Eng. 34(12), 3572–3576 (1995).

6. A. S. Gomes, L. A. Serra, A. S. Lage, and A. Gomes, “Automated 360 degree profilometry of human trunk for spinal deformity analysis,” in Proceedings of Three Dimensional Analysis of Spinal Deformities, M. D’Amico, A. Merolli, and G. C. Santambrogio, eds. (IOS, 1995), pp. 423–429.

7. Y. Song, H. Zhao, W. Chen, and Y. Tan, “360 degree 3D profilometry,” Proc. SPIE 3204, 204–208 (1997).

8. A. Asundi and W. Zhou, “Mapping algorithm for 360-deg profilometry with time delayed integration imaging,” Opt. Eng. 38(2), 339–344 (1999).

9. X. Su and W. Chen, “Fourier transform profilometry: a review,” Opt. Lasers Eng. 35(5), 263–284 (2001).

10. X. Zhang, P. Sun, and H. Wang, “A new 360 rotation profilometry and its application in engine design,” Proc. SPIE 4537, 265–268 (2002).

11. J. A. Muñoz-Rodríguez, A. Asundi, and R. Rodriguez-Vera, “Recognition of a light line pattern by Hu moments for 3-D reconstruction of a rotated object,” Opt. Laser Technol. 37(2), 131–138 (2005).

12. G. Trujillo-Schiaffino, N. Portillo-Amavisca, D. P. Salas-Peimbert, L. Molina-de la Rosa, S. Almazan-Cuellar, and L. F. Corral-Martinez, “Three-dimensional profilometry of solid objects in rotation,” AIP Conf. Proc. 992, 924–928 (2008).

13. B. Shi, B. Zhang, F. Liu, J. Luo, and J. Bai, “360° Fourier transform profilometry in surface reconstruction for fluorescence molecular tomography,” IEEE J. Biomed. Health Inform. 17(3), 681–689 (2013).

14. Y. Zhang and G. Bu, “Automatic 360-deg profilometry of a 3D object using a shearing interferometer and virtual grating,” Proc. SPIE 2899, 162–169 (1996).

15. M. Servin, G. Garnica, J. C. Estrada, and J. M. Padilla, “High-resolution low-noise 360-degree digital solid reconstruction using phase-stepping profilometry,” Opt. Express 22(9), 10914–10922 (2014).

16. Y. Long, S. Wang, W. Wu, X. Yang, G. Jeon, and K. Liu, “Decoding line structured light patterns by using Fourier analysis,” Opt. Eng. 54(7), 073109 (2015).

17. V. Bianco, M. Paturzo, and P. Ferraro, “Spatio-temporal scanning modality for synthesizing interferograms and digital holograms,” Opt. Express 22(19), 22328–22339 (2014).

18. V. Bianco, M. Paturzo, V. Marchesano, I. Gallotta, E. Di Schiavi, and P. Ferraro, “Optofluidic holographic microscopy with custom field of view (FoV) using a linear array detector,” Lab Chip 15(9), 2117–2124 (2015).

19. M. Servin, M. Padilla, and G. Garnica, “Fourier phase-demodulation applied to light-strip 360-degrees profilometry of 3D solids; theoretical principles,” http://arxiv.org/abs/1510.04587.

20. M. Servin, J. A. Quiroga, and J. M. Padilla, Fringe Pattern Analysis for Optical Metrology: Theory, Algorithms and Applications (Wiley-VCH, 2014).


