Abstract

In this paper we describe a high-resolution, low-noise phase-shifting algorithm applied to 360-degree digitizing of solids with diffuse light-scattering surfaces. A 360-degree profilometer must rotate the object a full revolution to digitize a three-dimensional (3D) solid. Although 360-degree profilometry is not new, we propose a new experimental set-up that permits full phase-bandwidth phase-measuring algorithms. The first advantage of our solid profilometer is that it uses base-band, phase-stepping algorithms providing the full data phase-bandwidth. This contrasts with band-pass, spatial-carrier Fourier profilometry, which typically uses 1/3 of the fringe data-bandwidth. In addition, phase measuring is generally more accurate than single line-projection, non-coherent, intensity-based line-detection algorithms. The second advantage is a new fringe-projection set-up which avoids self-occluding fringe shadows for convex solids. Previous 360-degree fringe-projection profilometers generate self-occluding shadows because of their elevation illumination angles. The third advantage is trivial line-by-line fringe-data assembly based on a single cylindrical coordinate system shared by all 360-degree perspectives. This contrasts with multi-view overlapping fringe-projection systems, which use iterative closest point (ICP) algorithms to fuse the 3D data cloud into a single coordinate system (e.g. Geomagic). Finally, we used a 400 steps/rotation turntable and a 640x480 pixel CCD camera. Higher 3D digitized surface resolution and less-noisy phase measurements are straightforward to obtain by increasing the angular-spatial resolution and the number of phase steps, without any substantial change to our 360-degree profilometer.

© 2014 Optical Society of America

1. Introduction

Fringe-projection profilometry has been a well-known technique since the classical paper by Takeda et al. in 1982 [1]. Although this profilometry technique effectively demonstrated that 3D digitization is possible using a single carrier fringe pattern, it cannot digitize a full 360-degree 3D object. The primary reason is that one needs to position the 3D object on a turntable to access every perspective of the solid from all (360-degree) directions. As far as we know, the first researchers to implement an automated 360-degree profilometer were Halioua et al. in 1985 [2]. Halioua used a grating projector along with a 3-step phase-shifter set-up and a turntable to obtain the 360-degree profilometry of a human mannequin head [2]. Later, in 1991, Cheng et al. also positioned the 3D object on a turntable in order to rotate it 360 degrees and obtain the full object information, and projected a carrier-frequency fringe pattern over the object [3]. Asundi published an interesting technique based on striped-light projection for 360-degree profilometry [4]. Using a light stripe from a laser diode, Chang et al. [5] automatically reconstructed a solid over 360 degrees using a neural network. Gomes et al. used a projected linear grating to study the human trunk looking for spinal deformities; they used Fourier profilometry for this purpose [6]. Later, Song et al. used a fringe-grating projector and phase-shifting interferometry of a rotating object for 360-degree profilometry [7]. Asundi et al. used time-delay-integration imaging of a rotating object for 360-degree acquisition [8]. The state of the art in 3D profilometry was reviewed in 2001 by Su and Chen, but they included only a single paper on 360-degree profilometry [9]. Afterwards, Zhang et al. used 360-degree profilometry for flow analysis in mechanical engines [10]. In 2005, Munoz-Rodriguez et al. used triangulation for 3D object reconstruction by projecting a light stripe and using Hu moments [11]. In 2008, Trujillo-Schiaffino et al. used 3D profilometry based on single-line projection and triangulation of a smooth, rotationally symmetric object [12]. More recently, Shi et al. applied 360-degree fringe-projection profilometry to fluorescence molecular tomography [13]. Some researchers have used shearing interferometry to project high-quality linear fringes for 360-degree profilometry [14]. In this paper we do not discuss calibration issues, because we have not used any new or non-standard calibration strategy apart from those well known in 3D profilometry [15]. Nor have we used any new or non-standard phase-unwrapping algorithm [14,15]. Given the low noise of the fringes and the noise-rejection capability of the 4-step least-squares phase-shifting demodulation, we unwrapped our phase using simple line integration of wrapped phase differences.

From this review we see that 360-degree profilometry is already a mature research field, and its applications in industrial inspection and robotic vision are well known. But previous efforts [2–15] have mainly concentrated on digitizing very smooth, quasi-cylindrical objects. This is because smooth, almost cylindrical 3D objects are easier to digitize with previous 360-degree profilometry [2–15]. Here we have tested our proposed technique by digitizing a Rubik’s cube all around (360 degrees, almost 4π steradians). The Rubik’s cube is a continuous 3D solid with sharp corners, and it is certainly not quasi-cylindrical. By digitizing a Rubik’s cube we demonstrate that our 360-degree digitizing technique is more robust and can digitize more complex 3D surfaces than previously published approaches [2–15].

2. Previous 360-degree profilometers using fringe-grating or stripe-light projections

Here we review the two basic experimental set-ups that have been used for 360-degree profilometry (see Fig. 1) [2–15]. Figure 1(a) shows the standard configuration in which linear fringes are projected over the 3D sphere shown. The fringe projector has a polar phase-sensitivity angle θ0 with respect to the (x,y) plane. On the other hand, the stripe-light projection set-up (Fig. 1(b)) has an azimuthal phase-sensitivity angle φ0. In both cases the 3D object is rotated 360 degrees in the azimuthal direction to obtain 3D data from all possible perspectives.


Fig. 1 Panel (a) shows the well known set-up when a linear grating projector is used. The linear grating is orthogonal to the z direction and its phase-sensitivity is proportional to tan(θ0). Panel (b) shows the experimental set-up for a stripe-light projected towards the solid under analysis. In light-stripe test (panel (b)) the phase-sensitivity is proportional to tan(φ0).


Assume that the 3D-surfaces enclosing the solids are expressed in 3D cylindrical coordinates (ρ,φ,z) as (see Fig. 1),

$$\rho=\rho(z,\varphi),\qquad \rho=\sqrt{x^2+y^2},\qquad z\in[-L,L],\qquad \varphi\in[0,2\pi). \tag{1}$$
The 3D object under analysis lies within [−L, L] in the z direction and within [0, 2π) in the azimuthal φ direction (see Fig. 1). The projected linear fringes are aimed towards the 3D surface ρ(z,φ) with a phase-sensitivity angle θ0 (see Fig. 1(a)). Then the object is rotated N times by an incremental azimuthal angle Δφ = 2π/N (see Fig. 1(a)). For each increment Δφ = 2π/N one collects the pixels at the center column of the CCD (x = 0, z). In this way one generates an image composed of N lines in the φ direction, each having the CCD’s pixel-column size in the z range [−L, L]. Therefore the composed spatial-carrier fringe pattern is,
$$I(z,\varphi)=a(z,\varphi)+b(z,\varphi)\cos[\omega_0 z+g\,\rho(z,\varphi)];\qquad \omega_0=\nu_0\cos(\theta_0),\quad g=\nu_0\tan(\theta_0). \tag{2}$$
Here ν0 is the spatial carrier of the projected fringes; assuming telecentric illumination and imaging, the spatial carrier at the CCD plane is ω0 = ν0 cos(θ0), and the phase sensitivity of the profilometer is g = ν0 tan(θ0). Finally, the phase gρ(z,φ) in Eq. (2) is usually obtained using the Fourier carrier-frequency technique [1].
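As an illustration of the carrier-frequency approach [1] (not the method proposed here), the following numpy sketch demodulates one synthetic fringe line of the form of Eq. (2) by band-pass filtering the +1 spectral lobe around the carrier; the grid size, carrier frequency and test phase are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch of the Fourier carrier-frequency technique [1]
# applied to one synthetic fringe line of the form of Eq. (2).
# Grid size, carrier frequency and test phase are assumed values.
Nz = 512
n = np.arange(Nz)
carrier = 2 * np.pi * 40 / Nz                       # 40 carrier fringes per column
g_rho = 3.0 * np.exp(-((n - Nz / 2) / 80.0) ** 2)   # smooth test phase g*rho
I = 128 + 100 * np.cos(carrier * n + g_rho)

# Band-pass the +1 spectral lobe around the carrier frequency
k = 2 * np.pi * np.fft.fftfreq(Nz)                  # angular frequency axis
F = np.fft.fft(I)
analytic = np.fft.ifft(F * (np.abs(k - carrier) < carrier / 2))

# Shift the lobe to base-band and take the angle -> demodulated phase
recovered = np.angle(analytic * np.exp(-1j * carrier * n))
```

Note that only the spectral lobe near the carrier (in practice about one third of the available spectrum) can be used here, which is exactly the bandwidth limitation that the base-band set-up of this paper avoids.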

On the other hand a mathematical model for light-stripe profilometry may be (Fig. 1(b)),

$$I(x,z,\varphi)=\delta[x-g\,\rho(z,\varphi)];\qquad g=\tan(\varphi_0). \tag{3}$$
This Dirac delta is phase-modulated by the 3D object surface ρ(z,φ). For each discrete rotation Δφ = 2π/N, the light-stripe center at x = gρ(z,φ) is estimated by image-intensity (not phase-measuring) algorithms, and it is proportional to the 3D object’s topography. The main advantage of using a light stripe is that many solids are fully illuminated (without self-generated shadows) from their highest point z = +L to their lowest one z = −L. This full solid illumination is frequently impossible with the configuration in Fig. 1(a), because the elevation angle of the grating projector easily casts occluding shadows on the lower part of the 3D solid under analysis, precluding its digitization.
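For contrast with phase measuring, a minimal sketch of intensity-based stripe-center detection for the model of Eq. (3) might look as follows; the ideal Dirac stripe is replaced here by a Gaussian-blurred stripe of assumed width, and the center x = gρ is estimated per CCD row by a simple intensity centroid (profile, widths and the sphere-like test surface are illustrative assumptions).

```python
import numpy as np

# Sketch of intensity-based stripe-center detection (model of Eq. (3)).
# The Dirac stripe is blurred to a Gaussian of assumed width; the center
# x = g*rho(z) is estimated per CCD row from intensity alone (centroid).
Nx, Nz = 256, 200
x = np.arange(Nx)
z = np.linspace(-1.0, 1.0, Nz)
g = 60.0
rho = np.sqrt(1.0 - z ** 2)                  # sphere-like test profile
center_true = Nx / 2 + g * rho               # stripe center per row
I = np.exp(-((x[None, :] - center_true[:, None]) / 3.0) ** 2)

# Row-wise centroid estimate of the stripe center
center_est = (I * x[None, :]).sum(axis=1) / I.sum(axis=1)
```

The attainable accuracy of such estimators is set by the intensity profile and the camera noise, which is one reason phase-measuring demodulation is generally more accurate.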

3. Proposed improved profilometer for 360 degree solid digitizing

The improved 360-degree solid profilometry set-up is shown in Fig. 2. With this experimental set-up one maintains the higher phase resolution of interferometric phase-measuring methods while eliminating most self-generated shadows cast by the 3D solid (see Fig. 1(a)). The 3D surfaces that can be digitized with our 360-degree profilometer belong to the continuous, single-valued function space C1 in cylindrical coordinates (ρ,φ,z),

$$\rho=\rho(z,\varphi),\qquad \rho(z,\varphi)\in C^1,\qquad \rho=\sqrt{x^2+y^2},\qquad z\in[-L,L],\qquad \varphi\in[0,2\pi). \tag{4}$$
Any solid that can be enclosed by a 3D surface within the function space ρ(z,φ) ∈ C1 may be digitized with our 360-degree profilometer; this function space includes as subsets the quasi-cylindrical surfaces ρ(z,φ) ∈ [R, (1+ε)R], (R > 0, ε ≪ 1), as well as any topologically convex solid. In Fig. 2, the CCD sensor is assumed to be parallel to the (x,z) plane. Therefore the 3D surface imaged over the CCD plane has the following mathematical form,
$$I(x,z)=a(x,z)+b(x,z)\cos[\omega_0 x+g\,\rho(x,z)];\qquad \varphi=0,\quad \omega_0=\nu_0\cos(\varphi_0),\quad g=\nu_0\tan(\varphi_0). \tag{5}$$
But we are only interested in the centered (x = 0) CCD column pixels I(0,z). In this way, for a full azimuthal rotation φ ∈ [0, 2π), one collects N CCD columns into I(0,z,φ) as,


Fig. 2 Proposed 4π steradian profilometer for 3D surfaces within the space ρ(z,φ)∈C1. The projected linear grating is, in this case, along the z axis and orthogonal to that of previous 360-degree fringe-projection profilometers (Fig. 1(a)). As for the light stripe (Fig. 1(b)), the phase sensitivity of this 360-degree test is proportional to tan(φ0). This angle (φ0) must be kept large enough to increase the sensitivity of the test, while avoiding lateral self-occluding shadowing.


$$I(0,z,\varphi)=a(0,z,\varphi)+b(0,z,\varphi)\cos[\omega_0\cdot 0+g\,\rho(0,z,\varphi)];\qquad g=\nu_0\tan(\varphi_0),\quad \varphi\in[0,2\pi). \tag{6}$$

From this equation, one may drop the mathematically “dummy” variable x = 0 obtaining,

$$I(z,\varphi)=a(z,\varphi)+b(z,\varphi)\cos[g\,\rho(z,\varphi)];\qquad x=0,\quad z\in[-L,L],\quad \varphi\in[0,2\pi). \tag{7}$$
Equation (7) shows that the resulting fringe pattern has no spatial carrier, because ω0 x = ω0 · 0 = 0. Therefore we need several phase-shifted fringe patterns to demodulate the phase gρ(z,φ), ρ(z,φ) ∈ C1.

To give an intuitive idea of the phase-shifted fringe patterns generated by our profilometer, assume that a sphere with radius L is being digitized. The sphere in cylindrical coordinates (ρ,φ,z) is,

$$\rho(z,\varphi)=\sqrt{L^2-z^2};\qquad z\in[-L,L],\quad \varphi\in[0,2\pi),\quad \rho(z,\varphi)\in C^1. \tag{8}$$
Assuming telecentric projection and imaging optical systems, the carrier-frequency fringe pattern at the CCD camera parallel to the (x,z) plane is shown in Fig. 3(a). The phase-shifted fringe patterns in Figs. 3(b), 3(c), and 3(d) correspond to,
$$I(z,\varphi,n)=a(z,\varphi)+b(z,\varphi)\cos\!\left[g\sqrt{L^2-z^2}+n\,\frac{2\pi}{3}\right];\qquad z\in[-L,L],\quad \varphi\in[0,2\pi),\quad n\in\{0,1,2\}. \tag{9}$$
Phase-shifting demodulation of these 3 fringe patterns gives the digitized sphere’s phase in cylindrical coordinates; an element of the function space ρ(z,φ) ∈ C1.
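The sphere example of Eqs. (8) and (9) can be simulated and demodulated in a few lines. In the sketch below the grid sizes, background a, modulation b and sensitivity g are illustrative assumptions, and the 3-step least-squares formula is written with the conjugate phase-step kernel (the sign convention follows the sign of the applied phase steps).

```python
import numpy as np

# Simulate the three phase-shifted, carrier-free fringe patterns of
# Eq. (9) for a sphere, then demodulate with the 3-step least-squares
# algorithm. L, g, a, b and the grid sizes are assumed for illustration.
L, g = 1.0, 5.0
z = np.linspace(-0.99, 0.99, 480)[:, None]    # stay away from the poles z = +/-L
phi = np.zeros((1, 400))                      # g*rho is phi-independent for a sphere
g_rho = g * np.sqrt(L ** 2 - z ** 2) + phi    # phase map, shape (480, 400)

shifts = 2 * np.pi * np.arange(3) / 3
I = [100 + 80 * np.cos(g_rho + s) for s in shifts]

# 3-step least-squares demodulation: sum_n I_n exp(-i n 2pi/3) ~ exp(i g rho)
analytic = sum(In * np.exp(-1j * s) for In, s in zip(I, shifts))
wrapped = np.angle(analytic)
unwrapped = np.unwrap(wrapped, axis=0)        # unwrap along the z direction
```

The background term cancels because the three unit phasors sum to zero, leaving an analytic signal proportional to exp(igρ), whose angle is the wrapped phase.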


Fig. 3 Panel (a) shows the spatial linear fringes along the z direction (Fig. 2) projected over the simulated sphere (Eq. (8)) as imaged over the CCD (x,z) plane. Panels (b), (c) and (d) show the fringe patterns obtained when N lines at (x = 0, z) over the CCD are collected while the sphere is rotated 2π radians along the coordinate φ ∈ [0, 2π) (Eq. (9)). The phase shift among these 3 fringe patterns (see Eq. (9)) is 2π/3 radians, and they are phase-demodulated using a 3-step least-squares phase-shifting algorithm to obtain gρ(z,φ).


4. Experimental results

Here we show the experimental results for our proposed 360-degree profilometry technique. As the test 3D surface ρ(z,φ) we have chosen a (white-painted) Rubik’s cube mounted on its triangular base, for its sharp angles and high-frequency surface structure, which clearly reveal the accuracy, high resolution and low noise of the proposed 360-degree digitizing technique.

We start by showing a white-light photograph of our Rubik’s cube in Fig. 4(a). In Fig. 4(b) we show the same cube illuminated with the linear grating along the z coordinate. The projection and imaging optical systems are almost telecentric, and the camera’s CCD is parallel to the (x,z) plane.


Fig. 4 In panel (a) we show the photograph of the (white-painted) Rubik’s cube used as 3D test-object. Panel (b) shows the cube with the projected fringes along the z direction.


Figures 5(a) and 5(b) show two (out of 4) fringe patterns phase-modulated by the Rubik’s cube’s cylindrical-coordinate representation ρ(z,φ). Figure 5(a) has a phase shift of 0 radians, while Fig. 5(b) has a phase shift of π radians. The fact that the camera and the fringe projector both have their optical axes aimed at the middle of the Rubik’s cube permits an entire (shadow-free) digitization of the cube’s 3D surface ρ(z,φ) (see Fig. 2). In general, if the sensitivity angle (φ0) is not too high and the 3D surface belongs to ρ(z,φ) ∈ C1, no self-generated shadows from the object under analysis are cast over the camera view, as happens in previous grating-projection configurations (see Fig. 1(a)).


Fig. 5 Panels (a) and (b) show two (out of 4) phase-shifted, closed-fringe patterns of the Rubik’s cube in cylindrical coordinates (see Fig. 2). These two closed-fringe patterns in (a) and (b) have 0 and π radian phase shifts respectively (the ones with π/2 and 3π/2 phase shifts are not shown). Panel (c) shows the wrapped phase gρ(z,φ) obtained by Eq. (10). Finally, panel (d) shows the unwrapped phase of the cylindrical-coordinate representation of the Rubik’s cube, gρ(z,φ).


Figure 5(c) shows the wrapped phase obtained according to the following least-squares 4-step phase-shifting algorithm [16],

$$A\,e^{i g\rho(z,\varphi)}=I(z,\varphi,0)+e^{i\pi/2}\,I(z,\varphi,1)+e^{i\pi}\,I(z,\varphi,2)+e^{i3\pi/2}\,I(z,\varphi,3). \tag{10}$$
Figure 5(d) shows the unwrapped phase of the Rubik’s cube in cylindrical coordinates, gρ(z,φ). The unwrapping of Fig. 5(c) was done by basic line integration of wrapped phase differences. Finally, Fig. 6 shows three different perspectives of the 360-degree digitized Rubik’s cube ρ(z,φ). As Fig. 6 shows, the Rubik’s cube is obtained with low phase noise and high spatial resolution, which translates into more visible surface details.
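The "line integration of wrapped phase differences" used above is simple enough to state in full. The following sketch (with a synthetic test ramp as an assumed input) shows the 1D version applied to each column; it is equivalent to numpy's built-in unwrap.

```python
import numpy as np

# Line-integration phase unwrapping: re-wrap the differences of the
# wrapped phase, then cumulatively sum them (1D, per CCD column).
def wrap(p):
    return np.angle(np.exp(1j * p))          # wrap angles into (-pi, pi]

def unwrap_line(wrapped):
    d = wrap(np.diff(wrapped))               # wrapped phase differences
    return wrapped[0] + np.concatenate(([0.0], np.cumsum(d)))

# Quick check on a synthetic phase ramp spanning many 2*pi cycles
true_phase = np.linspace(0.0, 20.0, 500)
recovered = unwrap_line(wrap(true_phase))
```

This unwrapper is valid only when the true phase changes by less than π between neighboring samples, a condition our low-noise fringes satisfy.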


Fig. 6 This figure shows 3 perspectives of the 4π steradian (360-degree) digitized Rubik’s cube.


Of course, higher spatial-frequency surface reconstruction is possible by increasing the azimuthal resolution Δφ = 2π/N and the spatial CCD pixel resolution. Lower phase-noise demodulation is also possible by increasing the number M of phase-stepped images. Figure 6 clearly shows interesting high-frequency surface details of the Rubik’s cube.

Because the fringe patterns obtained have no spatial carrier (see Eq. (7)), we may use the full spatial Fourier spectrum to reconstruct fine details of the 3D cube's surface. In spatial-carrier interferometry one has at most half, but in practice about one third, of the available Fourier frequency space to house the 3D surface spectrum. In contrast, having closed-fringe (base-band) fringe patterns (Figs. 5(a) and 5(b)), one has the full spectral bandwidth to recover fine surface details. This high-frequency surface reconstruction (Fig. 6) cannot be seen in previously published 360-degree approaches, which render very smooth digital solid surfaces [2–15]. At the risk of becoming repetitive, we emphasize that using base-band (closed-fringe) fringe patterns allows one to reconstruct the digitized 3D surfaces with the full theoretical spatial bandwidth that the raw digitized fringe data can hold.

5. Discussion of the advantages and limitations of the proposed 360-degree profilometer

During the peer-review process we received many interesting questions from our reviewers, and we feel it is worth answering them within the paper because they clarify many interesting points not covered, or perhaps not fully explained, so far.

  • a) Why is this a low-noise 360-degree profilometry technique? This 360-degree profilometer has low phase noise because it uses base-band, phase-measuring, M-step demodulation algorithms. If additive white Gaussian noise corrupts the fringes and least-squares M-step phase-shifting algorithms are used (as in this paper), we obtain an analytic signal with a noise-power reduction of 1/M with respect to the noise power of the digitized fringes [16]. In our case the analytic signal in Eq. (10) has a noise-power reduction of 1/4 with respect to the additive noise power of each phase-stepped fringe image I(z,φ,n), n ∈ {0,1,2,3}.
  • b) How many digital CCD images are needed to obtain each phase-stepped cylindrical fringe image I(z,φ,n) of the solid? We need N CCD images to assemble N central CCD lines at x = 0 into a single fringe image I(z,φ,n). In our Rubik’s cube experiment, for each rotation we took 400 CCD lines per phase-stepped fringe image I(z,φ,n). The azimuthal resolution increment was Δφ = 2π/400.
  • c) Which unwrapping algorithm was used? Given the low noise of the projected fringes and the 4-step least-squares phase-shifting demodulation used, we employed the most basic phase-unwrapping algorithm: line integration of wrapped phase differences.
  • d) How long does it take to capture a single phase-stepped fringe pattern as in Figs. 5(a) or 5(b)? The turntable was controlled by a stepping motor with an incremental angle Δφ = 2π/400. Each full turntable rotation takes about 2 seconds, during which we grab 400 central CCD pixel lines at x = 0. So capturing the 4 phase-stepped fringe-pattern images takes about 8 seconds.
  • e) What is the resolution of each phase-stepped cylindrical fringe image I(z,φ,n) of the Rubik’s cube in Fig. 5? The angular and spatial resolution of each image I(z,φ,n) in this paper is z ∈ {1, 2, 3, …, 480} and φ ∈ {0, Δφ, 2Δφ, …, 399Δφ}, with Δφ = 2π/400 and a CCD camera with 640x480 pixels.
  • f) Why is this 360-degree method high resolution? The spatial 3D surface resolution is only limited by the CCD's vertical resolution, the azimuthal angular resolution Δφ = 2π/N of the stepped turntable motor, and the mechanical stability of the turntable. In our Rubik's cube experiment the spatial resolution was (480,400) surface pixels representing ρ(z,φ) in cylindrical coordinates.
  • g) Self-generated shadow-free digitization is only possible for topologically convex solids, isn’t it? Topologically convex 3D surfaces form a proper subset of ρ(z,φ) ∈ C1. Any solid enclosed by a single-valued 3D surface ρ(z,φ) ∈ C1 (see Eq. (4)) may be digitized using the proposed 360-degree technique. The 3D Rubik’s cube in Fig. 4 is not entirely convex because of its triangular mounting base: there are points on the 3D surface of the cube-base compound which cannot be joined by a straight line lying entirely within this composed solid (the definition of convexity). The Rubik's cube-plus-base compound is a single-valued ρ(z,φ) ∈ C1 surface except for the bottom of the triangular base, which could not be digitized.
  • h) Why are quasi-cylindrical objects easier to digitize? In the extreme case of a 3D cylinder, ρ(z,φ) = R, z ∈ [−L, L], φ ∈ [0, 2π), a single CCD line at x = 0 would define the entire cylinder for every azimuthal angle φ. Continuous quasi-cylindrical objects ρ(z,φ) ∈ [R, (1+ε)R] (R > 0, ε ≪ 1) are always easier to digitize, and they form a small subset of the broader function space ρ(z,φ) ∈ C1.
  • i) What is the most basic mathematical condition on the 3D surfaces that this method can digitize? As we said, the basic mathematical assumption is that the 3D boundary surface must be a continuous, single-valued function ρ(z,φ) ∈ C1, z ∈ [−L, L], φ ∈ [0, 2π). For example, a coffee mug's 3D boundary surface is not within the space ρ(z,φ) ∈ C1. That is because at the handle the mug has two-valued cylindrical radii, ρ(z1,φ1) = ρ1 and ρ(z1,φ1) = ρ2; one surface (i.e. ρ(z1,φ1) = ρ1) occludes the other. Digitizing multi-valued cylindrical functions ρ(z,φ), such as donut-like genus-1 solids, is impossible with the proposed profilometer. One would necessarily need multiple camera perspectives, not a single one.
  • j) Why is it claimed that the proposed projection-camera system is new? Isn't it the one used in standard single stripe-line 360-degree profilometry? That is true; however, our system uses more robust, less noisy, coherent phase-stepping demodulation algorithms while maintaining the shadow-free (for ρ(z,φ) ∈ C1) advantages of the projector-camera geometry of single-line-projection 360-degree profilometry.
  • k) Is fringe-image fusion from different perspectives needed? Our method uses a straightforward, trivial line-by-line raw fringe-data assembly to obtain the digitized 3D surface ρ(z,φ) ∈ C1 within a single cylindrical coordinate system (ρ,z,φ), so no complex image fusion is needed. This is an advantage with respect to some multi-view 360-degree profilometers that must fuse or blend different fringe-image perspectives residing in different coordinate systems into a common coordinate system using iterative closest point (ICP) algorithms (e.g. Geomagic). The approach proposed herein measures each object point only once; thereby the fusion problem between different fringe-pattern perspectives is eliminated, and the generation of the final 3D surface ρ(z,φ) ∈ C1 is very easy.
  • l) How should one understand x = 0 in Eq. (7)? The pixel-line coordinates given by the line (x = 0, z) correspond to the central CCD column. We keep this line for each azimuthal increment Δφ = 2π/N. As mentioned, we then assemble, on a line-by-line basis, N CCD columns with coordinates (x = 0, z, φ), for z ∈ [−L, L] and φ ∈ [0, 2π), to obtain each phase-shifted fringe pattern I(z,φ,n) in cylindrical coordinates (see Figs. 5(a) and 5(b)).
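The 1/M noise-power claim in (a) can be checked numerically. The sketch below (all signal and noise values are illustrative assumptions) demodulates a noisy M-step fringe for several M and compares the measured phase-error variance against the first-order prediction 2σ²/(M b²), which decays as 1/M.

```python
import numpy as np

# Monte-Carlo check of the 1/M phase-noise-power reduction of least-
# squares M-step phase shifting under additive white Gaussian noise.
rng = np.random.default_rng(1)
a, b, sigma = 100.0, 80.0, 2.0      # background, modulation, noise std (assumed)
true_phase = 0.7                    # test phase value in radians (assumed)
trials = 20000

for M in (3, 4, 8):
    shifts = 2 * np.pi * np.arange(M) / M
    I = a + b * np.cos(true_phase + shifts) \
        + sigma * rng.standard_normal((trials, M))
    analytic = (I * np.exp(-1j * shifts)).sum(axis=1)
    phase_err = np.angle(analytic * np.exp(-1j * true_phase))
    # first-order prediction: var ~ 2*sigma^2 / (M*b^2), i.e. it shrinks as 1/M
    print(M, phase_err.var(), 2 * sigma ** 2 / (M * b ** 2))
```

The prediction follows from projecting the complex noise of the analytic signal (variance Mσ²/2 per quadrature) onto the direction perpendicular to the signal phasor of magnitude Mb/2.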

Finally, as far as we know, this is the first time that a precise mathematical definition is given of the functional space in which the solids' bounding surfaces must reside. This functional space is the continuous, single-valued space ρ(z,φ) ∈ C1, which includes all topologically convex solids as a proper subset. However, the space ρ(z,φ) ∈ C1 fails to include more complex 3D surfaces, such as the boundary of a coffee mug, because it does not include boundary surfaces that are multi-valued in cylindrical coordinates; a coffee mug is a topological genus-1 3D surface.

6. Conclusions

Here we have presented a new experimental set-up, along with its theoretical analysis, for 360-degree digitizing of solids. We have seen that whenever the 3D enclosing surface of the digitized solid is within the single-valued function space ρ(z,φ) ∈ C1, it can be digitized accurately and without self-occluding shadows. We chose as the experimental test solid a Rubik’s cube positioned on its triangular base, for its sharp corners and high-frequency surface details. As seen from the experimental results (Figs. 4, 5 and 6), we are capable of digitizing most of the non-convex Rubik’s cube-plus-base compound without self-generated shadows, because the CCD camera and the linear-grating projector both have their optical axes crossing the center of the cube-base compound (Fig. 2). In contrast, previous grating approaches (see Fig. 1(a)) often generate self-occluding shadows even for convex solids, precluding the analysis of the lowest parts of the object. On the other hand, stripe-light triangulation estimation precludes the use of more accurate phase-demodulation techniques, such as the least-squares phase-stepping algorithm used in this work. Finally, having base-band fringe patterns (Figs. 5(a) and 5(b)), one keeps the full spatial bandwidth of the raw digitized data for 3D-surface digital reconstruction. This allows visualizing higher-frequency 3D-surface details. In other words, closed-fringe pattern images allow one to analyze the digitized solids with the highest possible surface bandwidth available from the raw fringe data.

Acknowledgments

The authors would like to acknowledge the financial support of project 177044, issued by the Mexican Consejo Nacional de Ciencia y Tecnologia (CONACYT).

References and links

1. M. Takeda, H. Ina, and S. Kobayashi, “Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry,” J. Opt. Soc. Am. 72, 156–160 (1982).

2. M. Halioua, R. S. Krishnamurthy, H. C. Liu, and F. P. Chiang, “Automated 360 ° profilometry of 3-D diffuse objects,” Appl. Opt. 24(14), 2193–2196 (1985). [CrossRef]   [PubMed]  

3. X. X. Cheng, X. Y. Su, and L. R. Guo, “Automated measurement method for 360 ° profilometry of 3-D diffuse objects,” Appl. Opt. 30(10), 1274–1278 (1991). [CrossRef]   [PubMed]  

4. A. K. Asundi, “360-deg profilometry: new techniques for display and acquisition,” Opt. Eng. 33(8), 2760–2769 (1994). [CrossRef]  

5. M. Chang and W. C. Tai, “360-deg profile noncontact measurement using a neural network,” Opt. Eng. 34(12), 3572–3576 (1995). [CrossRef]  

6. A. S. Gomes, L. A. Serra, A. S. Lage, and A. Gomes, “Automated 360° degree profilometry of human trunk for spinal deformity analysis,” in Proceedings of Three Dimensional Analysis of Spinal Deformities, M. Damico et al. eds., (IOS, Burke, 1995), pp. 423–429.

7. Y. Song, H. Zhao, W. Chen, and Y. Tan, “360 degree 3D profilometry,” Proc. SPIE 3204, 204–208 (1997). [CrossRef]  

8. A. Asundi and W. Zhou, “Mapping algorithm for 360-deg profilometry with time delayed integration imaging,” Opt. Eng. 38(2), 339–344 (1999). [CrossRef]  

9. X. Su and W. Chen, “Fourier transform profilometry,” Opt. Lasers Eng. 35(5), 263–284 (2001). [CrossRef]  

10. X. Zhang, P. Sun, and H. Wang, “A new 360 rotation profilometry and its application in engine design,” Proc. SPIE 4537, 265–268 (2002). [CrossRef]  

11. J. A. Munoz-Rodriguez, A. Asundi, and R. Rodriguez-Vera, “Recognition of a light line pattern by Hu moments for 3-D reconstruction of a rotated object,” Opt. Laser Technol. 37(2), 131–138 (2005). [CrossRef]  

12. G. Trujillo-Schiaffino, N. Portillo-Amavisca, D. P. Salas-Peimbert, L. Molina-de la Rosa, S. Almazan-Cuellar, and L. F. Corral-Martinez, “Three-dimensional profilometry of solid objects in rotation,” AIP Conf. Proc. 992, 924–928 (2008).

13. B. Shi, B. Zhang, F. Liu, J. Luo, and J. Bai, “360° Fourier transform profilometry in surface reconstruction for fluorescence molecular tomography,” IEEE J Biomed Health Inform 17(3), 681–689 (2013). [CrossRef]   [PubMed]  

14. Y. Zhang and G. Bu, “Automatic 360-deg profilometry of a 3D object using a shearing interferometer and virtual grating,” Proc. SPIE 2899, 162–169 (1996). [CrossRef]  

15. Z. Zhang, H. Ma, S. Zhang, T. Guo, C. E. Towers, and D. P. Towers, “Simple calibration of a phase-based 3D imaging system based on uneven fringe projection,” Opt. Lett. 36(5), 627–629 (2011). [CrossRef]   [PubMed]  

16. M. Servin and J. C. Estrada, “Analysis and synthesis of phase shifting algorithms based on linear systems theory,” Opt. Lasers Eng. 50(8), 1009–1014 (2012). [CrossRef]  

References

  • View by:
  • |
  • |
  • |

  1. M. Takeda, H. Ina, and S. Kobayashi, “Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry,” J. Opt. Soc. Am. A 72, l56–l60 (1982).
  2. M. Halioua, R. S. Krishnamurthy, H. C. Liu, and F. P. Chiang, “Automated 360 ° profilometry of 3-D diffuse objects,” Appl. Opt. 24(14), 2193–2196 (1985).
    [Crossref] [PubMed]
  3. X. X. Cheng, X. Y. Su, and L. R. Guo, “Automated measurement method for 360 ° profilometry of 3-D diffuse objects,” Appl. Opt. 30(10), 1274–1278 (1991).
    [Crossref] [PubMed]
  4. A. K. Asundi, “360-deg profilometry: new techniques for display and acquisition,” Opt. Eng. 33(8), 2760–2769 (1994).
    [Crossref]
  5. M. Chang and W. C. Tai, “360-deg profile noncontact measurement using a neural network,” Opt. Eng. 34(12), 3572–3576 (1995).
    [Crossref]
  6. A. S. Gomes, L. A. Serra, A. S. Lage, and A. Gomes, “Automated 360° degree profilometry of human trunk for spinal deformity analysis,” in Proceedings of Three Dimensional Analysis of Spinal Deformities, M. Damico et al. eds., (IOS, Burke, 1995), pp. 423–429.
  7. Y. Song, H. Zhao, W. Chen, and Y. Tan, “360 degree 3D profilometry,” Proc. SPIE 3204, 204–208 (1997).
    [Crossref]
  8. A. Asundi and W. Zhou, “Mapping algorithm for 360-deg profilometry with time delayed integration imaging,” Opt. Eng. 38(2), 339–344 (1999).
    [Crossref]
  9. X. Su and W. Chen, “Fourier transform profilometry,” Opt. Lasers Eng. 35(5), 263–284 (2001).
    [Crossref]
  10. X. Zhang, P. Sun, and H. Wang, “A new 360 rotation profilometry and its application in engine design,” Proc. SPIE 4537, 265–268 (2002).
    [Crossref]
  11. J. A. Munoz-Rodriguez, A. Asundi, and R. Rodriguez-Vera, “Recognition of a light line pattern by Hu moments for 3-D reconstruction of a rotated object,” Opt. Laser Technol. 37(2), 131–138 (2005).
    [Crossref]
  12. G. Tmjillo-Schiaffino, N. Portillo-Amavisca, D. P. Salas-Peimbert, L. Molina-de la Rosa, S. Almazan-Cuellarand, and L. F. Corral-Martinez, “Three-dimensional profilometry of solid objects in rotation,” in AIP Proceedings 992, 924–928 (2008).
  13. B. Shi, B. Zhang, F. Liu, J. Luo, and J. Bai, “360° Fourier transform profilometry in surface reconstruction for fluorescence molecular tomography,” IEEE J Biomed Health Inform 17(3), 681–689 (2013).
    [Crossref] [PubMed]
  14. Y. Zhang and G. Bu, “Automatic 360-deg profilometry of a 3D object using a shearing interferometer and virtual grating,” Proc. SPIE 2899, 162–169 (1996).
    [Crossref]
  15. Z. Zhang, H. Ma, S. Zhang, T. Guo, C. E. Towers, and D. P. Towers, “Simple calibration of a phase-based 3D imaging system based on uneven fringe projection,” Opt. Lett. 36(5), 627–629 (2011).
    [Crossref] [PubMed]
  16. M. Servin and J. C. Estrada, “Analysis and synthesis of phase shifting algorithms based on linear systems theory,” Opt. Lasers Eng. 50(8), 1009–1014 (2012).
    [Crossref]


Figures (6)

Fig. 1 Panel (a) shows the well-known set-up in which a linear-grating projector is used. The linear grating is orthogonal to the z direction, and its phase sensitivity is proportional to tan(θ0). Panel (b) shows the experimental set-up for a light stripe projected towards the solid under analysis. In the light-stripe test (panel (b)) the phase sensitivity is proportional to tan(φ0).
Fig. 2 Proposed 4π-steradian profilometer for 3D surfaces within the space of ρ(z,φ)∈C1. The projected linear grating is, in this case, along the z axis, orthogonal to that of previous 360-degree fringe-projection profilometers (Fig. 1(a)). As for the light stripe (Fig. 1(b)), the phase sensitivity of this 360-degree test is proportional to tan(φ0). The angle φ0 must be kept large enough to increase the sensitivity of the test while avoiding lateral self-occluding shadowing.
Fig. 3 Panel (a) shows the spatial linear fringes along the z direction (Fig. 2) projected over the simulated sphere (Eq. (8)) as imaged over the CCD (x,z) plane. Panels (b), (c) and (d) show the fringe patterns obtained when N lines at (x = 0, z) of the CCD are collected while the sphere is rotated 2π radians along the coordinate φ∈[0,2π) (Eq. (9)). The phase shift among these 3 fringe patterns (see Eq. (9)) is 2π/3 radians, and they are phase-demodulated using a 3-step least-squares phase-shifting algorithm to obtain ρ(z,φ).
Fig. 4 Panel (a) shows a photograph of the (white-painted) Rubik’s cube used as the 3D test object. Panel (b) shows the cube with the projected fringes along the z direction.
Fig. 5 Panels (a) and (b) show two (out of 4) phase-shifted, closed-fringe patterns of the Rubik’s cube in cylindrical coordinates (see Fig. 2). The closed-fringe patterns in (a) and (b) have 0 and π radians of phase shift, respectively (the ones with π/2 and 3π/2 phase shift are not shown). Panel (c) shows the wrapped phase gρ(z,φ) obtained by Eq. (10). Finally, panel (d) shows the unwrapped phase of the cylindrical-coordinate representation of the Rubik’s cube, ρ(z,φ).
Fig. 6 Three perspectives of the 4π-steradian (360-degree) digitized Rubik’s cube.
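The demodulation-and-unwrapping pipeline summarized in Figs. 3 and 5 can be sketched numerically. The sketch below is illustrative only: it demodulates four π/2-stepped fringe patterns with a 4-step least-squares sum (the form of Eq. (10)) and unwraps the result along z. The smooth test phase is a synthetic stand-in for gρ(z,φ), not the Rubik's-cube data, and all numerical values are made up for the example.

```python
import numpy as np

# Synthetic stand-in for the modulating phase g*rho(z,phi); smooth (C^1)
# and spanning several multiples of 2*pi so that unwrapping is exercised.
z = np.linspace(-1.0, 1.0, 256)
phi = np.linspace(0.0, 2*np.pi, 256, endpoint=False)
Z, PHI = np.meshgrid(z, phi, indexing="ij")
psi = 8*np.pi*np.exp(-Z**2)*(1 + 0.1*np.cos(PHI))

# Four fringe patterns with pi/2 phase steps (cf. Fig. 5(a)-(b)).
I = [1.0 + 0.7*np.cos(psi + n*np.pi/2) for n in range(4)]

# 4-step least-squares sum; the steps are conjugated here so that the
# angle of the result equals +psi wrapped to (-pi, pi] (cf. Fig. 5(c)).
analytic = sum(In*np.exp(-1j*n*np.pi/2) for n, In in enumerate(I))
wrapped = np.angle(analytic)

# Unwrap along z (cf. Fig. 5(d)), then remove the residual per-column
# 2*pi offset so the result can be compared with the true phase.
unwrapped = np.unwrap(wrapped, axis=0)
unwrapped += np.round((psi[0, :] - unwrapped[0, :])/(2*np.pi))*2*np.pi
```

Because the per-sample phase increment along z stays below π radians, `np.unwrap` recovers the continuous phase up to a constant 2π multiple per column, which the last line removes.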

Equations (10)


\[ \rho=\rho(z,\varphi),\qquad \rho=\sqrt{x^{2}+y^{2}},\qquad z\in[-L,L],\quad \varphi\in[0,2\pi). \tag{1} \]
\[ I(z,\varphi)=a(z,\varphi)+b(z,\varphi)\cos\!\left[\omega_{0}z+g\,\rho(z,\varphi)\right];\qquad \omega_{0}=v_{0}\cos(\theta_{0}),\quad g=v_{0}\tan(\theta_{0}). \tag{2} \]
\[ I(x,z,\varphi)=\delta\!\left[x-g\,\rho(z,\varphi)\right];\qquad g=\tan(\varphi_{0}). \tag{3} \]
\[ \rho=\rho(z,\varphi),\quad \rho(z,\varphi)\in C^{1},\quad \rho=\sqrt{x^{2}+y^{2}},\quad z\in[-L,L],\quad \varphi\in[0,2\pi). \tag{4} \]
\[ I(x,z)=a(x,z)+b(x,z)\cos\!\left[\omega_{0}x+g\,\rho(x,z)\right];\qquad \varphi=0,\quad \omega_{0}=v_{0}\cos(\varphi_{0}),\quad g=v_{0}\tan(\varphi_{0}). \tag{5} \]
\[ I(0,z,\varphi)=a(0,z,\varphi)+b(0,z,\varphi)\cos\!\left[\omega_{0}\cdot 0+g\,\rho(0,z,\varphi)\right];\qquad g=v_{0}\tan(\varphi_{0}),\quad \varphi\in[0,2\pi). \tag{6} \]
\[ I(z,\varphi)=a(z,\varphi)+b(z,\varphi)\cos\!\left[g\,\rho(z,\varphi)\right];\qquad x=0,\quad z\in[-L,L],\quad \varphi\in[0,2\pi). \tag{7} \]
\[ \rho(z,\varphi)=\sqrt{L^{2}-z^{2}};\qquad z\in[-L,L],\quad \varphi\in[0,2\pi),\quad \rho(z,\varphi)\in C^{1}. \tag{8} \]
\[ I(z,\varphi,n)=a(z,\varphi)+b(z,\varphi)\cos\!\left[g\sqrt{L^{2}-z^{2}}+n\frac{2\pi}{3}\right];\qquad z\in[-L,L],\quad \varphi\in[0,2\pi),\quad n\in\{0,1,2\}. \tag{9} \]
\[ A\,e^{\,i g\rho(z,\varphi)}=I(z,\varphi,0)+e^{\,i(2\pi/4)}I(z,\varphi,1)+e^{\,i\pi}I(z,\varphi,2)+e^{\,i(3\cdot 2\pi/4)}I(z,\varphi,3). \tag{10} \]
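For concreteness, the simulated-sphere experiment of Eqs. (8)–(9) and its 3-step least-squares demodulation (Fig. 3) can be sketched as follows. The values of L, g and the grid sizes are illustrative choices for this sketch, not the paper's experimental parameters.

```python
import numpy as np

# Sphere of Eq. (8) in cylindrical coordinates, sampled on a (z, phi) grid;
# 400 angular samples mimic the 400 steps/rotation turntable.
L, g = 1.0, 15.0
z = np.linspace(-L, L, 480)                            # CCD line coordinate
phi = np.linspace(0.0, 2*np.pi, 400, endpoint=False)   # turntable angle
Z, _ = np.meshgrid(z, phi, indexing="ij")

rho = np.sqrt(np.maximum(L**2 - Z**2, 0.0))            # Eq. (8)
a, b = 1.0, 0.8                                        # background, contrast

# Eq. (9): three fringe patterns phase-shifted by 2*pi/3.
I = [a + b*np.cos(g*rho + n*2*np.pi/3) for n in range(3)]

# 3-step least-squares demodulation; the steps are conjugated in the sum
# so that the angle of the result is +g*rho, wrapped to (-pi, pi].
analytic = sum(In*np.exp(-1j*n*2*np.pi/3) for n, In in enumerate(I))
wrapped = np.angle(analytic)

# The recovered wrapped phase agrees with the true phase modulo 2*pi.
err = np.angle(np.exp(1j*(wrapped - g*rho)))
```

The background term cancels in the sum because the three complex step factors add to zero, leaving a quantity proportional to exp(i g ρ), whose angle is the wrapped phase.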
