Optica Publishing Group

Projected fringe profilometry using the area-encoded algorithm for spatially isolated and dynamic objects

Open Access

Abstract

We present a discussion of how an area-encoded fringe pattern can be applied to describe the 3D shape of objects with spatially isolated surfaces. Phases of the fringes can be unwrapped without ambiguity to retrieve the 3D shape. Compared with conventional fringe projection techniques, the proposed scheme identifies the fringe order reliably and robustly. Only one phase measurement is required, which makes it possible to analyze dynamic objects.

©2008 Optical Society of America

1. Introduction

Projecting a fringe pattern onto an inspected object and then observing the deformation from a different viewing angle to retrieve the 3D shape is a popular approach to 3D sensing [1-3]. Such techniques are advantageous for their non-scanning nature and full-field performance, and are therefore desirable for inspecting dynamic objects. In that situation, phases are usually evaluated by the Fourier transform method [1] rather than by phase-shifting approaches [2], because the Fourier transform method requires only a single-shot measurement to perform the phase extraction. Phase-shifting approaches are not appropriate, since any movement between two subsequent measurements would result in erroneous identification. However, when the objects of interest are too complicated, part of the inspected surface cannot offer sufficient information to reconstruct the 3D shape. It is difficult to retrieve absolute phases for spatially isolated surfaces, such as a group of fish. Phase unwrapping [4,5] encounters ambiguity in identifying the local fringe order, which makes the depth difference between two spatially isolated surfaces indiscernible.

Several projection algorithms have been proposed to unwrap the phases correctly [6-12], but they are not suited to dynamic inspection because of their time-sequential nature. Spatial encoding algorithms [13,14] are more practical, since only a single-shot measurement is required. However, spatial algorithms still encounter several problems in retrieving the 3D shape. These algorithms generally combine two or more sets of fringes with different frequencies in one pattern to perform the measurement. The tolerance for identifying the fringe order can be enlarged by increasing the number of fringes in one period [14]. Unfortunately, this degrades the contrast of the periodic pattern and introduces errors into the phase evaluation. Moreover, ambiguity in identifying the fringe order still occurs when the depth discontinuities are much larger than the tolerance of these unwrapping algorithms.

An alternative solution for analyzing spatially isolated objects is to use a structured encoded pattern [15-17] instead of a fringe pattern. The structured pattern can be either a set of square patches or a set of parallel grids, spatially encoded with gray scales [15] or colors [16,17]. Each patch or grid is then distinguishable by this encoding scheme. However, the systematic accuracy is limited by the size of the patches or the width of the grids. To enhance the accuracy, our previous work proposed an improved projection algorithm that combines color-encoded grids with sinusoidal fringes [18]. The phase is evaluated by the Fourier transform method, and unwrapping is then performed with reference to the color-encoded scheme. The limitation is that the signal-to-noise ratio might be too low on the boundary between two colored fringes: colors close to the boundary are indiscernible if the surface reflectivity is relatively low, so errors are introduced when identifying the fringe order on that boundary. Moreover, this encoding scheme is sensitive to the observed colors. Fringe orders on objects with a pure primary color (i.e., red, green, or blue) cannot be identified with the color-encoded scheme.

The purpose of this paper is to investigate an improvement on the above methods that can reconstruct the 3D shape of a group of dynamic, spatially isolated objects. We made a structured pattern in which the sinusoidal fringes are encoded with both binary stripes and colored grids. The binary stripes play the key role in identifying the local fringe order, while the colored grids provide an additional degree of freedom for identifying the stripes. Even when the inspected surfaces are colored, the proposed encoding scheme can still identify the local fringe orders. Compared with our previous work, this encoding scheme identifies the fringe orders more reliably, since it is not sensitive to the observed colors.

2. Properties of area encoded sinusoidal fringes

The projected fringes are encoded with colors and binary stripes. Figure 1 shows such an example. This pattern can be further divided into three parts: (a) a set of color grids, (b) a set of binary stripes, and (c) a set of sinusoidal fringes.

Fig. 1. Appearance of a transmittance-encoded fringe pattern.

The transmittance of this encoded pattern is mathematically represented as

t(x) = C(x)·B(x)·[0.4 + 0.4 cos(2πx/d)] + 0.2,  (1)

where C(x) is the distribution of the color grids, B(x) is the transmittance of the binary stripes, and d is the period of the sinusoidal fringes. Figures 2(a), 2(b), and 2(c) show the distributions of the binary stripes, color grids, and sinusoidal fringes, respectively. Note that the width of the color grids is equal to 10 times that of the binary stripes, while the period of the sinusoidal fringes is equal to the width of the encoded stripes.
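As a concrete illustration of Eq. (1), the following sketch builds one row of the encoded transmittance. The stripe sequence, color ordering, and pixel dimensions below are assumptions for illustration; the paper specifies only the structure (10 binary stripes per color group, 7 color groups, fringe period equal to the stripe width).

```python
import numpy as np

# Illustrative parameters (assumed, not from the paper)
D = 20                   # sinusoidal period = binary stripe width, in pixels
STRIPES_PER_GROUP = 10   # binary stripes per color group
GROUPS = 7               # one color per group -> 70 stripes per period

# B: one transmittance (0.5 or 1.0) per stripe; chosen so that any 3
# consecutive values are unique within the 10-stripe period (an assumed
# De Bruijn-style sequence, not the paper's actual one)
B_SEQ = [0.5, 0.5, 0.5, 1.0, 0.5, 1.0, 1.0, 1.0, 0.5, 0.5]

# C: one RGB triple per group (red, green, blue, white, yellow, magenta, cyan)
C_SEQ = [(1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 1),
         (1, 1, 0), (1, 0, 1), (0, 1, 1)]

x = np.arange(GROUPS * STRIPES_PER_GROUP * D)
stripe = x // D                                      # global stripe index 0..69
B = np.array(B_SEQ)[stripe % STRIPES_PER_GROUP]      # binary stripe level
C = np.array(C_SEQ)[stripe // STRIPES_PER_GROUP]     # color grid, shape (len(x), 3)

fringe = 0.4 + 0.4 * np.cos(2 * np.pi * x / D)       # sinusoidal carrier
t = C * (B * fringe)[:, None] + 0.2                  # Eq. (1), per color channel
```

By construction the transmittance stays in [0.2, 1.0], matching the paper's statement that the minimum intensity is 0.2 so the compound colors remain detectable at every pixel.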

Fig. 2. Color arrangement of the encoding scheme.

As shown in Fig. 2(a), the transmittance of the stripes is quantized into two intensity levels, 0.5 and 1. There are 10 binary stripes in a period. The sequence of any three adjacent transmittances within one period does not appear elsewhere, so any stripe in that period can be identified without ambiguity. For example, three adjacent stripes with the transmittance sequence 0.5, 1.0, and 0.5 represent the 1st, 2nd, and 3rd stripes in that period, respectively. Stripes are therefore discernible with reference to their neighbors.
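The stripe identification above can be sketched as a lookup from each 3-stripe window to the index of its first stripe. B_SEQ is the same illustrative sequence assumed earlier (the paper does not list its actual sequence), chosen so that all eight contiguous 3-windows within the 10-stripe period are distinct.

```python
# Assumed 10-stripe transmittance sequence (not the paper's actual one)
B_SEQ = [0.5, 0.5, 0.5, 1.0, 0.5, 1.0, 1.0, 1.0, 0.5, 0.5]

# Map each 3-stripe window to the in-period index of its first stripe
windows = {tuple(B_SEQ[i:i + 3]): i for i in range(len(B_SEQ) - 2)}

def stripe_index(tri):
    """Return the 0-based in-period index of the first stripe of an
    observed 3-stripe transmittance window, or None if invalid."""
    return windows.get(tuple(tri))
```

For instance, with this assumed sequence the window (0.5, 1.0, 0.5) occurs only once, so observing it pins down the stripe position uniquely.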

To increase the number of fringes in one period, the stripes can further be assigned desired colors. As shown in Fig. 2(b), seven colors are selected to address the pattern: red, green, blue, white, yellow, magenta, and cyan. In this encoding, the 10 binary stripes in a period are considered one group, and each group is identified by a specific color. For example, stripes from the 1st to the 10th are encoded with blue, and stripes from the 51st to the 60th are encoded with yellow. Searching for the sequence of the groups thus reduces to finding the distribution of the colors. All of the stripes shown in Fig. 2(a) become distinguishable with reference to the color grids. There are 70 stripes in one period, which is large enough for most measurements. Finally, as depicted in Fig. 1, each sinusoidal fringe becomes distinguishable with reference to the binary stripes and the colors.
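The group decoding described above amounts to combining the decoded color with the in-group stripe index. In this sketch, only the blue and yellow positions follow the text's examples (stripes 1-10 blue, 51-60 yellow); the ordering of the remaining colors is an assumption.

```python
# Assumed color-to-group ordering; only "blue" (group 0) and "yellow"
# (group 5) are fixed by the text's examples
COLOR_ORDER = ["blue", "red", "green", "white", "magenta", "yellow", "cyan"]

def fringe_order(color, stripe_in_group):
    """Absolute stripe index (0-based, 0..69) from the group color and
    the 0-based stripe index within its 10-stripe group."""
    return COLOR_ORDER.index(color) * 10 + stripe_in_group
```

For example, `fringe_order("blue", 0)` gives 0 (the 1st stripe) and `fringe_order("yellow", 0)` gives 50 (the 51st stripe), matching the text's examples.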

Figure 2(c) shows the distribution of the sinusoidal fringes, with transmittance ranging from 0 to 0.8. A combination of the binary stripes and sinusoidal fringes in one stripe period is shown in Fig. 3, where the overall transmittance has been offset by 0.2. When this pattern is embedded into a set of color grids, the minimum intensity is 0.2. Thus, the compound colors are detectable at each pixel.

Fig. 3. Sinusoidal fringes encoded with binary stripes.

Since the width of the colored grids is much larger than that of the binary stripes, the binary stripes play the main role in identifying the local fringe order. Colors are used only to address the groups of stripes. Thus, this encoding scheme is not sensitive to color noise.

3. Experimental setup and calibrations

Figure 4 shows the geometrical arrangement of a typical non-collimated fringe projection. An area-encoded pattern designed on a computer is projected onto the objects by an LCD projector.

Fig. 4. Optical configuration for 3D shape measurements.

The deformed fringe pattern on the objects is then recorded by a color CCD camera at a different viewing angle. To evaluate the phase distribution, the color-encoded grids are first converted to gray intensity levels, which is a simple image-processing task. In our setup, the phases of the projected fringes are evaluated by the Fourier transform method [1]. Unwrapping is then performed with reference to the area-encoded stripes, as described in Section 2.
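A minimal 1-D sketch of the Fourier-transform phase extraction [1] applied to one image row: isolate the positive carrier lobe in the spectrum, inverse-transform, and take the argument. The carrier period, filter bandwidth, and synthetic test signal below are illustrative assumptions, not details from the paper's setup.

```python
import numpy as np

def extract_wrapped_phase(row, d):
    """Wrapped total phase (carrier + deformation) of a fringe row,
    where d is the carrier period in pixels."""
    F = np.fft.fft(row - row.mean())          # suppress the DC term
    f = np.fft.fftfreq(row.size)
    f0 = 1.0 / d
    mask = (f > 0.5 * f0) & (f < 1.5 * f0)    # band-pass around the +f0 lobe
    return np.angle(np.fft.ifft(F * mask))    # wrapped phase in (-pi, pi]

# Usage on a synthetic row with a known, slowly varying deformation phase
d = 20
x = np.arange(400)
deformation = 0.3 * np.sin(2 * np.pi * x / 200)
row = 0.5 + 0.4 * np.cos(2 * np.pi * x / d + deformation)
wrapped = extract_wrapped_phase(row, d)

# Subtract the carrier to recover the deformation phase
recovered = np.angle(np.exp(1j * (wrapped - 2 * np.pi * x / d)))
```

The complex-exponential subtraction at the end keeps the result properly wrapped; in the real measurement this wrapped phase is what the area-encoded stripes then unwrap.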

Of course, a correspondence between the world coordinates and the retrieved unwrapped phases on the inspected objects must be established. The mathematical relationship can be expressed as

z = Σ_{n=0}^{N} c_n φ^n,  x = a_1 z + a_0,  y = b_1 z + b_0,  (2)

where a_i, b_i, and c_n are undetermined parameters. The functional form of Eq. (2) and all of the coefficients involved must be found prior to the 3D measurement. Several calibration schemes have been proposed to determine these parameters [19-22]. Once they are known, one can first identify the depth value z from the deformed phase on the inspected object, and then compute the corresponding transverse positions from the depth value.
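The mapping of Eq. (2) can be sketched directly: z is an N-th order polynomial in the unwrapped phase, and the transverse coordinates follow linearly from z. The coefficient values in the usage example are placeholders, not calibration results.

```python
import numpy as np

def phase_to_xyz(phi, c, a1, a0, b1, b0):
    """Map an unwrapped phase phi to world coordinates (x, y, z).
    c = [c0, c1, ..., cN] are the polynomial coefficients of z(phi)."""
    z = np.polyval(c[::-1], phi)   # np.polyval expects the highest power first
    x = a1 * z + a0
    y = b1 * z + b0
    return x, y, z
```

For instance, with placeholder coefficients c = [1.0, 0.5] (so z = 1 + 0.5φ), a phase of 2.0 yields z = 2.0, from which x and y follow by the two linear relations.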

4. Experiments

Two fish were selected as the measured samples. An area-encoded pattern designed on a computer was projected onto the objects by the LCD projector. The fringe pattern used to perform the measurement is depicted in Fig. 5. The minimum transmittance of this pattern was 0.2, so all of the color stripes were observable.

Fig. 5. Appearance of area-encoded sinusoidal fringes.

Fig. 6. 3D sensing of spatially isolated objects: (a) appearance of projected fringes on the tested samples; (b) phase extraction by the Fourier transform method; (c) unwrapped phase identified by the proposed scheme; and (d) the reconstructed 3D shape.

A color CCD camera with 1024×768 pixels at 24-bit color depth was used to record the projected fringes. The recorded image, in which an area-encoded pattern was projected onto the objects, is shown in Fig. 6(a). The phase distribution evaluated by the Fourier transform method is depicted in Fig. 6(b). Fringe orders were discernible with reference to the encoded stripes, so the phases were unwrapped without ambiguity. Figure 6(c) illustrates the distribution of the absolute phases. The profile of the isolated objects was retrieved according to Eq. (2). Shown in Fig. 6(d) is the reconstructed 3D shape.

Calibration procedures can be found in the reference paper [21]. In our setup, the calibrated dimensions were approximately 20 cm, 15 cm, and 10 cm along the x-, y-, and z-axes, respectively. Errors from parameter identification in Eq. (2) were minimized because sufficient data sets were obtained. The highest order of z(φ) was n = 5. The parameters in each formula, i.e., z(φ), x(z), and y(z), were determined with 20 measurements. Since the parameters were evaluated with sufficient data sets, the systematic accuracy was not dominated by the calibration uncertainty. It was mainly determined by the precision of the phase-extraction algorithm, the sampling density of the CCD camera, and the pixel resolution of the LCD projector. A standard flat plate with a roughness of 3 µm and a size of 150 mm × 130 mm was used to evaluate the systematic accuracy. The reconstructed shape is shown in Figs. 7(a) and 7(b). The depth accuracy was approximately 10 µm. However, the sampling resolution was approximately 200 µm, limited by the pixel count of the CCD camera.
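The calibration step described above can be sketched as a least-squares fit: with 20 known (phase, depth) pairs, the degree-5 polynomial z(φ) of Eq. (2) is over-determined, so its coefficients follow from least squares. The depth law, phase range, and noise level here are synthetic stand-ins, not the paper's calibration data.

```python
import numpy as np

rng = np.random.default_rng(0)
phi = np.linspace(0.0, 30.0, 20)              # 20 assumed calibration phases
z_true = 1.0 + 0.5 * phi + 0.02 * phi**2      # hypothetical depth law (cm)
z_meas = z_true + rng.normal(0.0, 1e-3, phi.size)   # small synthetic noise

# Least-squares fit of the degree-5 polynomial z(phi) of Eq. (2);
# np.polyfit returns coefficients with the highest power first
coeffs = np.polyfit(phi, z_meas, deg=5)
z_fit = np.polyval(coeffs, phi)
```

With more calibration points than coefficients, the fit averages out measurement noise, which is why the paper's systematic accuracy ends up dominated by the phase extraction and sensor resolution rather than by calibration uncertainty.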

Fig. 7. 3D shape sensing for a standard flat plate: (a) retrieved shape; (b) one-line surface plot of the retrieved shape.

The proposed encoding scheme can also be used to inspect surfaces with large depth discontinuities. Figure 8(a) shows the appearance of a spoon projected with area-encoded fringes. Step heights appear along the edge area. With the encoded pattern, the local fringe orders were identified without ambiguity. The reconstructed profile is illustrated in Fig. 8(b).

Fig. 8. Profile measurements for surfaces with large depth discontinuities: (a) appearance of projected fringes on the tested sample; (b) reconstructed 3D shape by the proposed measurement scheme.

5. Conclusion

We have presented an area-encoded scheme for projected fringe profilometry that retrieves the 3D shape of spatially isolated objects. Only one measurement frame is required, which makes the method well suited to inspecting dynamic objects. The sinusoidal fringes are encoded with colors and binary stripes. Since the width of the colored grids is much larger than that of the binary stripes, the binary stripes play the main role in identifying the local fringe order. Therefore, this encoding scheme is not sensitive to color noise.

Acknowledgements

This work was performed under the support of the Aim for the Top University Plan. The authors are also grateful for support from the Asia-Pacific Ocean Research Center.

References and links

1. M. Takeda and K. Mutoh, "Fourier transform profilometry for the automatic measurement of 3-D object shapes," Appl. Opt. 22, 3977–3982 (1983).

2. V. Srinivasan, H. C. Liu, and M. Halioua, "Automated phase-measuring profilometry of 3-D diffuse objects," Appl. Opt. 23, 3105–3108 (1984).

3. F. Chen, G. M. Brown, and M. Song, "Overview of three-dimensional shape measurement using optical methods," Opt. Eng. 39, 10–22 (2000).

4. T. R. Judge and P. J. Bryanston-Cross, "Review of phase unwrapping techniques in fringe analysis," Opt. Lasers Eng. 21, 199–239 (1994).

5. X. Su and W. Chen, "Reliability-guided phase unwrapping algorithm: a review," Opt. Lasers Eng. 42, 245–261 (2004).

6. J. M. Huntley and H. O. Saldner, "Temporal phase-unwrapping algorithm for automated interferogram analysis," Appl. Opt. 32, 3047–3052 (1993).

7. D. R. Burton and M. J. Lalor, "Multichannel Fourier fringe analysis as an aid to automatic phase unwrapping," Appl. Opt. 33, 2939–2948 (1994).

8. H. Zhao, W. Chen, and Y. Tan, "Phase-unwrapping algorithm for the measurement of three-dimensional object shapes," Appl. Opt. 33, 4497–4500 (1994).

9. W. Nadeborn, P. Andrä, and W. Osten, "A robust procedure for absolute phase measurement," Opt. Lasers Eng. 24, 245–260 (1996).

10. H. O. Saldner and J. M. Huntley, "Temporal phase unwrapping: application to surface profiling of discontinuous objects," Appl. Opt. 36, 2770–2775 (1997).

11. Y. Hao, Y. Zhao, and D. Li, "Multifrequency grating projection profilometry based on the nonlinear excess fraction method," Appl. Opt. 38, 4106–4110 (1999).

12. E. B. Li, X. Peng, J. Xi, J. F. Chicharo, J. Q. Yao, and D. W. Zhang, "Multi-frequency and multiple phase-shift sinusoidal fringe projection for 3D profilometry," Opt. Express 13, 1561–1569 (2005).

13. M. Takeda, Q. Gu, M. Kinoshita, H. Takai, and Y. Takahashi, "Frequency-multiplex Fourier-transform profilometry: a single-shot three-dimensional shape measurement of objects with large height discontinuities and/or surface isolations," Appl. Opt. 36, 5347–5354 (1997).

14. W. H. Su and H. Liu, "Calibration-based two-frequency projected fringe profilometry: a robust, accurate, and single-shot measurement for objects with large depth discontinuities," Opt. Express 14, 9178–9187 (2006).

15. P. Vuylsteke and A. Oosterlinck, "Range image acquisition with a single binary-encoded light pattern," IEEE Trans. Pattern Anal. Mach. Intell. 12, 148–164 (1990).

16. W. Liu, Z. Wang, G. Mu, and Z. Fang, "Color-coded projection grating method for shape measurement with a single exposure," Appl. Opt. 39, 3504–3508 (2000).

17. S. Y. Chen and Y. F. Li, "Self-recalibration of a colour-encoded light system for automated three-dimensional measurements," Meas. Sci. Technol. 14, 33–40 (2003).

18. W. H. Su, "Color-encoded fringe projection for 3D shape measurements," Opt. Express 15, 13167–13181 (2007).

19. W. S. Zhou and X. Y. Su, "A direct mapping algorithm for phase measuring profilometry," J. Mod. Opt. 41, 89–94 (1994).

20. Q. Hu, P. S. Huang, Q. Fu, and F. P. Chiang, "Calibration of a three-dimensional shape measurement system," Opt. Eng. 42, 487–493 (2003).

21. H. Liu, W. H. Su, K. R., and S. Yin, "Calibration-based phase-shifting projected fringe profilometry for accurate absolute 3D surface profile measurement," Opt. Commun. 216, 65–80 (2003).

22. B. A. Rajoub, M. J. Lalor, D. R. Burton, and S. A. Karout, "A new model for measuring object shape using non-collimated fringe-pattern projections," J. Opt. A: Pure Appl. Opt. 9, S66–S75 (2007).


