
Superfast high-resolution absolute 3D recovery of a stabilized flapping flight process

Open Access

Abstract

Scientific research on a stabilized flapping flight process (e.g. hovering) has been of great interest to a variety of fields including biology, aerodynamics, and bio-inspired robotics. Unlike current passive photogrammetry based methods, the digital fringe projection (DFP) technique has the capability of performing dense, superfast (e.g. kHz) 3D topological reconstructions with the projection of defocused binary patterns, yet it remains a challenge to measure a flapping flight process in the presence of rapidly flapping wings. This paper presents a novel absolute 3D reconstruction method for a stabilized flapping flight process. Essentially, the slow-motion parts (e.g. body) and the fast-motion parts (e.g. wings) are segmented and separately reconstructed with phase shifting techniques and the Fourier transform, respectively. The topological relations between the wings and the body are utilized to ensure absolute 3D reconstruction. Experiments demonstrate the success of our computational framework by testing a flapping wing robot at different flapping speeds.

© 2017 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

A stabilized flapping flight process (e.g. hovering) happening in a biological scene has attracted great interest from a variety of fields including but not limited to biomechanics, aerodynamics and robotics. Performing dynamic 3D topological reconstruction of a flapping flight process has great potential to facilitate a better understanding of the dynamics and mechanics of flapping flight [1]. However, performing 3D reconstruction of a flapping flight process is challenging, mainly because of the highly dynamic motion of the flapping wings.

Over the decades, scientists have been investigating different approaches for faster and more accurate 3D topological reconstruction. Among these, photogrammetry based technology [2] has been widely adopted for flapping flight studies. To compute 3D coordinates, fiducial markers are commonly arranged on joints or boundaries of the wings to provide corresponding pairs for triangulation across different camera perspectives, and the topological reconstruction is typically performed through geometric modeling methods [2,3] from the sparsely measured marker points. Such technology has proven successful in numerous insect flight studies [4–8]. However, the reconstructed 3D topology has a limited spatial resolution since only the marker points are precisely measured, which makes subsequent analysis (e.g. wing mechanics analysis) a challenging task.

Unlike a passive technique such as photogrammetry, the digital fringe projection (DFP) technique illuminates pre-designed patterns onto the scene to actively provide features for correspondence identification. By projecting defocused binary patterns [9], this technology can conduct dense, high-quality and superfast (e.g. kHz) 3D topological reconstruction, and thus could be a potential solution for high-quality 3D recording of a dynamic flapping flight process.

A DFP based technique typically performs phase-based analysis, which has advantages over intensity-based techniques since phase information tends to be insensitive to noise and offers higher measurement resolution. Among phase-based approaches, Fourier transform profilometry (FTP) has apparent speed advantages for dynamic measurements owing to its single-shot phase extraction. Later, a modified Fourier transform profilometry (MFTP) [10] was introduced to improve the phase quality by adding a π-shifted fringe. Generally speaking, to fully realize the speed advantage of FTP or MFTP, spatial phase unwrapping [11,12] is typically employed to remove the 2π discontinuities in the extracted wrapped phase maps. However, spatial phase unwrapping locally searches for 2π discontinuities on the phase map itself, and thus only produces a phase map that is relative to a starting point on a single connected component, which is not suitable for absolute 3D reconstruction. To obtain absolute phase for Fourier-based methods, different techniques were developed, including embedding markers [13–15] or shifting the π-shifted fringe to a low frequency [16]. The former method can encounter problems when the embedded markers are not clear on the measured sample, while the robustness of the latter can be affected given that FTP typically does not produce high-quality phase with low-frequency fringes.

Apart from FTP or MFTP, the phase shifting technique [17–19] also enjoys popularity within DFP. The phase shifting technique typically produces higher measurement quality than FTP or MFTP owing to the increased number of fringe images, and thus is more suitable for static or quasi-static measurements. Similar to FTP or MFTP, a phase shifting technique also extracts a wrapped phase map, and thus phase unwrapping is necessary to produce a continuous phase map. For absolute 3D reconstruction, temporal phase unwrapping is typically employed, which adds additional fringe images to provide cues for absolute phase retrieval. Popular techniques include multi-wavelength methods [20–23], intensity-coding methods [24,25] and phase-coding methods [26,27]. All these techniques can produce high measurement quality under static or quasi-static scenes. However, they are not well suited for dynamic measurements since the additional fringe images further sacrifice measurement speed. Although An et al. [28] introduced a temporal phase unwrapping method that requires no additional images, such a method only functions well when the measured object's depth range does not exceed 2π in the phase domain.

For a stabilized flapping flight process, a unique characteristic is that the imaged scene contains both quasi-static (e.g. body) and rapid-motion (e.g. wings) components. With such prior knowledge, it is desirable to incorporate the merits of different technologies to achieve the optimum measurement quality. Some existing techniques [29–34] aim at developing generic solutions to alleviate motion-induced problems under dynamic measurement conditions. Achieving such generality, however, may not always be necessary, especially when the measurement targets a specific case. For instance, the difference between the body and the wings in the nature of their motion provides possibilities for further algorithm optimization.

In this research, we propose a specialized absolute 3D shape measurement technology for a stabilized flapping flight process. First of all, the quasi-static (e.g. body) and the rapid-motion (e.g. wings) components are segmented using image processing tools; then, the absolute phase of the quasi-static body part is reconstructed through a four-step phase shifting technique plus a temporal phase unwrapping method [23]; next, the relative phase maps of the rapid flapping wings are extracted by MFTP plus a spatial phase unwrapping method [35]. Finally, with the known absolute phase map of the body, the geometric relations between the body and wings are employed to shift the relative phase maps of the wings to absolute ones, which subsequently results in a complete final absolute phase map after merging all the phases. Experiments will demonstrate the success of our computational framework by measuring a robotic bird under different flapping speeds. Results show that under an image acquisition speed of 5,000 Hz, we can successfully perform high quality topological 3D reconstruction of a flapping flight process up to 21 cycles/second.

2. Principle

In this section, we will introduce the related theoretical background as well as our proposed computational framework for performing superfast 3D imaging of a rapidly flying robot. We will first introduce two different fringe analysis techniques within the DFP technique: modified Fourier transform profilometry (MFTP) [10] and the four-step phase shifting method [36]. We will then introduce our proposed computational framework, which separately applies the two techniques to the measurement of different parts of the flapping-wing robot.

2.1. Modified Fourier transform profilometry (MFTP)

In the MFTP method [10], two fringe patterns with a phase shift of π are required:

I1 = I′(x, y) + I″(x, y) cos[φ(x, y)],
I2 = I′(x, y) − I″(x, y) cos[φ(x, y)],
where I′(x, y) represents the DC component or average intensity which looks like a snapshot of the sample object:
I′(x, y) = (I1 + I2) / 2.
I″(x, y) is the intensity modulation, and φ(x, y) is the phase map to be extracted. From Eqs. (1) and (2), one can first eliminate the DC component by image subtraction:
Ī(x, y) = (I1 − I2) / 2 = I″(x, y) cos[φ(x, y)].
Following Euler's formula, Eq. (4) can be rewritten as the sum of two harmonic conjugate components:
Ī(x, y) = [I″(x, y) / 2] [e^{jφ(x, y)} + e^{−jφ(x, y)}].
Furthermore, one can apply a band-pass filter (e.g. a Hanning window [10]) to preserve only one of the two harmonic conjugate components. The resultant image If can be expressed as:
If(x, y) = [I″(x, y) / 2] e^{jφ(x, y)}.
Finally, the phase can be extracted by
φ(x, y) = tan⁻¹{Im[If(x, y)] / Re[If(x, y)]},
where Im and Re represent the imaginary and real part of a complex number, respectively.
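
As a concrete illustration, the following NumPy sketch implements the MFTP phase extraction described above (it is not the authors' implementation). It assumes the fringes are oriented along the image x-axis, and the parameters carrier_freq (the approximate fringe frequency in cycles per camera pixel) and half_width (the half-width of the Hanning band-pass window, chosen smaller than carrier_freq so only the positive-frequency lobe is retained) are illustrative values to be tuned for a given setup.

import numpy as np

def mftp_wrapped_phase(i1, i2, carrier_freq, half_width):
    # Eq. (4): remove the DC component by subtracting the pi-shifted pair
    diff = (i1.astype(float) - i2.astype(float)) / 2.0

    cols = diff.shape[1]
    spectrum = np.fft.fft(diff, axis=1)        # 1D FFT along the fringe (x) direction
    freqs = np.fft.fftfreq(cols)               # frequency of each bin, cycles/pixel

    # Band-pass filter: keep only the positive-frequency lobe with a Hanning window
    window = np.zeros(cols)
    keep = np.flatnonzero(np.abs(freqs - carrier_freq) < half_width)
    window[keep] = np.hanning(keep.size)

    analytic = np.fft.ifft(spectrum * window[np.newaxis, :], axis=1)   # Eq. (6)

    # Eq. (7): wrapped phase in (-pi, pi]
    return np.arctan2(analytic.imag, analytic.real)

In practice, carrier_freq would be roughly the reciprocal of the fringe period observed in camera pixels, and half_width controls the trade-off between noise suppression and the bandwidth available for surface variations.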

2.2. Four-step phase shifting profilometry

Besides MFTP, we also used a four-step phase shifting method [36] in this research. Apart from the two fringes expressed in Eqs. (1) and (2), two additional π-shifted fringes are required, which can be expressed as

I3 = I′(x, y) + I″(x, y) sin[φ(x, y)],
I4 = I′(x, y) − I″(x, y) sin[φ(x, y)].
Simultaneously solving Eqs. (1), (2), (8), and (9), we can extract the phase φ(x, y) with the following equation:
φ = tan⁻¹[(I3 − I4) / (I1 − I2)].
As one may notice, the phase maps extracted by both Eq. (7) and Eq. (10) are expressed with arctangent functions, meaning that both MFTP and phase shifting methods can only produce wrapped phase maps ranging from −π to π. To remove the 2π discontinuities and produce a continuous phase map, a spatial or temporal phase unwrapping method is necessary, which essentially adds an integer k(x, y) multiple of 2π at the points with phase discontinuities:
Φ(x, y) = φ(x, y) + k(x, y) × 2π.
The integer number k(x, y) is typically referred to as the fringe order.
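
For completeness, a minimal NumPy sketch of Eqs. (10) and (11) is given below (again, not the authors' code). The fringe order k(x, y) is assumed to be supplied by a spatial or temporal phase unwrapping method, as discussed next.

import numpy as np

def four_step_wrapped_phase(i1, i2, i3, i4):
    # Eq. (10), evaluated with the full-quadrant arctangent so the wrapped
    # phase falls in (-pi, pi]
    return np.arctan2(i3.astype(float) - i4.astype(float),
                      i1.astype(float) - i2.astype(float))

def apply_fringe_order(wrapped_phase, k):
    # Eq. (11): add k(x, y) multiples of 2*pi to remove the discontinuities;
    # k must come from a spatial or temporal phase unwrapping method
    return wrapped_phase + 2.0 * np.pi * k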

As aforementioned, the unwrapped phase maps resulting from spatial phase unwrapping are, in general, relative to a point on a single connected component, and thus are difficult to use for absolute phase retrieval. A temporal phase unwrapping method, on the other hand, typically requires additional fringe projections to provide cues for determining the absolute fringe order, making it challenging to measure flapping wings, which commonly present perceivable motion across a sequence of fringes even when high-speed (e.g. kHz) fringe projection and image acquisition are performed. However, in a stabilized flapping flight, the body part can be regarded as quasi-static given that no vibrant body locomotion is present. In this case, temporal phase unwrapping should still function well enough to obtain a high-quality absolute phase. Also, given that the wings and the body are constrained geometrically, the known absolute phase of the body can assist us in obtaining the absolute phase of the fast flapping wings. The next section elaborates our proposed hybrid computational framework based on these assumptions.

2.3. Proposed hybrid computational framework for absolute phase retrieval

Figure 1 illustrates a schematic diagram of our proposed computational framework. In our proposed method, we used a total of 8 fringes for absolute phase computation, including a set of high-frequency (i.e. fringe period TH = 24 pixels) four-step phase-shifted binary area-modulated patterns [37] (denoted as I1–I4), and a set of low-frequency (i.e. fringe period TL = 380 pixels) four-step phase-shifted binary dithered patterns [38] (denoted as I5–I8). The first step is to obtain the DC component [see Eq. (3)] to assist the segmentation of the body (with slow motion) and the wings (with fast motion). The details of segmentation are explained in Step 1 of Sec. 2.4. For the body part, we used standard four-step phase shifting with an enhanced two-wavelength temporal phase unwrapping method [23] to obtain the absolute phase, using all 8 patterns. For each segmented wing, we used the MFTP method with spatial phase unwrapping to obtain a continuous relative phase map. To shift the relative phase maps of the wings to the absolute sense, we referred to the known absolute phase of the body to determine the absolute fringe order for the wings (details in Step 4 of Sec. 2.4). Finally, the absolute phase map of the entire robotic bird in flapping flight is reconstructed by merging the absolute phase maps of both the body and the wings. In the next section, we will elaborate the details step by step for better understanding of our proposed computational framework.


Fig. 1 Proposed hybrid computational framework; the image with pure white fringe projection is used to separate the body and the wings; the relative phase maps of the wings were extracted using the MFTP method [10] and spatial phase unwrapping [35]; the absolute phase of the body was extracted by phase shifting and the enhanced two-wavelength phase unwrapping method [23]; the relative phase maps of the wings are shifted to absolute ones using geometric relations between the body and the wings; the final absolute phase map is produced by merging all absolute phase maps.


2.4. Procedures

The entire computational framework is composed of the following major steps:

  • Step 1: Segmentation of the body and the wings. We first obtained the DC component of the robotic bird using Eq. (3). Then, for the first frame, we manually extracted the body segment shown in Fig. 2(a) as a template image. For the next frame, we recognized its body part by performing template matching using the MATLAB vision.TemplateMatcher toolbox, which inherently adopts the sum of absolute differences as its matching metric. The matched body part is shown in Fig. 2(b); the extracted body part is then used to identify the body part in the following frame. Once the body part is recognized, we can create a mask image Mbody for the body, as shown in Fig. 2(c). The mask image for the wings, Mwings, is then generated by subtracting the body part from the overall image, as shown in Fig. 2(d).
  • Step 2: Absolute phase retrieval for the body. After segmenting the body and the wings, we adopt an enhanced two-wavelength phase shifting method [23] for absolute phase retrieval of the body (a minimal numerical sketch of Steps 2, 4, and 5 is given after this list). First, we extract the wrapped phases φbodyL and φbodyH from the low-frequency (i.e. I5–I8, TL = 380) and high-frequency (i.e. I1–I4, TH = 24) fringes, respectively, using four-step phase shifting (see Sec. 2.2). Sample low- and high-frequency fringes (with the body mask Mbody applied) are shown in Figs. 3(a) and 3(b), and the resultant wrapped phase maps φbodyL and φbodyH are shown in Figs. 3(c) and 3(d). It is worth noting that all images in Fig. 3 are cropped for better visualization. Then, the low-frequency absolute phase ΦbodyL shown in Fig. 3(f) is obtained by referring to an artificial absolute phase plane Φmin shown in Fig. 3(e) (see Ref. [23] for details). The low-frequency absolute phase ΦbodyL can be computed as
    kL = ceil[(Φmin − φbodyL) / (2π)],
    ΦbodyL = φbodyL + kL × 2π.
    Then the high-frequency absolute phase ΦbodyH is computed by
    kbody = round[(ΦbodyL × TL/TH − φbodyH) / (2π)],
    ΦbodyH = φbodyH + kbody × 2π.
    Here, the ceil() operator rounds a number up to the next larger integer, and the round() operator rounds a number to its closest integer. The extracted absolute phase map ΦbodyH for the body is shown in Fig. 3(g).

    It is worth mentioning that we did not directly use MFTP for the extraction of the body. The reason is that the body part, as shown in Fig. 4(a), contains dramatic local textural and geometric variations, which could deteriorate the phase quality of the MFTP method. Figures 4(b) and 4(c) show the reconstructed 3D profiles of the body part using the absolute phases obtained from MFTP and four-step phase shifting, respectively. From visual comparison, one can clearly see that four-step phase shifting yields higher measurement quality in this scenario. Therefore, we use four-step phase shifting for the body in this research.

  • Step 3: Relative phase retrieval for the wings. For the extracted wings, we adopted the MFTP method (see Sec. 2.1) using I1 and I2 to obtain a wrapped phase map for each wing. A sample fringe with each wing’s mask applied is shown in Figs. 5(a) and 5(d), and the resultant wrapped phase maps are shown in Figs. 5(b) and 5(e). Then, we adopted a multi-level quality-guided spatial phase unwrapping method [35] to obtain a continuous relative phase map ΦwingsR for each wing, as shown in Figs. 5(c) and 5(f).
  • Step 4: Absolute phase retrieval for the wings by referring to the body’s absolute phase. As aforementioned, spatial phase unwrapping by nature unwraps the phase starting from a point on a single connected component. Therefore, the unwrapped phase ΦwingsR of each wing from the previous step is relative to a single point on that wing, and differs from its absolute phase by an integer multiple kshift of 2π. Figures 6(a)–6(c) respectively show the body’s absolute phase and the wings’ relative phases. If we pick the same cross-section (red) on the three phase maps and plot them together in Fig. 6(d), we can see abrupt phase jumps at the boundaries between the wings and the body, which result from the rigid shift of kshift × 2π. However, since the phase of the body is absolute, it provides the additional information needed to find the rigid fringe-order shift kshift (see the code sketch after this list). We first introduce a reference phase line y = f(x) that can be expressed as:
    yref = a × x + b.
    Here, the slope a and the y-intercept b are found by performing a linear fit on the body’s phase line. The extracted reference phase line is shown in Fig. 6(d) as a pink dash-dotted line. Once the reference phase line is extracted, we pick the phase values ywings of the 20 wing boundary pixels nearest to the body to locate the fringe order shift kshift as
    kshift = mode[round((yref − ywings) / (2π))].
    Here, the mode[] operator finds the most common number. Once the shift kshift is determined, we can compute the absolute phase map ΦwingsA for the wings by shifting the relative phase map ΦwingsR:
    ΦwingsA = ΦwingsR + kshift × 2π.
    Figures 6(e) and 6(f) respectively show a cross-section and the complete absolute phase map ΦwingsA of the wings after compensating for the rigid shift.
  • Step 5: Generation of the final absolute phase map. Once the absolute phase maps ΦbodyH and ΦwingsA are obtained for the body and the wings, we can merge them into a final absolute phase map by applying the mask images:
    Φfinal = ΦbodyH × Mbody + ΦwingsA × Mwings.
    Figure 7(a) shows the final absolute phase map. To demonstrate that the computed phase map is indeed an absolute phase map, we plot a phase cross-section in Fig. 7(b) and overlay it with the same phase cross-section obtained from the conventional temporal phase unwrapping (i.e. the enhanced two-wavelength method [23]); overall, the two agree quite well with each other. To visualize their differences, we subtracted the two curves and plotted the difference in Fig. 7(c). On this difference curve, the two methods have identical results on the body part (middle), since for body phase extraction our proposed method works in exactly the same way as the conventional temporal phase unwrapping method. However, we can indeed see apparent differences on the wings (mean difference: 0.27 rad; RMS difference: 1.34 rad). The reason is that the conventional temporal phase unwrapping method introduces measurement errors caused by the wings’ motion, albeit the mean difference is still relatively small. To clearly visualize this effect, we performed calibration-based 3D reconstruction [39] using both methods, and the results are shown in Figs. 7(d) and 7(e). The results and associated video (Visualization 1) clearly demonstrate that our method consistently produces high measurement quality, whereas the conventional temporal phase unwrapping method creates errors and motion artifacts on the surface and on some boundary points of the wings, as shown in the highlighted areas in Fig. 7(e). In this preliminary testing, we set the flapping speed to a relatively slow 7 cycles/second. In the next section, we will present more thorough evaluations by showing dynamic experimental results at other flapping speeds, including 12 cycles/second (moderate) and 21 cycles/second (high).
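
To make Steps 2, 4, and 5 concrete, the following NumPy sketch (referenced in the list above) outlines one possible implementation of Eqs. (12)–(19). It is not the authors' code, and all function and variable names are hypothetical. It assumes the phases and masks are same-sized 2D arrays, that the reference line of Eq. (16) is fitted along a single cross-section row, and that the 20 wing pixels nearest to the body on that row stand in for the boundary pixels used in Eq. (17).

import numpy as np

TWO_PI = 2.0 * np.pi

def unwrap_body_two_wavelength(phi_low, phi_high, phi_min, t_low=380.0, t_high=24.0):
    # Eqs. (12)-(13): unwrap the low-frequency phase against the artificial plane Phi_min
    k_low = np.ceil((phi_min - phi_low) / TWO_PI)
    phi_low_abs = phi_low + k_low * TWO_PI
    # Eqs. (14)-(15): scale to the high frequency, then unwrap the high-frequency phase
    k_body = np.round((phi_low_abs * (t_low / t_high) - phi_high) / TWO_PI)
    return phi_high + k_body * TWO_PI

def shift_wing_to_absolute(phi_wing_rel, phi_body_abs, body_mask, wing_mask, row, n_boundary=20):
    cols = np.arange(phi_body_abs.shape[1])
    body_cols = np.flatnonzero(body_mask[row])
    wing_cols = np.flatnonzero(wing_mask[row])

    # Eq. (16): fit the reference line y_ref = a*x + b to the body's absolute phase
    a, b = np.polyfit(cols[body_cols], phi_body_abs[row, body_cols], 1)
    y_ref = a * cols + b

    # Pick the n_boundary wing pixels closest to the body on this cross-section
    dist_to_body = np.min(np.abs(wing_cols[:, None] - body_cols[None, :]), axis=1)
    boundary = wing_cols[np.argsort(dist_to_body)[:n_boundary]]

    # Eq. (17): the rigid fringe-order shift is the mode of the rounded phase gap
    k_candidates = np.round((y_ref[boundary] - phi_wing_rel[row, boundary]) / TWO_PI)
    values, counts = np.unique(k_candidates, return_counts=True)
    k_shift = values[np.argmax(counts)]

    # Eq. (18): rigidly shift the relative wing phase to the absolute sense
    return phi_wing_rel + k_shift * TWO_PI

def merge_final_phase(phi_body_abs, phi_wings_abs, body_mask, wing_mask):
    # Eq. (19): merge body and wing absolute phases using the binary mask images
    return phi_body_abs * body_mask + phi_wings_abs * wing_mask

In practice, the cross-section row(s) would be chosen where both the body and the wing are visible, and the shift of Eq. (18) would be computed separately for the left and right wings before the merge of Eq. (19).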


Fig. 2 Segmentation of the body and the wings using template matching. (a) The template image manually extracted in the initial frame; (b) the matched body part on the next frame; (c) generated mask image for the body; (d) generated mask image for the wings by subtracting body from overall image.



Fig. 3 Body absolute phase retrieval using enhanced two-wavelength phase-shifting. (a) A sample low frequency fringe image; (b) a sample high frequency fringe image; (c) resultant low frequency wrapped phase map φbodyL; (d) resultant high frequency wrapped phase map φbodyH; (e) an artificial phase plane Φmin; (f) unwrapped phase ΦbodyL of (c) obtained by referring pixel-by-pixel to (e); (g) final absolute phase map ΦbodyH for the body.



Fig. 4 Comparison of extracted 3D of the body part. (a) The texture image; (b) reconstructed 3D result using the absolute phases extracted from MFTP; (c) reconstructed 3D result using the absolute phases extracted from four-step phase-shifting.



Fig. 5 Relative phase extraction for the wings. (a) fringe image I1 with left wing mask applied; (b) wrapped phase map for the left wing using MFTP; (c) unwrapped relative phase map of (b) using spatial phase unwrapping; (d) – (f) corresponding fringes and phase maps for the right wing.



Fig. 6 Using geometric relations of the wings and the body to shift the relative phase maps of the wings to absolute ones. (a) – (c) The same cross-section picked for the body’s absolute phase and the wings’ relative phase; (d) the plotted cross-sections before shifting; (e) the plotted cross-sections after shifting; (f) the resultant wings’ absolute phase map ΦwingsA.



Fig. 7 Final absolute phase computation and 3D reconstruction. (a) Computed absolute phase map by merging Fig. 6(a) and 6(f); (b) comparing proposed method with conventional temporal phase unwrapping (enhanced two-wavelength method [23]) by plotting a cross-section; (c) the difference of the cross-section in (b) (mean difference: 0.27 rad; RMS difference: 1.34 rad); (d) – (e) reconstructed 3D geometry using proposed method and conventional temporal phase unwrapping (associated video Visualization 1).


3. Experiments

We set up a superfast DFP-based 3D shape measurement platform to validate our proposed computational framework. A Wintech PRO 6500 DLP projector is used for high-speed fringe projection, and a high-speed CMOS camera (Model: NAC MEMRECAM GX-8F) is used for image capture. The image acquisition is properly synchronized with the pattern projection at a speed of 5,000 Hz. The image resolutions of the projector and the camera are set to 1,920 × 1,080 pixels and 800 × 600 pixels, respectively. The camera lens (Model: SIGMA 24 mm f/1.8 EX DG) has a 24 mm focal length with an aperture ranging from f/1.8 to f/22. The sample robotic bird (Model: XTIM Bionic Bird Avitron V2.0) has a maximum beat frequency of around 25 cycles/second. A single wing spans around 150 mm (L) × 70 mm (W).

We first set the robotic bird at a moderate flapping speed of around 12 cycles/second, and measured its dynamic flapping flight process using both the proposed method and the conventional enhanced two-wavelength method [23]. We show the results of a sample down-stroke and up-stroke frame in Fig. 8 and the dynamic 3D results in its associated video (Visualization 2). From the results, we can clearly see that, similar to the previous result, the conventional enhanced two-wavelength method exhibits apparent errors and artifacts on the wings, while our method consistently provides higher measurement quality, indicating the success of our method.


Fig. 8 Sample results of robotic bird measurement with a flapping speed of 12 cycles/second (associated video Visualization 2). (a) A sample downstroke 2D image; (b) reconstructed 3D geometry of (a) using the proposed method; (c) reconstructed 3D geometry of (a) using the conventional enhanced two-wavelength method [23]; (d) – (f) corresponding 2D image and 3D results of a sample upstroke frame.


For further validation, we performed similar experiments by increasing the flapping speed to 21 cycles/second. Figure 9 and its associated video (Visualization 3) show the measurement results, from which we can see that our method again works consistently well, whereas the conventional enhanced two-wavelength method still suffers considerably from motion-induced errors and artifacts. This series of experiments at different flapping speeds has clearly demonstrated the success of our computational framework.


Fig. 9 Sample results of robotic bird measurement with a flapping speed of 21 cycles/second (associated video Visualization 3). (a) A sample downstroke 2D image; (b) reconstructed 3D geometry of (a) using the proposed method; (c) reconstructed 3D geometry of (a) using the conventional enhanced two-wavelength method [23]; (d) – (f) corresponding 2D image and 3D results of a sample upstroke frame.


4. Discussion

For the measurement of a stabilized flapping flight process, our method has the following advantages compared to existing absolute 3D reconstruction methods:

  • Marker-less measurements with high spatial resolutions. Compared to existing photogrammetry based methods, our technology does not require fiducial markers for topological reconstruction, and can provide high-resolution 3D recovery of a flapping flight process with high measurement quality.
  • Resistance to errors caused by the wings’ motion. The absolute phase of the wings was separately retrieved by shifting the spatially unwrapped phase from the MFTP method. Therefore, the phase obtained from MFTP ultimately determines the quality of the reconstructed 3D geometry of the wings. Since the MFTP method requires only two fringe images, it is resistant to errors caused by the wings’ motion.
  • Combined advantages of Fourier transform and phase-shifting. Our method separately performs 3D reconstruction for the body and the wings, where the phase quality advantage of the phase-shifting method benefits the reconstruction of the quasi-static body, while the speed advantage of the Fourier transform method suits the reconstruction of the rapidly moving wings.

Our research presented in this article mainly focuses on measuring a stabilized flapping flight process. It has great potential for analyzing similar phenomena in insect flight studies, such as hovering. However, our technology could encounter challenges when the flapping flight scene is not stabilized or contains vibrant out-of-plane body locomotion, since our body-wing segmentation is based on template matching. Future investigations into more robust body tracking and segmentation methods are necessary to apply this algorithm to a more generic flapping flight process.

5. Summary

In this research, we presented a specialized absolute 3D shape measurement method for a stabilized flapping flight process. We first segmented the body with slow motion and the wings with fast motion. Then, by taking advantage of the phase quality merit of phase shifting and the speed advantage of the Fourier transform, we separately performed phase extraction using the former method for the body and the latter method for the wings. The former computes an absolute phase map for the body using conventional temporal phase unwrapping, while the latter produces relative phase maps for the wings with spatial phase unwrapping. Finally, the absolute phase map of the wings is computed by referring to the known absolute phase map of the body. Using a superfast 3D shape measurement system with an image acquisition speed of 5,000 Hz, we demonstrated the success of our method by measuring a robotic bird at different flapping speeds. Such technology can drastically increase the resolution of topological reconstruction compared to current passive photogrammetry based methods.

Funding

National Science Foundation (NSF) Directorate for Engineering (ENG) (CMMI-1531048).

Acknowledgments

The authors would like to thank the members in XYZT laboratory at Purdue University. In particular, we thank Jae-Sang Hyun for his assistance in hardware setup and data collection. This research work was carried out when Beiwen Li was with Purdue University.

References and links

1. T. L. Hedrick, S. A. Combes, and L. A. Miller, “Recent developments in the study of insect flight,” Canadian Journal of Zoology 93, 925–943 (2014). [CrossRef]  

2. S. M. Walker, A. L. Thomas, and G. K. Taylor, “Photogrammetric reconstruction of high-resolution surface topographies and deformable wing kinematics of tethered locusts and free-flying hoverflies,” J. Royal Soc. Interface 6, 351–366 (2009). [CrossRef]  

3. C. Koehler, Z. Liang, Z. Gaston, H. Wan, and H. Dong, “3d reconstruction and analysis of wing deformation in free-flying dragonflies,” J. Exp. Biol. 215, 3018–3027 (2012). [CrossRef]   [PubMed]  

4. Y. Ren, H. Dong, X. Deng, and B. Tobalske, “Turning on a dime: Asymmetric vortex formation in hummingbird maneuvering flight,” Physical Review Fluids 1, 050511 (2016). [CrossRef]  

5. B. W. Tobalske, D. R. Warrick, C. J. Clark, D. R. Powers, T. L. Hedrick, G. A. Hyder, and A. A. Biewener, “Three-dimensional kinematics of hummingbird flight,” J. Exp. Biol. 210, 2368–2382 (2007). [CrossRef]   [PubMed]  

6. A. P. Willmott and C. P. Ellington, “The mechanics of flight in the hawkmoth manduca sexta. i. kinematics of hovering and forward flight,” J. Exp. Biol. 200, 2705–2722 (1997).

7. R. J. Wootton, “Leading edge section and asymmetric twisting in the wings of flying butterflies (insecta, papilionoidea),” J. Exp. Biol. 180, 105 (1993).

8. U. M. L. Norberg and Y. Winter, “Wing beat kinematics of a nectar-feeding bat, glossophaga soricina, flying at different flight speeds and strouhal numbers,” J. Exp. Biol. 209, 3887–3897 (2006). [CrossRef]  

9. S. Zhang, D. van der Weide, and J. Oliver, “Superfast phase-shifting method for 3-d shape measurement,” Opt. Express 18, 9684–9689 (2010). [CrossRef]   [PubMed]  

10. L. Guo, X. Su, and J. Li, “Improved fourier transform profilometry for the automatic measurement of 3d object shapes,” Opt. Eng. 29, 1439–1444 (1990). [CrossRef]  

11. D. C. Ghiglia and M. D. Pritt, Two-Dimensional Phase Unwrapping: Theory, Algorithms, and Software (John Wiley and Sons, 1998).

12. X. Su and W. Chen, “Reliability-guided phase unwrapping algorithm: a review,” Opt. Laser Eng. 42, 245–261 (2004). [CrossRef]  

13. H. Guo and P. S. Huang, “Absolute phase technique for the fourier transform method,” Opt. Eng. 48, 043609 (2009). [CrossRef]  

14. Y. Xiao, X. Su, Q. Zhang, and Z. Li, “3-d profilometry for the impact process with marked fringes tracking,” Opto-Electron. Eng. 34, 46–52 (2007).

15. B. Budianto, P. Lun, and T.-C. Hsung, “Marker encoded fringe projection profilometry for efficient 3d model acquisition,” Appl. Opt. 53, 7442–7453 (2014). [CrossRef]   [PubMed]  

16. H. Yun, B. Li, and S. Zhang, “Pixel-by-pixel absolute three-dimensional shape measurement with modified Fourier transform profilometry,” Appl. Opt. 56, 1472–1480 (2017). [CrossRef]  

17. D. Malacara, Optical Shop Testing (John Wiley & Sons, 2007). [CrossRef]  

18. M. Servin, J. A. Quiroga, and J. M. Padilla, Fringe Pattern Analysis for Optical Metrology: Theory, Algorithms, and Applications (John Wiley & Sons, 2014).

19. S. Zhang, High-Speed 3D Imaging with Digital Fringe Projection Techniques (CRC Press, 2016).

20. Y.-Y. Cheng and J. C. Wyant, “Two-wavelength phase shifting interferometry,” Appl. Opt. 23, 4539–4543 (1984). [CrossRef]   [PubMed]  

21. Y.-Y. Cheng and J. C. Wyant, “Multiple-wavelength phase shifting interferometry,” Appl. Opt. 24, 804–807 (1985). [CrossRef]  

22. Y. Wang and S. Zhang, “Superfast multifrequency phase-shifting technique with optimal pulse width modulation,” Opt. Express 19, 5143–5148 (2011).

23. J.-S. Hyun and S. Zhang, “Enhanced two-frequency phase-shifting method,” Appl. Opt. 55, 4395–4401 (2016). [CrossRef]   [PubMed]  

24. J. Pan, P. S. Huang, and F.-P. Chiang, “Color-coded binary fringe projection technique for 3-d shape measurement,” Opt. Eng. 44, 023606 (2005). [CrossRef]  

25. S. Zhang, “Flexible 3d shape measurement using projector defocusing: Extended measurement range,” Opt. Lett. 35, 931–933 (2010).

26. Y. Wang and S. Zhang, “Novel phase coding method for absolute phase retrieval,” Opt. Lett. 37, 2067–2069 (2012). [CrossRef]   [PubMed]  

27. Y. Xing, C. Quan, and C. Tay, “A modified phase-coding method for absolute phase retrieval,” Opt. Lasers Eng. (2016). (in press). [CrossRef]  

28. Y. An, J.-S. Hyun, and S. Zhang, “Pixel-wise absolute phase unwrapping using geometric constraints of structured light system,” Opt. Express 24, 18445–18459 (2016). [CrossRef]   [PubMed]  

29. C. Zuo, Q. Chen, G. Gu, S. Feng, and F. Feng, “High-speed three-dimensional profilometry for multiple objects with complex shapes,” Opt. Express 20, 19493–19510 (2012). [CrossRef]   [PubMed]  

30. C. Zuo, Q. Chen, G. Gu, S. Feng, F. Feng, R. Li, and G. Shen, “High-speed three-dimensional shape measurement for dynamic scenes using bi-frequency tripolar pulse-width-modulation fringe projection,” Opt. Lasers Eng. 51, 953–960 (2013). [CrossRef]  

31. Y. Wang, S. Zhang, and J. H. Oliver, “3-d shape measurement technique for multiple rapidly moving objects,” Opt. Express 19, 5149–5155 (2011). [CrossRef]   [PubMed]  

32. P. Cong, Z. Xiong, Y. Zhang, S. Zhao, and F. Wu, “Accurate dynamic 3d sensing with Fourier-assisted phase shifting,” IEEE Journal of Selected Topics in Signal Processing 9, 396–408 (2015). [CrossRef]  

33. B. Li, Z. Liu, and S. Zhang, “Motion induced error reduction by combining Fourier transform profilometry with phase-shifting profilometry,” Opt. Express 24, 23289–23303 (2016). [CrossRef]   [PubMed]  

34. B. Li, S. Ma, and Y. Zhai, “Fast temporal phase unwrapping method for the fringe reflection technique based on the orthogonal grid fringes,” Appl. Opt. 54, 6282–6290 (2015). [CrossRef]   [PubMed]  

35. S. Zhang, X. Li, and S.-T. Yau, “Multilevel quality-guided phase unwrapping algorithm for real-time three-dimensional shape reconstruction,” Appl. Opt. 46, 50–57 (2007). [CrossRef]  

36. H. Schreiber and J.-H. Bruning, “Phase shifting interferometry,” in Optical Shop Testing, 3rd ed., D. Malacara, ed. (John Wiley & Sons, 2007). [CrossRef]  

37. W. Lohry and S. Zhang, “Fourier transform profilometry using a binary area modulation technique,” Opt. Eng. 51, 113602 (2012). [CrossRef]  

38. Y. Wang and S. Zhang, “Three-dimensional shape measurement with binary dithered patterns,” Appl. Opt. 51, 6631–6636 (2012). [CrossRef]   [PubMed]  

39. B. Li, N. Karpinsky, and S. Zhang, “Novel calibration method for structured light system with an out-of-focus projector,” Appl. Opt. 53, 3415–3426 (2014). [CrossRef]   [PubMed]  

Supplementary Material (3)

Name       Description
Visualization 1       Result of dynamic 3D reconstruction of a flapping flight process when the flapping speed is set at 7 cycles/second (slow).
Visualization 2       Result of dynamic 3D reconstruction of a flapping flight process when the flapping speed is set at 12 cycles/second (moderate).
Visualization 3       Result of dynamic 3D reconstruction of a flapping flight process when the flapping speed is set at 21 cycles/second (fast).
