Abstract
Background-oriented schlieren tomography (BOST) is effective for flow field measurement; however, unlike general computed tomography (CT), BOST utilizes the deflection of rays passing through an inhomogeneous field and is therefore sensitive to the refractive index gradient rather than the refractive index itself. Consequently, an additional integration step is typically required to obtain the refractive index. In this article, a method for calculating the projection matrix is proposed based on the radial basis function (RBF), with which the 3D distribution of the refractive index can be reconstructed directly. The method was first verified by numerical simulation. Then, the 3D instantaneous refractive index field above a candle flame was measured, and the reprojection error was calculated by ray tracing. The results illustrate the accuracy and stability of the proposed method. This research provides a new and complete solution for 3D instantaneous flow field (refractive index, density, or temperature) measurement.
© 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement
1. Introduction
With the development of aerospace and industrial design, measurement of the flow field has become increasingly important, and various measurement methods have been developed in recent decades. Among these are optical measurement methods, which can capture the instantaneous properties of the entire flow field. They have the advantage of being noncontact and nonintrusive, and they have therefore been widely used for flow field measurement. Interferometry [1–3], shadowgraph [4,5], and schlieren [6–8] methods are among the most common approaches. Interferometry can provide quantitative results; however, it has an extremely complex setup and requires strict conditions in the surrounding environment. Although the shadowgraph method has the simplest experimental setup, it is only suitable for flow fields with large gradient variations and only provides qualitative analyses [9].
The traditional schlieren setup generally consists of a light source, lens, camera, and sharp knife edge. Since its development around the 17th century [10], the schlieren method has been adopted for the quantitative measurement of the refractive index, density, and temperature of the flow field [7,11,12]. Furthermore, schlieren tomography [13–15], in which schlieren images are obtained from multiple directions, has been developed. In 2000, the background-oriented schlieren (BOS) technique was proposed by Dalziel et al. [16] and Raffel et al. [17] at almost the same time. Compared with the traditional schlieren method, BOS has modest technical requirements, consisting only of a light source, background, and camera. The BOS technique has the advantage of a very large field of view without large, expensive, precision optics [18]. Therefore, although BOS has some disadvantages, such as vibration sensitivity and limited real-time visualization [18], it still provides an excellent strategy for full-field measurements and can offer unique solutions to some problems [18].
The BOST method utilizes the deflection of rays passing through an inhomogeneous field for measurement. Therefore, BOST differs from general CT reconstruction: its direct reconstruction result is the distribution of the refractive index gradient. An additional step, such as direct line integration or solving the Poisson equation, is required to obtain the distribution of the refractive index.
Since the BOS method was proposed, many 2D experiments have demonstrated its applicability in the measurement of the flow field. Raffel et al. [17] first used the BOS method to measure the vortex of helicopter blades. In 2001, Richard and Raffel [19] measured a compressible vortex in the wake of a cylinder. They compared the results of the BOS technique and particle image velocimetry, and similar results were obtained. The separation of the boundary layer and shear layer, as well as vortex shedding, was clearly identified in both results. Direct line integration was used to calculate the density distribution in both studies. Richard and Raffel et al. [20] applied the BOS method to a supersonic jet. These experiments prove that the BOS method can be used for flow field measurements. However, these are 2D BOS measurements and cannot be used to obtain the 3D structure of a complex flow field.
If the flow is axisymmetric, the 3D flow can be reconstructed by inverse Abel transformation from one viewing direction, which is perpendicular to the axis of symmetry of the flow. For example, Sourgen et al. [21] compared numerical simulations and BOS results using the inverse Abel transformation. In 2014, van Hinsberg and Rösgen [9] measured an under-expanded supersonic free jet by using the inverse Abel transformation. However, the Abel transformation requires the assumption of axisymmetric flow, and it is susceptible to noise. Therefore, the filtered backprojection (FBP) algorithm was employed for the reconstruction of the BOS. In 2004, Venkatakrishnan and Meier [22] measured the density field for an axisymmetric supersonic flow over a cone cylinder model. In 2005, Venkatakrishnan [23] measured an axisymmetric under-expanded jet. The density field was obtained by solving the Poisson equation and using the FBP algorithm [22,23]. In addition, the FBP algorithm can be more universally used for nonsymmetric flows.
For a non-axisymmetric flow, BOS projections should be obtained from multiple directions for BOST reconstruction. When a stationary flow is considered, the BOS projections can be obtained by rotating the position of the camera and background relative to the flow [24]; in that study, multidirectional projections were obtained from 19 projection angles from 0° to 90° at 5° intervals. When the projection angle is limited or there are few projection data, the FBP algorithm is prone to artifacts [25]. Experiments performed by Ota et al. [26–28] used the same rotating strategy as the previously mentioned study [24], while the algebraic reconstruction technique (ART) was employed for the reconstruction. ART is more practical than the FBP algorithm for reconstruction with limited data. The density field was reconstructed from 30 projections by Ota et al. [29]. However, for the measurement of complex flow fields, this rotation strategy is only an expedient.
When a complex flow is considered, a synchronous multicamera setup is employed to record the BOS projections from different directions. The first multicamera BOS measurement of nonstationary flow was demonstrated by Atcheson et al. [30] in 2008. There were 16 cameras surrounding the hot air flow above a gas burner in a 180° arc, and the refractive index field was calculated by Poisson integration. Zeb et al. [31] employed three cameras to measure a convective flow field. However, the reconstruction results had strong artifacts because of the small number of cameras. In 2015, Nicolas et al. [32] designed a 3D BOS experimental facility with 12 noncoplanar cameras and measured three different hot air flows. A multicamera BOS approach was also used in the measurement of compressible flow [33] and a supersonic wind tunnel [34]. Since 2018, 3D BOS has also been used in the measurement of the flame field [35–38].
The deviation angle of the BOS is proportional to the integral of the refractive index gradient; thus, the direct reconstruction result is the distribution of the refractive index gradient, and an integration step is then performed to calculate the refractive index field. The first method is direct line integration [17,19,25], which is simple but produces line noise [19] and easily accumulates noise during integration. The second extensively used method is to solve the Poisson equation [17,20,22,23,27,30,39]. However, this method is complicated and has the disadvantage of strongly smoothing the results [17]. The key to obtaining the refractive index distribution directly is to address the derivation process in the discretization. In 2015, Nicolas et al. [32] proposed a direct reconstruction method: to eliminate the integration step, they combined a finite-difference matrix (FDM) in the discretization process. Recently, this method has been adopted in many studies [35,37,38,40]. It is an excellent method; however, there are many kinds of difference schemes, such as forward, backward, central, and Richardson differences, and the order of the scheme also influences the accuracy.
In this article, a method for calculating the projection matrix based on the RBF is proposed for BOST. Unlike the “mechanistic” approach of Nicolas et al. [32], the projection matrix is calculated using the differentiable property of the RBF; thus, the weights in the projection matrix are integrals of partial derivatives of the basis functions. The integration step is eliminated, and no additional finite-difference matrix is used. By using the 3D RBF, the derivation process can be combined with the calculation of the projection matrix. In Section 2, the BOS theory is described. In Section 3, the BOS projection model based on the 3D RBF is presented. Section 4 presents the results of the numerical simulation and the experimental reconstruction. Finally, conclusions are given in Section 5.
2. BOS theory
The BOS theory can be found in many articles, but, for the convenience of readers, it is briefly introduced here. The flow field is reconstructed by the BOS method through the deviation of the ray when propagating in an inhomogeneous field. A schematic diagram of the BOS theory is illustrated in Fig. 1. The background pattern is distorted when there is an inhomogeneous field between the background and the camera. The yellow and green lines are the ray paths with and without flow, respectively. In addition, $\Delta u$ and $\Delta v$ are the displacements of the ray in the imaging plane when there is a flow; ${l_A}$, ${l_B}$, and ${l_C}$ are the distance from the background to the probe volume, the distance from the probe volume to the lens, and the distance from the lens to the imaging plane (i.e., the focal length $f$), respectively. The deviation angle $\varepsilon$ is the integration of the refractive index gradient along the ray path:

$${\varepsilon ^{(\alpha )}} = \frac{1}{{{n_0}}}\int_{ray} {\frac{{\partial n}}{{\partial \alpha }}\,ds} \qquad (1)$$
Here, ${\varepsilon ^{(\alpha )}}$ is the deviation angle in the $\alpha$ direction, ${n_0}$ is the environmental refractive index, and $ray$ is the ray path. Under the assumption of paraxial recording and a small deviation angle, the relationship between the deviation angle and displacement in the imaging plane is, to first order [41],

$$\Delta u = \frac{{f\,{l_A}}}{{{l_A} + {l_B}}}{\varepsilon ^{(u)}},\qquad \Delta v = \frac{{f\,{l_A}}}{{{l_A} + {l_B}}}{\varepsilon ^{(v)}} \qquad (2)$$

where ${\varepsilon ^{(u)}}$ and ${\varepsilon ^{(v)}}$ are the components of the deviation angle along the image-plane directions.
Because the experimental setup is relatively simple, the BOS setup can be conveniently arranged as a multidirectional measurement device. As shown in Fig. 2(a), multiple synchronized cameras capture images of the background through the probe area. The displacements can be extracted by comparing the distorted and undistorted images of the background; thus, the refractive index can be reconstructed from the displacements. In some cases, it may be necessary to obtain other parameters of the flow field, such as the density and temperature. The refractive index is related to the density by the Gladstone–Dale equation [20]:

$$n - 1 = {K_{GD}}\rho \qquad (3)$$
Here, n is the refractive index, and $\rho$ is the density. Furthermore, ${K_{GD}}$ is the Gladstone–Dale constant, which is a function of the light wavelength. Combined with the ideal gas law, the temperature T can be obtained from the density, as follows:

$$T = \frac{{PM}}{{R\rho }} \qquad (4)$$
Here, P represents the atmospheric pressure, M represents the molar mass of the gas, and R represents the universal gas constant. Therefore, the temperature can be calculated using the refractive index.
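As a numeric illustration of this conversion chain (refractive index to density via Gladstone–Dale, then density to temperature via the ideal gas law), the following sketch uses typical constants for air at visible wavelengths; the specific values are assumptions, not taken from the article.

```python
R = 8.314       # universal gas constant, J/(mol K)
M = 0.02896     # molar mass of air, kg/mol (assumed)
P = 101325.0    # atmospheric pressure, Pa (assumed)
K_GD = 2.26e-4  # Gladstone-Dale constant for air, m^3/kg (assumed, wavelength dependent)

def density_from_n(n):
    # Gladstone-Dale: n - 1 = K_GD * rho  =>  rho = (n - 1) / K_GD
    return (n - 1.0) / K_GD

def temperature_from_n(n):
    # Ideal gas law: rho = P * M / (R * T)  =>  T = P * M / (R * rho)
    return P * M / (R * density_from_n(n))
```

Under these constants, a refractive index of about 1.000266 corresponds to roughly room-temperature air, while hot flame gases (lower density) give a refractive index much closer to 1.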
3. BOST
3.1 BOS projection model based on RBF
To obtain the distribution of the refractive index directly, a BOS projection model based on the 3D RBF is proposed. The first basis-function-based CT was presented by Hanson and Wecksung [42]. This method is mainly used in 2D image reconstruction [42–46], but its application in 3D field reconstruction has rarely been reported. By combining the basis function with the BOS projection model, the distribution of the refractive index can be obtained directly instead of the distribution of the refractive index gradient.
As shown in Fig. 2(b), the probe area is discretized into grids of $L \times W \times H$, where each grid is a cube with side length d. For each grid, there is a 3D RBF with a certain coverage range, and the centroid of the RBF is at the center of the grid. An RBF area with a coverage range of $3d$ is shown in Fig. 2(b). In this approach, a function is represented as a linear expansion of the basis functions. It is assumed that the refractive index distribution in the 3D space is $n({\textbf x})$, and the basis function is $g({\textbf x})$. Therefore, $n({\textbf x})$ can be approximated with a set of N basis functions,

$$n({\textbf x}) \approx \sum\limits_{i = 1}^N {{c_i}{g_i}({\textbf x})} \qquad (5)$$
Here, $N = L \times W \times H$ is the number of basis functions, and ${\textbf x} = {[x,y,z]^\textrm{T}}$ is a point in the probe area. ${g_i}$ is the ith basis function, and ${c_i}$ is the coefficient of ${g_i}$. Note that the coverage of a basis function is typically larger than one grid; therefore, the basis functions overlap with their neighbors. This work employs Gaussian RBFs, which are defined as follows:

$${g_i}({\textbf x}) = \left\{ {\begin{array}{ll} {\exp \left( { - \frac{{||{\textbf x} - {{\textbf x}_i}||_2^2}}{{2{\sigma ^2}}}} \right),}&{||{\textbf x} - {{\textbf x}_i}|{|_2} \le {d_{th}}}\\ {0,}&{\textrm{otherwise}} \end{array}} \right. \qquad (6)$$
Here, ${{\textbf x}_i} = {[{x_i},{y_i},{z_i}]^\textrm{T}}$ is the centroid of ${g_i}$, $\sigma$ is the standard deviation of ${g_i}$, ${d_{th}}$ is the threshold distance (e.g., ${d_{th}} = 3\sigma$), and ${||\cdot ||_2}$ is the Euclidean norm. The α-direction gradient of $n({\textbf x})$ for $\alpha \in \{ x,y,z\}$, is

$${\nabla _\alpha }n({\textbf x}) = \sum\limits_{i = 1}^N {{c_i}\frac{{\partial {g_i}({\textbf x})}}{{\partial \alpha }}} = - \sum\limits_{i = 1}^N {{c_i}\frac{{\alpha - {\alpha _i}}}{{{\sigma ^2}}}{g_i}({\textbf x})} \qquad (7)$$
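The key property exploited here is that the Gaussian RBF has a closed-form derivative, so differentiation never has to be applied to the reconstructed field afterwards. A minimal sketch (with hypothetical function names) of a truncated Gaussian RBF and its analytic partial derivative:

```python
import math

def rbf(x, xi, sigma, d_th):
    """Gaussian RBF centred at xi, truncated to zero beyond the threshold d_th."""
    r2 = sum((a - b) ** 2 for a, b in zip(x, xi))
    if r2 > d_th * d_th:
        return 0.0
    return math.exp(-r2 / (2.0 * sigma ** 2))

def rbf_grad(x, xi, sigma, d_th, axis):
    """Closed-form partial derivative of the RBF along one axis (0=x, 1=y, 2=z)."""
    return -(x[axis] - xi[axis]) / sigma ** 2 * rbf(x, xi, sigma, d_th)
```

A quick finite-difference check confirms that the analytic gradient matches the numerical one away from the truncation boundary.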
One can express the standard BOST measurement model using this basis. To begin, the α-direction deflection of the jth ray is given by a line-of-sight (LoS) integral over ${\nabla _\alpha }n({\textbf x})$ along that ray,

$$\varepsilon _j^{(\alpha )} = \frac{1}{{{n_0}}}\int_{ra{y_j}} {{\nabla _\alpha }n({\textbf x})\,ds} \qquad (8)$$
All basis-function coefficients and deflection data are collected into the vectors ${\textbf c} = \{ {c_i}\} _{i = 1}^N$ and ${{\mathbf \varepsilon }^{(\alpha )}} = \{ \varepsilon _j^{(\alpha )}\} _{j = 1}^K$, respectively. The matrix ${{\textbf T}^{(\alpha )}}$ (defined in Eq. (10)) is formed for $i \in \{1,2,\ldots ,N\}$, $j \in \{1,2,\ldots ,K\}$, and $\alpha \in \{ x,y,z\}$. The result is a $3K \times N$ linear system:

$$\left[ {\begin{array}{c} {{{\textbf T}^{(x)}}}\\ {{{\textbf T}^{(y)}}}\\ {{{\textbf T}^{(z)}}} \end{array}} \right]{\textbf c} = \left[ {\begin{array}{c} {{{\mathbf \varepsilon }^{(x)}}}\\ {{{\mathbf \varepsilon }^{(y)}}}\\ {{{\mathbf \varepsilon }^{(z)}}} \end{array}} \right] \qquad (11)$$
3.2 Calculation of the BOS projection matrix
To calculate the BOS projection matrix ${\textbf T}$, one must first know the ray trajectory equation corresponding to the sampling point on the image. Therefore, the pinhole model is employed to describe the camera imaging process. In the pinhole model, a 3D point ${(x,y,z,1)^T}$ in the world coordinate system is imaged as ${(u,v,1)^T}$ in the imaging plane by

$$\varsigma {[u,v,1]^\textrm{T}} = {\textbf P}{[x,y,z,1]^\textrm{T}},\qquad {\textbf P} = {\textbf K}[{\textbf R}\,|\,{\textbf t}] \qquad (12)$$
Here, ${\textbf P}$ is the camera projection matrix, ${\textbf K}$ is the intrinsic matrix of the camera, ${\textbf R}$ is a rotation matrix, ${\textbf t}$ is the translation vector, and $\varsigma$ is a nonzero scale factor. After one obtains the camera projection matrix by camera calibration, the ray trajectory equation of the sampling point ${(u,v)^T}$ can be calculated by

$${[x,y,z]^\textrm{T}} = {\textbf C} + \varsigma \,{{\textbf R}^\textrm{T}}{{\textbf K}^{ - 1}}{[u,v,1]^\textrm{T}},\qquad {\textbf C} = - {{\textbf R}^\textrm{T}}{\textbf t} \qquad (13)$$

where ${\textbf C}$ is the camera center in world coordinates.
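This back-projection step can be sketched as follows, assuming intrinsics parameterized as $(f_x, f_y, c_x, c_y)$ and the convention that ${\textbf R}$ and ${\textbf t}$ map world coordinates to camera coordinates; the function name and interface are illustrative.

```python
import math

def backproject(u, v, K, R, t):
    """Camera centre and unit ray direction (world frame) for pixel (u, v).

    K = (fx, fy, cx, cy); R is a 3x3 rotation (list of rows); t is a 3-vector.
    Assumes the pinhole convention x_cam = R @ x_world + t.
    """
    fx, fy, cx, cy = K
    # Direction in camera coordinates: K^{-1} [u, v, 1]^T
    d_cam = [(u - cx) / fx, (v - cy) / fy, 1.0]
    # Rotate into the world frame: d = R^T d_cam; camera centre C = -R^T t
    d = [sum(R[k][i] * d_cam[k] for k in range(3)) for i in range(3)]
    C = [-sum(R[k][i] * t[k] for k in range(3)) for i in range(3)]
    norm = math.sqrt(sum(x * x for x in d))
    return C, [x / norm for x in d]
```

A consistency check: projecting a 3D point through the pinhole model and then back-projecting its pixel must yield a ray passing through that point.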
Therefore, the points at which the ray enters and exits the probe area can be calculated. The grids that are crossed by the ray can also be calculated sequentially. The grids crossed by the rays are illustrated in Fig. 3. Then, the weights of these grids can be calculated one by one using Eq. (10). However, a ray only passes through a small part of the grids along the ray path; thus, most of the elements in the matrix ${\textbf T}$ are zero. Therefore, the matrix ${\textbf T}$ is stored as a sparse matrix. The programs are written in the C++ programming language, and each non-zero element in ${\textbf T}$ is represented by a triple, in which two variables of int type represent the position of the element and a variable of float type represents the weight. Therefore, the memory consumption can be significantly reduced.
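The triplet storage scheme can be illustrated as follows (in Python rather than the authors' C++; the class name and interface are hypothetical). Only the non-zero weights are stored, and a matrix–vector product touches only those entries.

```python
class SparseTriplets:
    """Store only the non-zero weights of T as (row, col, weight) triplets."""

    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols
        self.data = []  # list of (row, col, weight)

    def add(self, r, c, w):
        # Skip exact zeros so they consume no memory.
        if w != 0.0:
            self.data.append((r, c, w))

    def matvec(self, v):
        # y = T v, iterating over stored non-zeros only.
        y = [0.0] * self.rows
        for r, c, w in self.data:
            y[r] += w * v[c]
        return y
```

For a matrix of size $1004004 \times 1000000$ with only a few non-zeros per row, this reduces memory from terabytes (dense) to a few hundred megabytes.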
The ray is discretized in the calculation, and the discretization step can be set to $d/10$ [47]. A 3D RBF overlaps with the surrounding RBFs, and the overlap range depends on the coverage of the RBF. Therefore, the value of the RBF at a certain point should be added to the values of the surrounding RBFs. To facilitate the calculation, the coverage of the RBF is generally set to an odd number of grid widths. The rays passing through the top or bottom of the probe area should not be used in the reconstruction: as shown in Fig. 2(b), because there may still be a flow field above or below the probe area, the path used to calculate the integral is incomplete. Thus, rays that pass through the top or bottom of the probe area are excluded.
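The accumulation of a single projection weight along a discretized ray can be sketched as follows; the step size, ray length, and function names here are illustrative, not the authors' implementation. Each weight is the discretized LoS integral of the analytic α-derivative of one basis function.

```python
import math

def rbf_partial(x, xi, sigma, axis):
    # Closed-form axis-derivative of a Gaussian RBF centred at xi (no truncation here).
    r2 = sum((a - b) ** 2 for a, b in zip(x, xi))
    g = math.exp(-r2 / (2.0 * sigma ** 2))
    return -(x[axis] - xi[axis]) / sigma ** 2 * g

def weight(origin, direction, xi, sigma, axis, n0=1.0, ds=0.01, length=10.0):
    """Approximate (1/n0) * integral of d(g_i)/d(alpha) along a straight ray."""
    total = 0.0
    for s in range(int(length / ds)):
        t = s * ds
        x = tuple(o + t * d for o, d in zip(origin, direction))
        total += rbf_partial(x, xi, sigma, axis) * ds
    return total / n0
```

A sanity check: for a ray along the x-axis passing through the RBF centre, the x-derivative is odd along the path, so the weight integrates to approximately zero; an offset ray has a nonzero weight.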
The displacement in the image is a 2D vector $(\Delta u,\Delta v)$, whereas the deviation angle used in Eq. (11) is a 3D vector $({\varepsilon _x},{\varepsilon _y},{\varepsilon _z})$ — that is, the image displacement is decomposed into three directions of the world coordinate system, as given by Eq. (14).
Here, ${r_{ij}}$ is the element in the rotation matrix of the camera obtained by camera calibration.
3.3 Reconstruction
Equation (11) is commonly ill-posed in BOS reconstruction. Moreover, the projection matrix ${\textbf T}$ is typically large; thus, matrix-theory-based (direct inversion) methods are computationally impractical. Iterative reconstruction algorithms are widely used for such problems. Liu et al. [37] showed that ART outperforms the simultaneous iterative reconstruction technique and the Landweber algorithm in BOS reconstruction. Therefore, the ART algorithm was used in the experiments. The iterative process of ART is expressed as

$${{\textbf c}^{(i,k)}} = {{\textbf c}^{(i - 1,k)}} + \lambda \frac{{{\varepsilon _i} - {{\textbf T}_i}{{\textbf c}^{(i - 1,k)}}}}{{{{\textbf T}_i}{\textbf T}_i^\textrm{T}}}{\textbf T}_i^\textrm{T} \qquad (15)$$
Here, ${{\textbf c}^{(i,k)}}$ is the result updated by the i-th equation after the k-th iteration, and the initial guess can be filled with zero. In addition, ${\varepsilon _i}$ is the i-th component of the vector ${\mathbf \varepsilon }$, ${{\textbf T}_i}$ represents the i-th row of the projection matrix ${\textbf T}$, and $\lambda$ is the relaxation parameter.
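One full sweep of the ART (Kaczmarz) update described above can be sketched as follows; the function name and the dense row representation are illustrative only (in practice the rows of ${\textbf T}$ are sparse).

```python
def art_sweep(T, eps, c, lam=0.5):
    """One ART sweep: project c onto the hyperplane of each equation in turn.

    T is a list of rows, eps the deflection data, c the current coefficients,
    lam the relaxation parameter.
    """
    for Ti, ei in zip(T, eps):
        denom = sum(v * v for v in Ti)  # T_i T_i^T
        if denom == 0.0:
            continue
        r = lam * (ei - sum(a * b for a, b in zip(Ti, c))) / denom
        c = [cj + r * tij for cj, tij in zip(c, Ti)]
    return c
```

For a consistent system, repeated sweeps converge to the solution; the relaxation parameter trades convergence speed against noise sensitivity.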
Using ART, it is relatively easy to incorporate prior knowledge into the reconstruction process. Total variation (TV) regularization [48] was originally presented for image denoising; it can suppress noise while retaining the edges and sharpness of the image [49]. TV regularization was first used in BOST by Grauer et al. [35] and achieved excellent results. The results of ART often contain substantial noise; therefore, TV regularization is introduced into the ART reconstruction process. The TV value is expressed as

$$TV({\textbf c}) = \sum\limits_{i,j,k} {\sqrt {{{({c_{i + 1,j,k}} - {c_{i,j,k}})}^2} + {{({c_{i,j + 1,k}} - {c_{i,j,k}})}^2} + {{({c_{i,j,k + 1}} - {c_{i,j,k}})}^2}} } \qquad (16)$$
Here, $TV ({\textbf c})$ is the TV of ${\textbf c}$ in the 3D discrete form. The minimization of $TV ({\textbf c})$ can be written as
The result of ART is updated by TV regularization based on the split Bregman method [50] at each iteration, and the updated result is used as the initial value of the next ART iteration.
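The 3D discrete TV can be evaluated as in the following sketch, using forward differences with zero padding at the boundary (an assumption; the paper's exact boundary handling may differ). The field is represented as a nested list indexed `c[i][j][k]`.

```python
def tv3d(c):
    """Isotropic total variation of a 3D array (nested lists), forward differences."""
    L, W, H = len(c), len(c[0]), len(c[0][0])
    tv = 0.0
    for i in range(L):
        for j in range(W):
            for k in range(H):
                dx = c[i + 1][j][k] - c[i][j][k] if i + 1 < L else 0.0
                dy = c[i][j + 1][k] - c[i][j][k] if j + 1 < W else 0.0
                dz = c[i][j][k + 1] - c[i][j][k] if k + 1 < H else 0.0
                tv += (dx * dx + dy * dy + dz * dz) ** 0.5
    return tv
```

A constant field has zero TV, while noise raises it sharply, which is why minimizing TV suppresses noise but preserves piecewise-smooth structure.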
The overall process of the 3D BOST is shown in Fig. 4. The detailed steps are listed below.
- (1) Multicamera calibration
- (2) Capturing the undistorted image and distorted image
- (3) Extracting the displacements by a cross-correlation algorithm based on the image deformation and multigrid [51]
- (4) Calculating the BOS projection matrix based on the 3D RBF and ray tracing
- (5) Converting 2D displacements into 3D deviations
- (6) Reconstruction based on ART and TV regularization
4. Experiments and results
4.1 Numerical simulation
First, the feasibility of this method was validated by simulation data. Twelve simulated cameras were evenly distributed within a range of $165^\circ$. The distances from the camera to the probe area and background plate are 600 and 1300 mm, respectively. The temperature field used is flame-like:
There are four small flames evenly surrounding a large flame at a distance of 25 mm. The diameter and height of the large flame are 20 mm and 80 mm, respectively, and the diameter and height of each small flame are 10 mm and 40 mm, respectively. The maximum temperature is ∼1500 K. The forward projection process can be simulated according to Eqs. (1) and (2); thus, one can calculate the displacements in the imaging plane of each camera under the influence of the temperature field by ray tracing. For example, the displacements in the fourth camera are shown in Fig. 5(a).
In real experiments, errors caused by vibration, light fluctuation, displacement extraction, and image noise may influence the reconstruction. Therefore, to evaluate the influence of noise, we added 0%–20% Gaussian noise to the displacements. The mathematical expression for adding noise is as follows:

$$\Delta {u_{noise}} = \Delta u + GN \cdot NL \cdot MD \qquad (19)$$
Here, $GN$ is the Gaussian noise distributed between 0 and 1, $NL$ is the noise level, ranging from 0% to 20%, and $MD$ is the maximum displacement of the image. The displacements with different levels of Gaussian noise are shown in Fig. 5. The size of the probe area is $100 \times 100 \times 100$ mm, and it is divided into $100 \times 100 \times 100$ grids. In each imaging plane, $167 \times 167$ points were selected. Therefore, the size of the matrix ${\textbf T}$ is $1004004 \times 1000000$, and it is even larger in the subsequent real experiments. Therefore, the matrix ${\textbf T}$ is stored as a sparse matrix to reduce the memory consumption.
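Interpreting $GN$ as a standard-normal sample (an assumption; the paper's exact distribution may differ), the noise model can be sketched as:

```python
import random

def add_noise(displacements, noise_level, seed=0):
    """Perturb each displacement by GN * NL * MD.

    GN is a standard-normal sample (assumed), NL the noise level (0.0-0.2),
    and MD the maximum displacement magnitude in the image.
    """
    rng = random.Random(seed)
    md = max(abs(d) for d in displacements)
    return [d + rng.gauss(0.0, 1.0) * noise_level * md for d in displacements]
```

Scaling by the maximum displacement makes the noise level a fraction of the strongest signal, so a 20% level corrupts weak displacements severely.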
To investigate the effect of the RBF coverage on the reconstruction results, the reconstruction is performed under different RBF coverages. The coverage of the RBF is determined by the $\sigma$ in Eq. (6). Moreover, the mean relative error ($MRE$) can be calculated. The MRE is given by

$$MRE = \frac{1}{N}\sum\limits_{i = 1}^N {\frac{{|{{V^{\prime}}_i} - {V_i}|}}{{{V_i}}}} \qquad (20)$$
where $MRE$ is the mean relative error, and $V^{\prime}$ and V are the reconstructed and true values of the field, respectively. The reconstruction results are shown in Fig. 6. Reconstruction was also performed using the method of Nicolas et al. (FDM) with the same grid number. The figure shows that a small RBF coverage, as well as the FDM, produces considerable noise, with larger errors at the bottom of the flame. The FDM reconstruction at the edges is unsatisfactory (i.e., the left of the figure); therefore, its $MRE$ is relatively large. Although the $MRE$ values are comparable, a large coverage yields smoother results with fewer artifacts. However, the memory consumption also increases with the RBF coverage. Therefore, considering the reconstruction quality and memory consumption, a coverage of $7d$ or $9d$ is generally selected.

Slices of the reconstruction results are shown in Fig. 7. The figure shows that the BOS projection model can reconstruct the temperature field accurately. However, owing to the influence of Gaussian noise, there is also considerable noise in the reconstruction results of the ART algorithm. Moreover, in the BOS experiment, the displacements are relatively small; thus, the reconstruction is easily influenced by noise. TV regularization can effectively suppress this noise and smooth the results. Figure 8 shows the profiles of the reconstruction results. The results with TV regularization are smoother and less noisy; especially at the edge of the reconstruction area, the noise is effectively reduced. The figure shows that TV regularization constrains the results effectively at different noise levels. As shown in Fig. 9, the $MRE$ was calculated for noise levels from 0% to 20%. The $MRE$ under TV regularization is significantly reduced for both methods; however, the $MRE$ of our method increases more slowly as the noise level increases.
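The MRE metric used above can be computed over flattened fields as in this short sketch:

```python
def mre(reconstructed, true):
    """Mean relative error between reconstructed and true field values (flattened)."""
    assert len(reconstructed) == len(true)
    return sum(abs(v_r - v_t) / v_t for v_r, v_t in zip(reconstructed, true)) / len(true)
```

Note that the metric is relative to the true value at each point, so it is well suited to temperature fields, where values are strictly positive.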
4.2 Experimental setup and calibration
The experimental setup consisted of 15 AVT Guppy F-125B cameras mounted on an ${\sim} 160^\circ$ arc surrounding the hot air above the candle flame, as shown in Fig. 10. Each camera contained a Sony ICX445 1292 × 964 pixel sensor and was attached to a lens with a focal length of 12 mm. All cameras were synchronized by an external trigger signal with a frequency of 30 Hz. Three background plates painted with a sine wave pattern were fixed in a semicircle approximately 750 mm away from the candle, and the distance from the candle to the cameras was approximately 650 mm. The background plates were illuminated by six light-emitting diode light sources.
The main problem of this multicamera calibration is that it is impossible for all the cameras to view the calibration plate. Therefore, the multicamera calibration procedure was performed using the flexible multicamera calibration method with a rotating calibration plate [52]. The 15 cameras were divided into three groups, with five cameras in each group. A chessboard calibration plate was placed in the probe area, as shown in Fig. 11(a). When the calibration plate rotated, all cameras simultaneously captured the images of the calibration plate. Within each group, the 3D world coordinates of the rotating calibration points were calculated by multiview stereo vision. Then, by using the overlapping calibration points of the adjacent group, the three groups of world coordinate systems could be transferred to a unified world coordinate system. Finally, the positions and orientations of the cameras were calculated based on the pinhole model.
The 3D coordinates of the rotating points of the three groups are shown in Fig. 11(b). The calibration results are shown in Fig. 12. The reprojection errors of all cameras were within 0.3 pixels. The absolute mean error (AME) and standard deviation (Std) of the reprojection error along the u-axis were 0.02 and 0.0257 pixels, respectively, and the AME and Std of the reprojection error along the v-axis were 0.025 and 0.033 pixels, respectively. After the intrinsic and extrinsic parameters of the cameras have been obtained, the ray path of any point in the image can be calculated. The four calculated ray paths are shown in Fig. 12(a).
4.3 Experimental results
By comparing the distorted and undistorted images, the displacements in the imaging plane can be calculated using the cross-correlation algorithm [51]. As shown in Fig. 13, $126 \times 185$ sample points were selected in each image. The size of the probe area was $120 \times 120 \times 192$ mm, and it was divided into $200 \times 200 \times 320$ grids.
The projection matrix was calculated according to the description given in Section 3 and Nicolas et al.’s method (FDM) [32]. The reconstruction process was performed using ART and TV regularization. To validate the correctness of the method, 12 cameras (green in Fig. 10) were selected to participate in BOS reconstruction, and the other three cameras (blue in Fig. 10) were used for validation. After the reconstruction result was obtained, the reprojection displacements of the other three cameras could be calculated according to Eqs. (1) and (2) by using ray tracing. The comparisons between the measurement displacements calculated by the cross-correlation algorithm and the reprojection displacements calculated by ray tracing are shown in Fig. 14(a). The correlation coefficient (CC) was used to indicate the similarity of each pair of images. The CC of images A and B is given by

$$CC(A,B) = \frac{{{\mathop{\rm cov}} (A,B)}}{{\sqrt {{\mathop{\rm var}} (A){\mathop{\rm var}} (B)} }} \qquad (21)$$
Here, ${\mathop{\rm cov}} (A,B)$ is the covariance of images A and B, and ${\mathop{\rm var}} (A)$ is the variance of image A. The CCs of the image pairs were calculated to be 0.94, 0.91, and 0.90, which are larger than the results of FDM. The figure shows that the measurement and reprojection displacements are similar to each other. We then conducted a more detailed analysis of the measurement and reprojection displacements, using the Euclidean distance to evaluate the error between them. As shown in Fig. 14(b), the positions with larger errors are clearly visible in the figure. The error of FDM is larger in the lower part of the plume; compared with FDM, our method produces smaller errors. In addition, the distribution of the reprojection errors is shown in Fig. 16. The AME and Std of the reprojection error along the u-axis are 0.049 and 0.080 pixels, and the AME and Std along the v-axis are 0.027 and 0.041 pixels, respectively. Only a few errors are relatively large, and 90% of the errors are less than 0.05 pixels. The comparisons between our method and Nicolas et al.’s method in different frames are listed in Table 1. Slices of the results are shown in Fig. 15. The figure reveals that a complicated structure can also be reconstructed. However, we notice petal-like artifacts in the surrounding area, similar to the results in Ref. [32]; in that work, although a “3D mask” improved the accuracy, the artifacts still existed. The number of petal-like artifacts is related to the number of projection angles in linear tomography [53]. In the results of FDM, the refractive index is larger than ours in the lower part of the flow. This problem can be improved by using a “3D mask” [32]; however, the “3D mask” is not employed in our method. In addition, the gradient of the refractive index is shown in Fig. 17.
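The CC can be computed over flattened image pairs as in this short sketch:

```python
import math

def cc(A, B):
    """Pearson correlation coefficient between two equally sized images (flattened)."""
    n = len(A)
    ma, mb = sum(A) / n, sum(B) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(A, B)) / n
    var_a = sum((a - ma) ** 2 for a in A) / n
    var_b = sum((b - mb) ** 2 for b in B) / n
    return cov / math.sqrt(var_a * var_b)
```

The CC is invariant to affine rescaling of either image, so it measures the similarity of the displacement patterns rather than their absolute magnitudes.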
In this experiment, 50 frames of BOS images were captured at a frequency of 30 Hz, and BOS reconstruction was performed using the proposed method. The reconstruction results for five different frames are shown in Fig. 18. The quantitative reprojection errors were calculated as described above. The details are listed in Table 1. The reprojection errors of different frames are approximately the same; therefore, the stability of the method is verified.
5. Conclusion
A discrete projection model based on a 3D RBF was proposed for BOST. Using this method, the 3D flow field distribution can be obtained directly, without an additional integration step or an additional finite-difference matrix. First, the accuracy of the method was verified by simulation, from which an RBF coverage of $7d$ or $9d$ is recommended for BOST. Then, the refractive index of the hot air flow above a candle flame was reconstructed using 12 cameras, and the reprojection displacements in the other three cameras were calculated by ray tracing. Comparison with the measurement displacements calculated by the cross-correlation algorithm demonstrated that the reprojection and measurement displacements were similar. The method can obtain accurate results and has excellent anti-noise ability, providing a feasible approach for flow field measurements. However, many rays need to be processed; thus, the calculation of the projection matrix is time consuming, requiring approximately 23 h in the experiments even though 15 CPU threads were used. Therefore, in future research, simplification of the BOS projection matrix to reduce the time consumption and memory usage is planned. In addition, in some experimental scenarios, vibrations, water mist, or light source fluctuations may influence the experiment and have a great impact on the extraction of the BOS image displacements; displacements may be generated where there should be none. In future research, we will focus on these problems.
Funding
National Natural Science Foundation of China (62175110).
Disclosures
The authors declare no conflicts of interest.
Data Availability
The data underlying the results presented are not publicly available at this time but may be obtained from the authors upon reasonable request.
References
1. L. L. Couch, D. A. Kalin, and T. McNeal, “Experimental investigation of image degradation created by a high-velocity flow field,” in Proceedings of SPIE - The International Society for Optical Engineering (1991), pp. 417–423.
2. B. Lam and C. Guo, “Complete characterization of ultrashort optical pulses with a phase-shifting wedged reversal shearing interferometer,” Light: Sci. Appl. 7(1), 30 (2018). [CrossRef]
3. J. M. Weisberger, B. F. Bathel, G. C. Herring, G. M. Buck, S. B. Jones, and A. A. Cavone, “Multi-point line focused laser differential interferometer for high-speed flow fluctuation measurements,” Appl. Opt. 59(35), 11180–11195 (2020). [CrossRef]
4. W. SchöPf, J. C. Patterson, and A. Brooker, “Evaluation of the shadowgraph method for the convective flow in a side-heated cavity,” Exp. Fluids 21(5), 331–340 (1996). [CrossRef]
5. M. J. Hargather and G. S. Settles, “Retroreflective shadowgraph technique for large-scale flow visualization,” Appl. Opt. 48(22), 4449–4457 (2009). [CrossRef]
6. P. Krehl and S. Engemann, “August Toepler — The first who visualized shock waves,” Shock Waves 5(1-2), 1–18 (1995). [CrossRef]
7. A. K. Agrawal, N. K. Butuk, S. R. Gollahalli, and D. Griffin, “Three-dimensional rainbow schlieren tomography of a temperature field in gas flows,” Appl. Opt. 37(3), 479–485 (1998). [CrossRef]
8. P. Aleiferis, A. Charalambides, Y. Hardalupas, N. Soulopoulos, A. M. K. P. Taylor, and Y. Urata, “Schlieren-based temperature measurement inside the cylinder of an optical spark ignition and homogeneous charge compression ignition engine,” Appl. Opt. 54(14), 4566–4579 (2015). [CrossRef]
9. N. P. V. Hinsberg and T. Rösgen, “Density measurements using near-field background-oriented Schlieren,” Exp. Fluids 55(4), 1720 (2014). [CrossRef]
10. G. S. Settles, Schlieren and shadowgraph techniques: visualizing phenomena in transparent media (Springer Science & Business Media, 2001).
11. M. R. Davis, “Measurements in a subsonic turbulent jet using a quantitative schlieren technique,” J. Fluid Mech. 46(4), 631–656 (1971). [CrossRef]
12. W. L. Howes, “Rainbow schlieren and its applications,” Appl. Opt. 23(14), 2449–2460 (1984). [CrossRef]
13. A. Schwarz, “Multi-tomographic flame analysis with a schlieren apparatus,” Meas. Sci. Technol. 7(3), 406–413 (1996). [CrossRef]
14. A. Srivastava, K. Muralidhar, and P. K. Panigrahi, “Reconstruction of the concentration field around a growing KDP crystal with schlieren tomography,” Appl. Opt. 44(26), 5381–5392 (2005). [CrossRef]
15. A. Z. Nazari, Y. Ishino, F. Ito, H. Kondo, and S. Nakao, “Quantitative Schlieren Image-Noise Reduction Using Inverse Process and Multi-Path Integration,” J. Flow Control. Meas. & Vis. 8(2), 25–44 (2020). [CrossRef]
16. S. B. Dalziel, G. O. Hughes, and B. R. Sutherland, “Whole-field density measurements by ‘synthetic schlieren’,” Exp. Fluids 28(4), 322–335 (2000). [CrossRef]
17. M. Raffel, C. Tung, H. Richard, Y. Yu, and G. E. A. Meier, “Background oriented stereoscopic schlieren (BOSS) for full-scale helicopter vortex characterization,” in The Millennium 9th International Symposium on Flow Visualization (2000), p. 450.
18. G. S. Settles and M. J. Hargather, “A review of recent developments in schlieren and shadowgraph techniques,” Meas. Sci. Technol. 28(4), 042001 (2017). [CrossRef]
19. H. Richard and M. Raffel, “Principle and applications of the background oriented schlieren (BOS) method,” Meas. Sci. Technol. 12(9), 1576–1585 (2001). [CrossRef]
20. H. Richard, M. Raffel, M. Rein, J. Kompenhans, and G. E. A. Meier, “Demonstration of the applicability of a Background Oriented Schlieren (BOS) method,” in Laser Techniques for Fluid Mechanics (Springer, 2002), pp. 145–156.
21. F. Sourgen, J. Haertig, and C. Rey, “Comparison Between Background Oriented Schlieren Measurements (BOS) and Numerical Simulations,” in 24th AIAA Aerodynamic Measurement Technology and Ground Testing Conference (2004).
22. L. Venkatakrishnan and G. E. A. Meier, “Density measurements using the Background Oriented Schlieren technique,” Exp. Fluids 37(2), 237–247 (2004). [CrossRef]
23. L. Venkatakrishnan, “Density Measurements in an Axisymmetric Underexpanded Jet by Background-Oriented Schlieren Technique,” AIAA J. 43(7), 1574–1579 (2005). [CrossRef]
24. F. Sourgen, F. Leopold, and D. Klatt, “Reconstruction of the density field using the Colored Background Oriented Schlieren Technique (CBOS),” Opt. Lasers Eng. 50(1), 29–38 (2012). [CrossRef]
25. F. Leopold, M. Ota, D. Klatt, and K. Maeno, “Reconstruction of the Unsteady Supersonic Flow around a Spike Using the Colored Background Oriented Schlieren Technique,” J. Flow Control. Meas. & Vis. 1(2), 69–76 (2013). [CrossRef]
26. M. Ota, K. Hamada, and K. Maeno, “Quantitative 3D density measurement of supersonic flow by colored grid background oriented schlieren (CGBOS) technique,” in 27TH International Congress of the Aeronautical Sciences (2010), pp. 1182–1188.
27. M. Ota, K. Hamada, H. Kato, and K. Maeno, “Computed-tomographic density measurement of supersonic flow field by colored-grid background oriented schlieren (CGBOS) technique,” Meas. Sci. Technol. 22(10), 104011 (2011). [CrossRef]
28. M. Ota, H. Kato, R. Sakamoto, and K. Maeno, Quantitative Measurement and Reconstruction of 3D Density Field by CGBOS (Colored Grid Background Oriented Schlieren) Technique (Springer Berlin Heidelberg, 2012).
29. M. Ota, K. Kurihara, K. Aki, Y. Miwa, T. Inage, and K. Maeno, “Quantitative density measurement of the lateral jet/cross-flow interaction field by colored-grid background oriented schlieren (CGBOS) technique,” J. Visualization 18(3), 543–552 (2015). [CrossRef]
30. B. Atcheson, I. Ihrke, W. Heidrich, A. Tevs, D. Bradley, M. Magnor, and H.-P. Seidel, “Time-resolved 3d capture of non-stationary gas flows,” ACM Trans. Graph. 27(5), 1–9 (2008). [CrossRef]
31. M. F. Zeb, M. Ota, and K. Maeno, “Quantitative Measurement of Heat Flow in Natural Heat Convection Using Color-Stripe Background Oriented Schlieren (CSBOS) Method,” Journal of JSEM 11, s141–s146 (2011). [CrossRef]
32. F. Nicolas, V. Todoroff, A. Plyer, G. Le Besnerais, D. Donjat, F. Micheli, F. Champagnat, P. Cornic, and Y. Le Sant, “A direct approach for instantaneous 3D density field reconstruction from background-oriented schlieren (BOS) measurements,” Exp. Fluids 57(1), 13 (2016). [CrossRef]
33. F. Nicolas, D. Donjat, O. Léon, G. Le Besnerais, F. Champagnat, and F. Micheli, “3D reconstruction of a compressible flow by synchronized multi-camera BOS,” Exp. Fluids 58(5), 46 (2017). [CrossRef]
34. F. Nicolas, D. Donjat, A. Plyer, F. Champagnat, G. Le Besnerais, F. Micheli, P. Cornic, Y. Le Sant, and J. M. Deluc, “Experimental study of a co-flowing jet in ONERA’s F2 research wind tunnel by 3D background oriented schlieren,” Meas. Sci. Technol. 28(8), 085302 (2017). [CrossRef]
35. S. J. Grauer, A. Unterberger, A. Rittler, K. J. Daun, A. M. Kempf, and K. Mohri, “Instantaneous 3D flame imaging by background-oriented schlieren tomography,” Combust. Flame 196, 284–299 (2018). [CrossRef]
36. A. Aminfar, J. Cobian-Iñiguez, M. Ghasemian, N. Rosales Espitia, D. R. Weise, and M. Princevac, “Using Background-Oriented Schlieren to Visualize Convection in a Propagating Wildland Fire,” Combust. Sci. Technol. 192(12), 2259–2279 (2020). [CrossRef]
37. H. Liu, H. Jianqing, L. Li, and W. Cai, “Volumetric imaging of flame refractive index, density, and temperature using background-oriented Schlieren tomography,” Sci. China Technol. Sci. 64(1), 98–110 (2021). [CrossRef]
38. H. Liu, C. Shui, and W. Cai, “Time-resolved three-dimensional imaging of flame refractive index via endoscopic background-oriented Schlieren tomography using one single camera,” Aerosp. Sci. Technol. 97, 105621 (2020). [CrossRef]
39. G. Meier, “Computerized background-oriented schlieren,” Exp. Fluids 33(1), 181–187 (2002). [CrossRef]
40. S. J. Grauer and A. M. Steinberg, “Fast and robust volumetric refractive index measurement by unified background-oriented schlieren tomography,” Exp. Fluids 61(3), 80 (2020). [CrossRef]
41. M. Raffel, H. Richard, and G. E. A. Meier, “On the applicability of background oriented optical tomography for large scale aerodynamic investigations,” Exp. Fluids 28(5), 477–481 (2000). [CrossRef]
42. K. M. Hanson and G. W. Wecksung, “Local basis-function approach to computed tomography,” Appl. Opt. 24(23), 4028–4039 (1985). [CrossRef]
43. M. Schweiger and S. R. Arridge, “Image reconstruction in optical tomography using local basis functions,” J. Electron. Imaging 12(4), 583–593 (2003). [CrossRef]
44. C. Byrne, D. Gordon, and D. Heilper, “Models for biomedical image reconstruction based on integral approximation methods,” in 2012 9th IEEE International Symposium on Biomedical Imaging (ISBI) (2012), pp. 70–73.
45. S. Ellis and A. J. Reader, “Kernelised EM image reconstruction for dual-dataset PET studies,” in 2016 IEEE Nuclear Science Symposium, Medical Imaging Conference and Room-Temperature Semiconductor Detector Workshop (NSS/MIC/RTSD) (2016), pp. 1–3.
46. A. Biguri, H. Towsyfyan, R. Boardman, and T. Blumensath, “Numerically robust tetrahedron-based tomographic forward and backward projectors on parallel architectures,” Ultramicroscopy 214, 113016 (2020). [CrossRef]
47. E. Schairer, L. K. Kushner, and J. T. Heineck, “Measurements of Tip Vortices from a Full-Scale UH-60A Rotor by Retro- Reflective Background Oriented Schlieren and Stereo Photogrammetry,” (2013).
48. L. I. Rudin, S. Osher, and E. Fatemi, “Nonlinear total variation based noise removal algorithms,” Physica D 60(1-4), 259–268 (1992). [CrossRef]
49. D. Strong and T. Chan, “Edge-preserving and scale-dependent properties of total variation regularization,” Inverse Probl. 19(6), S165–S187 (2003). [CrossRef]
50. T. Goldstein and S. Osher, “The Split Bregman Method for L1-Regularized Problems,” SIAM J. Imaging Sci. 2(2), 323–343 (2009). [CrossRef]
51. F. Scarano, “Iterative image deformation methods in PIV,” Meas. Sci. Technol. 13(1), R1–R19 (2002). [CrossRef]
52. H. J. Cai, Y. Song, Y. Q. Shi, Z. Cao, Z. Y. Guo, Z. H. Li, and A. Z. He, “Flexible multicamera calibration method with a rotating calibration plate,” Opt. Express 28(21), 31397–31413 (2020). [CrossRef]
53. C. Wei, K. K. Schwarm, D. I. Pineda, and R. Mitchell Spearrin, “Physics-trained neural network for sparse-view volumetric laser absorption imaging of species and temperature in reacting flows,” Opt. Express 29(14), 22553–22566 (2021). [CrossRef]