
One-shot color mapping of a ray direction field for obtaining three-dimensional profiles integrating deep neural networks


Abstract

A method integrating deep neural networks (DNNs) is proposed for simultaneously and instantly obtaining both a three-dimensional (3D) surface and its inclination angle distribution from a single image captured by an imaging system equipped with a coaxial multicolor filter. The imaging system obtains the light-ray direction field in its field of view through one-shot color mapping. Light rays reflected from a 3D surface, even one with microscale height variations and a small inclination angle distribution, are assigned different colors by the imaging system depending on their directions. This enables the acquisition of the surface inclination angle distribution. Assuming a smooth and continuous 3D surface, the surface can be reconstructed from a single captured image using DNNs. The DNNs provide the height variations of the 3D surface by solving a nonlinear partial differential equation that relates the height variation to the direction of the light rays. The method is validated analytically and experimentally using microscale convex surfaces.

© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

In many manufacturing processes, products undergo surface inspection for quality control. Surface inspection is necessary to detect defects on the surfaces of products that could affect their functionality or aesthetics. Additionally, surface inspection often requires measuring the three-dimensional (3D) surface of the defects to identify their types and characteristics for a more detailed quality check. A conventional 3D surface measurement technique using an image sensor is the light-section method. This method projects a thin laser line onto the surface of an object, continuously captures multiple images during scanning, and calculates the 3D shape from the deformation of the projected line in the images. Another method projects various patterns of structured light onto the surface of an object, captures successive images with a conventional camera, and calculates the 3D surface from these images [1]. However, the processing time for multiple images may interrupt the manufacturing process.

An inclination angle of a 3D surface, defined as an angle between a normal vector on the surface and the optical axis of a camera, is also an important property to consider during inspection. For example, in metal bonding processes such as welding, soldering, and 3D printing, the contact angle of molten metal on a base plate, which corresponds to the inclination angle at the edge of the molten metal surface, serves as an indicator of adhesion and needs to be measured for quality control [2,3]. It is also necessary to inspect the contact angle of a liquid droplet on solid dry substrates in industrial processes, such as coating and inkjet printing, for quality control [4,5].

However, when the inclination angle of a 3D surface is small, the intensity contrast of the surface in an image captured by a conventional camera is reduced. Specifically, microscale 3D surfaces, including microscale defects with a typical height of less than several tens of micrometers, exhibit small inclination angle distributions. As a result, accurately measuring both the microscale 3D surfaces and their inclination angle distributions becomes difficult. Moreover, minimizing the number of images required to reconstruct the 3D surface is desirable to shorten the inspection time.

An imaging system for one-shot color mapping of a reflectance direction field using a coaxial multicolor filter can capture a clear image of the inclination angle distribution of a 3D surface, even if the 3D surface is microscale in height [6]. The reflectance direction can be described by the bidirectional reflectance distribution function (BRDF) [7–10]. The imaging system is thus called a one-shot BRDF imaging system, or for brevity, one-shot BRDF, in this paper. The BRDF is also utilized in several notable methods for obtaining 3D profiles [11–14]. The inclination angle obtained using the one-shot BRDF is an angle relative to the optical axis of the imaging lens of the system. However, reconstructing the 3D surface from the captured image is difficult because it requires solving a nonlinear partial differential equation that describes the relationship between the 3D surface and the captured image. When a stripe multicolor filter is used instead of the coaxial multicolor filter, the 3D surface can be reconstructed [15]. However, the inclination angle with respect to the optical axis cannot be obtained directly from an image captured with the stripe multicolor filter. Although the inclination angle might be obtained by calculating the gradient field from the reconstructed 3D surface, this requires additional procedures and may introduce additional errors. Therefore, it is desirable to reconstruct a 3D surface from its inclination angle distribution, which is captured directly by the one-shot BRDF with the coaxial multicolor filter.

A method is therefore proposed to simultaneously obtain both a 3D surface and its inclination angle distribution from a single image captured by the one-shot BRDF equipped with the coaxial multicolor filter, integrating equation-driven deep neural networks (DNNs). The equation-driven DNNs can construct analytical functions that satisfy specific differential structures without requiring any training data [16–18]. This formulation leads to an unsupervised learning method. Assuming that the surface is smooth and continuous, the nonlinear partial differential equation describing the relationship between the 3D surface and the captured image can be derived from geometrical optics. The DNNs are utilized to solve the differential equation and reconstruct a 3D surface from a single captured image. In this paper, this method is called OneShot3DNet (ONE-SHOT 3D reconstruction integrating neural-NETwork). It is important to note that the proposed method does not require any training data, setting it apart from traditional NNs. Additionally, the universal approximation theorem of NNs states that they can approximate any smooth function to arbitrary accuracy. These properties make the DNNs well-suited for solving complex problems governed by differential equations.

The remainder of this paper is organized as follows. First, the basic structure of the one-shot BRDF and the surface inclination angle captured by it are described. Second, the reconstruction method integrating the DNNs from a single image captured by the one-shot BRDF is described. Third, analytical verification of the reconstruction method is performed using convex surfaces that can be expressed in terms of a trigonometric function. Fourth, experimental verification of the reconstruction method is performed using a convex surface fabricated on an aluminum plate. Last, conclusions are presented.

2. Surface inclination angle distribution captured by one-shot BRDF

2.1 One-shot BRDF imaging system and light-ray direction field

The schematic basic structure of the one-shot BRDF imaging system is shown in Fig. 1. The one-shot BRDF consists mainly of an illumination system and an imaging system.

Fig. 1. Schematic basic structure of the one-shot BRDF imaging system.

The illumination system has an LED, a pinhole, and an illumination lens that converts the diverging light rays emitted from the LED into parallel light rays directed toward the surface of an object. The illumination lens is designed based on Hamiltonian optics [19–25]. The imaging system has an imaging lens and a multicolor filter consisting of multiple different transmission spectral regions. The multicolor filter is coaxial with the optical axis of the imaging lens and is placed at the focal plane of the lens, at a distance f from the principal plane of the lens. The multicolor filter is set parallel to the xy-plane in a Cartesian coordinate system, with the origin O of the coordinate system placed in the multicolor filter. The reflected light rays pass through the multicolor filter and are imaged on an image sensor, with their colors determined by their directions. A position vector, r, projected onto the multicolor filter represents the position where a light ray with an angle θ to the optical axis crosses the filter. The projected position vector r on the two-dimensional plane can be derived from geometrical optics using the focal length f and a two-dimensional angle vector, θ, with components θ_x and θ_y, as

$${\boldsymbol r} = f{\boldsymbol \theta} = f\left( \begin{array}{c} \theta_x \\ \theta_y \end{array} \right) = f\tan\theta\left( \begin{array}{c} \cos\phi \\ \sin\phi \end{array} \right),$$
where ϕ denotes the azimuth angle measured from the x-direction [26,27]. The light-ray direction angle field of θ can be obtained through the color mapping of the one-shot BRDF equipped with the coaxial multicolor filter. Note that the azimuth angle ϕ is not obtained through the color mapping.
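
As a minimal numeric sketch of Eq. (1), the following assumes the focal length f = 105 mm quoted later in Section 2.3 and illustrative values for θ and ϕ; the radius |r| determines which color ring of the filter the ray crosses.

```python
import numpy as np

# Sketch of Eq. (1): filter-plane position of a ray with direction
# angle theta (to the optical axis) and azimuth phi. The values of
# theta and phi are illustrative assumptions.
f = 105.0  # focal length in mm (value used in Section 2.3)
theta = np.deg2rad(3.0)
phi = np.deg2rad(45.0)

r = f * np.tan(theta) * np.array([np.cos(phi), np.sin(phi)])
print(r, np.linalg.norm(r))  # (r_x, r_y) in mm and the radius |r|
```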

As shown in Fig. 2, the incident light ray and the reflected light ray lie in the plane of incidence. Assuming that the surface is smooth and continuous, the reflection can be approximated as regular (specular) reflection. Therefore, a normal vector, n, at the object point determines the direction of the reflected light ray. A unit vector, e_refl, of the reflected light ray can thus be written on the basis of the law of regular reflection as

$${{\boldsymbol e}_{\textrm{refl}}} = 2({{\boldsymbol n} \cdot {{\boldsymbol e}_z}} ){\boldsymbol n} - {{\boldsymbol e}_z}, $$
where e_z denotes a unit vector in the z-direction.
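
A short sketch of Eq. (2), assuming the illumination travels along the optical axis as in Fig. 1; the example normal corresponds to a surface with a small slope, as in Eq. (3) below:

```python
import numpy as np

# Sketch of Eq. (2): unit vector of the regularly (specularly)
# reflected ray for a unit surface normal n.
def reflect(n):
    e_z = np.array([0.0, 0.0, 1.0])
    return 2 * np.dot(n, e_z) * n - e_z

# Example normal for a small slope dh/dx = 0.01 (cf. Eq. (3)).
n = np.array([-0.01, 0.0, 1.0])
n /= np.linalg.norm(n)
print(reflect(n))  # tilted by roughly twice the surface inclination
```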

Fig. 2. Plane of incidence containing a light ray reflected from an object point on a smooth and continuous surface.

2.2 Relationship between light-ray direction angle and 3D surface

A height distribution of a 3D surface can be written as a function h of x and y. As shown in Fig. 2, an object point is set to (x, y, h + z_0), where z = z_0 is a reference plane and z_0 takes a negative value. The surface normal vector, n, can then be written with h as

$${\boldsymbol n} = \frac{1}{\sqrt{1 + (\partial_x h)^2 + (\partial_y h)^2}}\left( \begin{array}{c} -\partial_x h \\ -\partial_y h \\ 1 \end{array} \right),$$
where the partial derivative with respect to x is denoted by appending a subscript x to the partial derivative symbol. The angle vector θ can thus be derived with the height h using Eqs. (1)–(3) as
$${\boldsymbol \theta} = \left( \begin{array}{c} \theta_x \\ \theta_y \end{array} \right) = \frac{1}{1 - (\partial_x h)^2 - (\partial_y h)^2}\left( \begin{array}{c} -2\partial_x h \\ -2\partial_y h \end{array} \right).$$
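
For completeness, substituting Eq. (3) into Eq. (2) gives the reflected unit vector in component form,

$${{\boldsymbol e}_{\textrm{refl}}} = \frac{1}{1 + (\partial_x h)^2 + (\partial_y h)^2}\left( \begin{array}{c} -2\partial_x h \\ -2\partial_y h \\ 1 - (\partial_x h)^2 - (\partial_y h)^2 \end{array} \right),$$

and, by Eq. (1), the angle vector θ is the ratio of the transverse components of e_refl to its axial component, which yields Eq. (4).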

Equation (4) can be further transformed using Eq. (1) as

$$(\partial_x h)^2 + (\partial_y h)^2 = \left( \frac{1 - \sqrt{1 + \tan^2\theta}}{\tan\theta} \right)^2 = \tan^2\frac{\theta}{2}.$$
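
The second equality in Eq. (5) follows from the half-angle identity

$$\frac{1 - \sqrt{1 + \tan^2\theta}}{\tan\theta} = \frac{1 - \sec\theta}{\tan\theta} = \frac{\cos\theta - 1}{\sin\theta} = -\tan\frac{\theta}{2},$$

whose square is tan²(θ/2).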

Equation (5) is a nonlinear partial differential equation and is generally difficult to solve analytically or numerically. Note that the ray direction angle, θ, in Eq. (5) can be obtained by the one-shot BRDF. The inclination angle of the surface, α, which represents the angle between the surface normal vector and the optical axis, can be written with the ray direction angle, θ, based on Eq. (5) as

$$\alpha = \frac{\theta }{2}. $$

2.3 One-shot color mapping of light-ray direction field for microscale convex surface

Figure 3(a) shows a picture of an aluminum disc captured by a conventional camera under ordinary ambient illumination. The disc has a microscale convex surface with height variations at a micrometer scale on its flat surface. Figure 3(b) shows the 3D surface profile of the microscale convex, measured locally over several minutes by a scanning white light interferometric microscope (ZYGO) [28]. The maximum height of the convex surface is 37 µm, and the surface inclination angle, α, of the convex surface is less than 3.0 degrees.

Fig. 3. (a) Picture of an aluminum disc captured by a conventional camera under ordinary ambient illumination. The disc has a microscale convex surface with height variations at a micrometer scale on its flat surface. (b) 3D surface profile of the microscale convex, measured locally over several minutes by a scanning white light interferometric microscope (ZYGO). The maximum height of the convex surface is 37 µm, and the surface inclination angle, α, is less than 3.0 degrees.

Figure 4 shows (a) an unprocessed image of the microscale convex surface captured by the one-shot BRDF and (b) the surface inclination angle distribution derived from the image in (a). In Fig. 4(a), a photograph of the coaxial multicolor filter used for capturing the image is also shown. In this work, the focal length f of the imaging lens (Nikon, AF–S NIKKOR) is set to 105 mm. The focal length of the illumination lens is set to 80 mm. The pinhole size is set to 400 µm. The multicolor filter has 22 color variations of red, blue, green, and yellow, sequenced from the center to the outside with an outermost radius of 11 mm. The image captured by the one-shot BRDF reveals that the convex surface can be clearly detected by color contrast. The pixel size of the image is 4.7 µm. In Fig. 4(b), the surface inclination angle distribution, calculated using Eq. (6) from the ray direction angle θ, is shown. The θ value is obtained from the captured image shown in Fig. 4(a) based on a relationship between hue [29–34] and the direction angle. This relationship is measured using a flat aluminum plate by varying its inclination angle with a goniometric device.
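
The paper does not give the conversion procedure in code; the following is a hedged sketch, assuming the goniometric calibration yields hue-angle pairs (hue_cal, theta_cal) that are monotonic in hue, so each pixel's hue can be interpolated to θ and Eq. (6) then gives α:

```python
import numpy as np

# Hypothetical hue-to-inclination-angle conversion. hue_img is the
# per-pixel hue of the captured image; hue_cal and theta_cal are the
# calibration samples from the goniometric measurement (assumed
# sorted by hue). np.interp maps hue to the ray direction angle theta.
def hue_to_alpha(hue_img, hue_cal, theta_cal):
    theta = np.interp(hue_img.ravel(), hue_cal, theta_cal)
    alpha = theta / 2.0  # Eq. (6)
    return alpha.reshape(hue_img.shape)
```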

Fig. 4. (a) Unprocessed image of the microscale convex surface captured by the one-shot BRDF and a photograph of the multicolor filter used for this imaging. (b) Surface inclination angle distribution calculated using Eq. (6) from the ray direction angle θ, where θ is obtained from the captured image in (a) based on a relationship between hue and direction angle.

3. 3D surface reconstruction using OneShot3DNet

Although Eq. (5) is difficult to solve analytically because it is a nonlinear partial differential equation, it can be solved using the equation-driven DNNs as follows. The equation-driven DNNs can construct analytical functions that satisfy specific differential structures without requiring training data. A loss function, derived from both the input spatial coordinates and intermediate parameters generated by the DNNs, serves as the sole evaluation value for optimizing the DNNs, making the training process entirely data-free. This formulation leads to an unsupervised learning method. It is important to note that the proposed method does not require training data. Additionally, the universal approximation theorem of neural networks states that they can approximate any smooth function to arbitrary accuracy.

For a 3D surface represented by the height h, a total loss G is defined here as the mean of the squared discretized residuals of Eq. (5):

$$G = \frac{1}{M}\sum_{i = 1}^{M} \left( \left( \frac{\Delta h_i}{\Delta x_i} \right)^2 + \left( \frac{\Delta h_i}{\Delta y_i} \right)^2 - \tan^2\left( \frac{\theta_i}{2} \right) \right)^2,$$
where i (i = 1, …, M) is the identification number of each grid point, appended as a subscript. Equation (5) can then be written as G = 0 in the form of a discretized equation. Once a light-ray direction field is captured by the one-shot BRDF, the height distribution h_i on the 3D surface can be obtained by driving the total loss G toward zero. This 3D reconstruction method is called OneShot3DNet in this paper. As shown in Fig. 5, the OneShot3DNet iteratively reduces the total loss G toward zero by using the error backpropagation algorithm of the DNNs. A set of two-dimensional discretized positions, (x_1, y_1), …, (x_M, y_M), is provided to the DNNs as input. A set of corresponding light-ray direction angles, θ_1, …, θ_M, is also provided to calculate the total loss G. The iteration is counted by an iteration number l. The discretized height distribution h_i is taken as output. In this work, the DNNs comprise a single input layer, three hidden layers, and a single output layer. Each hidden layer is populated with 100 neurons. The activation function of the hidden layers is a hyperbolic tangent function, and the output layer is scaled by a sigmoid function. The DNNs are implemented using PyTorch [35]. The total grid number M is set to 65536, with 256 grid points in both the x- and y-directions.
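
The paper specifies the architecture but not the code; the following is a minimal PyTorch sketch under stated assumptions: automatic differentiation stands in for the finite differences Δh_i/Δx_i in Eq. (7), and the sigmoid output scale h_max, the learning rate, and the choice of the Adam optimizer are hypothetical, not given in the text.

```python
import torch
import torch.nn as nn

class OneShot3DNet(nn.Module):
    """Sketch of the network described above: one input layer, three
    hidden layers of 100 tanh neurons, and a sigmoid-scaled output.
    h_max is a hypothetical height scale, not given in the paper."""
    def __init__(self, h_max=50e-6):
        super().__init__()
        self.h_max = h_max
        self.net = nn.Sequential(
            nn.Linear(2, 100), nn.Tanh(),
            nn.Linear(100, 100), nn.Tanh(),
            nn.Linear(100, 100), nn.Tanh(),
            nn.Linear(100, 1),
        )

    def forward(self, xy):
        return self.h_max * torch.sigmoid(self.net(xy))

def total_loss(model, xy, theta):
    """Eq. (7), with autograd gradients in place of Dh/Dx, Dh/Dy."""
    xy = xy.clone().requires_grad_(True)
    h = model(xy)
    grads = torch.autograd.grad(h.sum(), xy, create_graph=True)[0]
    residual = grads[:, 0] ** 2 + grads[:, 1] ** 2 - torch.tan(theta / 2) ** 2
    return (residual ** 2).mean()

# Training-loop sketch; xy is the (M, 2) grid of positions and theta
# the (M,) measured ray direction angles from the one-shot BRDF.
# model = OneShot3DNet()
# opt = torch.optim.Adam(model.parameters(), lr=1e-3)
# for l in range(20000):
#     opt.zero_grad()
#     G = total_loss(model, xy, theta)
#     G.backward()
#     opt.step()
```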

Fig. 5. Schematic view of OneShot3DNet. The OneShot3DNet iteratively reduces a total loss G toward zero by using the error backpropagation algorithm of DNNs. A set of two-dimensional discretized positions, (x_1, y_1), …, (x_M, y_M), is provided to the DNNs as input. A set of corresponding light-ray direction angles, θ_1, …, θ_M, is also provided to calculate the total loss G. The iteration is counted by an iteration number l. The discretized height distribution h_i is taken as output.

4. Analytical results for microscale trigonometric surfaces

The OneShot3DNet can be analytically verified using a microscale 3D surface, taking a trigonometric convex surface as one example, as follows. Note that this trigonometric convex surface is not used as training data for optimizing the OneShot3DNet but rather for demonstrating its performance, since the OneShot3DNet does not require any training data.

A height distribution, h, of the microscale 3D convex surface can be represented using a trigonometric function as

$$h(x,y) = \frac{\alpha_0 R_0}{2\pi}\left( \cos 2\pi\sqrt{\left( \frac{x}{R_0} \right)^2 + \left( \frac{y}{R_0} \right)^2} + 1 \right),$$
where α_0 and R_0 are constants, and the domain region is set as
$$2\pi\sqrt{\left( \frac{x}{R_0} \right)^2 + \left( \frac{y}{R_0} \right)^2} \le \pi.$$

Outside the domain region, the height h is set to 0. The inclination angle of the convex surface can thus be derived from Eqs. (5), (6) and (8) as

$$\tan\alpha = \sqrt{(\partial_x h)^2 + (\partial_y h)^2} = \alpha_0 \sin 2\pi\sqrt{\left( \frac{x}{R_0} \right)^2 + \left( \frac{y}{R_0} \right)^2}.$$

Note that the surface inclination angle, α, is defined as a positive value. For modeling a typical microscale defect with a small inclination angle distribution, α is set to at most 10 degrees, with R_0 set to 2 mm. In this case, the maximum inclination angle can be approximated by α_0, according to Eq. (10).
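
As a sketch of how such a test case can be generated, the following evaluates Eqs. (8)–(10) and Eq. (6) on a grid; the grid extent and sampling are illustrative assumptions (the domain of Eq. (9) corresponds to a radius of R_0/2):

```python
import numpy as np

# Synthetic trigonometric convex surface of Eqs. (8)-(10).
alpha0 = np.deg2rad(10.0)  # maximum inclination angle
R0 = 2.0                   # mm, as in the text
n = 256                    # grid points per axis, as in Section 3

x = np.linspace(-R0 / 2, R0 / 2, n)
X, Y = np.meshgrid(x, x)
rho = 2 * np.pi * np.sqrt((X / R0) ** 2 + (Y / R0) ** 2)
inside = rho <= np.pi  # domain region of Eq. (9)

# Height of Eq. (8), zero outside the domain.
h = np.where(inside, alpha0 * R0 / (2 * np.pi) * (np.cos(rho) + 1), 0.0)

# Inclination angle from Eq. (10), ray direction angle from Eq. (6).
alpha = np.arctan(np.where(inside, alpha0 * np.sin(rho), 0.0))
theta = 2 * alpha
```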

Figure 6 shows the reconstruction results using the OneShot3DNet for the microscale convex surfaces. The convex surfaces have maximum inclination angles, α_0, of 1 and 10 degrees, corresponding to maximum heights of 11 and 111 µm, respectively. Figure 6(a) shows the reconstructed surfaces in the xy-plane with an iteration number l of 20000 for α_0 of 1 and 10 degrees. Figure 6(b) shows the differences between the reconstructed surfaces and the ideal surfaces for α_0 of 1 and 10 degrees. Although the sign of the height value can be reversed because of the symmetry between positive and negative solutions of Eq. (5), a positive sign is chosen here. An error is defined as the ratio of the difference between a reconstructed value and an ideal value to the maximum height among the ideal values. The maximum errors for α_0 of 1 and 10 degrees are 2.3 and 1.0 percent, respectively. Therefore, the results show that the reconstructed surfaces agree well with the ideal surfaces.
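
For reference, the error definition above amounts to the following computation, continuing the sketch after Eq. (10) (h_rec is the hypothetical reconstructed height field and h the ideal one from Eq. (8)):

```python
import numpy as np

# Pointwise error in percent, normalized by the maximum ideal height.
error_percent = 100.0 * (h_rec - h) / h.max()
print(np.abs(error_percent).max())  # text reports 2.3% for alpha0 = 1 degree
```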

Fig. 6. Analytical 3D reconstruction results using the OneShot3DNet for the microscale convex surfaces with maximum inclination angles, α_0, of 1 and 10 degrees, where the corresponding maximum heights are 11 and 111 µm, respectively. (a) Reconstructed surfaces in the xy-plane with iteration number l of 20000 for α_0 of 1 and 10 degrees. (b) Differences between the reconstructed and ideal surfaces for α_0 of 1 and 10 degrees.

5. Experimental result for microscale aluminum convex surface

The OneShot3DNet can also be experimentally verified using a microscale 3D surface, taking the microscale aluminum convex surface shown in Fig. 3 as one example. Note that the OneShot3DNet does not require any training data.

Figure 7 shows experimental 3D reconstruction results using the OneShot3DNet for the microscale aluminum convex surface. The surface inclination angle distribution α, converted using Eq. (6) from the ray direction angle field captured by the one-shot BRDF, is shown in Fig. 4(b). Figure 7(a) shows the reconstruction results in the xy-plane using the OneShot3DNet with iteration numbers l of 100, 500, 1000, and 10000. Figure 7(b) shows a side view of the reconstructed 3D surface with an iteration number l of 20000 and a surface measured locally over several minutes by the scanning white light interferometric microscope (ZYGO). An error is defined as the ratio of the difference between a reconstructed value and a value measured by ZYGO to the maximum height among the measured values. The maximum error is 7 percent. This reveals that the reconstructed 3D surface agrees well with that measured by ZYGO. The pixel size of the reconstructed 3D surface, namely the x- and y-resolution, is 4.7 µm.

Fig. 7. Experimental 3D reconstruction results using the OneShot3DNet for the microscale convex surface shown in Fig. 3. (a) Reconstruction results in the xy-plane with iteration number l of 100, 500, 1000, and 10000. (b) Side view of the reconstructed 3D surface with iteration number l of 20000 and a surface measured by the scanning white light interferometric microscope (ZYGO).

6. Conclusions

A method integrating DNNs is proposed for simultaneously obtaining both a 3D surface and its inclination angle distribution from a single image captured by the one-shot BRDF. The one-shot BRDF can color-map a reflectance direction angle field in one shot using a coaxial multicolor filter. The inclination angle distribution of a 3D surface with respect to the optical axis can be obtained using Eq. (6) from the captured image, even if the 3D surface has microscale height variations. A reconstruction method for 3D surfaces, namely, the OneShot3DNet, is constructed using DNNs to solve the nonlinear partial differential equation of Eq. (5) that describes the relationship between a 3D surface and a single image captured by the one-shot BRDF. Note that the proposed method does not require training data, setting it apart from traditional NNs. Analytical results show that the OneShot3DNet can successfully reconstruct trigonometric microscale convex 3D surfaces from single images. Furthermore, experimental results demonstrate that the OneShot3DNet can practically reconstruct a microscale convex 3D surface fabricated on an aluminum plate. The OneShot3DNet is considered to have potential for various optical inspections in manufacturing processes.

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. J. Geng, “Structured-light 3D surface imaging: a tutorial,” Adv. Opt. Photon. 3(2), 128–160 (2011).

2. H. Ohno, Y. Shiomi, S. Tsuno, and M. Sasaki, “Design of a transmissive optical system of a laser metal deposition three-dimensional printer with metal powder,” Appl. Opt. 58(15), 4127–4138 (2019).

3. G. Dutra, J. Canning, W. Padden, C. Martelli, and S. Dligatch, “Large area optical mapping of surface contact angle,” Opt. Express 25(18), 21127–21144 (2017).

4. M. Q. Santiago, A. A. Castrejón-Pita, and J. R. Castrejon-Pita, “The effect of surface roughness on the contact line and splashing dynamics of impacting droplets,” Sci. Rep. 9(1), 15030 (2019).

5. D. Luo, L. Qian, L. Dong, P. Shao, Z. Yue, J. Wang, B. Shi, S. Wu, and Y. Qin, “Simultaneous measurement of liquid surface tension and contact angle by light reflection,” Opt. Express 27(12), 16703–16712 (2019).

6. H. Ohno and T. Kamikawa, “One-shot BRDF imaging system to obtain surface properties,” Opt. Rev. 28(6), 655–661 (2021).

7. B. K. P. Horn and R. W. Sjoberg, “Calculating the reflectance map,” Appl. Opt. 18(11), 1770–1779 (1979).

8. L. Simonot and G. Obein, “Geometrical considerations in analyzing isotropic or anisotropic surface reflections,” Appl. Opt. 46(14), 2615–2623 (2007).

9. I. G. E. Renhorn and G. D. Boreman, “Analytical fitting model for rough-surface BRDF,” Opt. Express 16(17), 12892–12898 (2008).

10. S. D. Butler, S. E. Nauyoks, and M. A. Marciniak, “Experimental analysis of bidirectional reflectance distribution function cross section conversion term in direction cosine space,” Opt. Lett. 40(11), 2445–2448 (2015).

11. R. J. Woodham, “Photometric method for determining surface orientation from multiple images,” Opt. Eng. 19(1), 139–144 (1980).

12. R. J. Woodham, “Gradient and curvature from the photometric-stereo method, including local confidence estimation,” J. Opt. Soc. Am. A 11(11), 3050–3068 (1994).

13. K. Ikeuchi, “Determining a depth map using a dual photometric stereo,” Int. J. Robot. Res. 6(1), 15–31 (1987).

14. M. Saito, Y. Sato, K. Ikeuchi, and H. Kashiwagi, “Measurement of surface orientations of transparent objects by use of polarization in highlight,” J. Opt. Soc. Am. A 16(9), 2286–2293 (1999).

15. H. Ohno, “One-shot three-dimensional measurement method with the color mapping of light direction,” OSA Continuum 4(3), 840–848 (2021).

16. M. Mattheakis, D. Sondak, A. S. Dogra, and P. Protopapas, “Hamiltonian neural networks for solving equations of motion,” Phys. Rev. E 105(6), 065305 (2022).

17. H. Ohno and T. Usui, “Neural network gradient-index mapping,” OSA Continuum 4(10), 2543–2551 (2021).

18. H. Ohno and T. Usui, “Points-connecting neural network ray tracing,” Opt. Lett. 46(17), 4116–4119 (2021).

19. H. Ohno, “Symplectic ray tracing based on Hamiltonian optics in gradient-index media,” J. Opt. Soc. Am. A 37(3), 411–416 (2020).

20. H. Ohno, K. Nakagawa, and T. Kamikawa, “Design of secondary light source for reflectors with axisymmetric light guide,” Appl. Opt. 58(14), 3848–3855 (2019).

21. H. Ohno and M. Kato, “Total internal reflection shell for light-emitting diode bulbs,” Appl. Opt. 58(1), 87–93 (2019).

22. H. Ohno, “Design of a coaxial light guide producing a wide-angle light distribution,” Appl. Opt. 56(14), 3977–3983 (2017).

23. H. Ohno and T. Usui, “Gradient-index dark hole based on conformal mapping with etendue conservation,” Opt. Express 27(13), 18493–18507 (2019).

24. H. Ohno, “Ghost secondary light source for LED collimated illumination,” Appl. Opt. 59(33), 10339–10344 (2020).

25. H. Ohno, “Multi-parabolic illuminator to combine perpendicular collimated illuminations with an LED source,” OSA Continuum 4(8), 2154–2163 (2021).

26. H. Ohno and K. Toya, “Localized gradient-index field reconstruction using background-oriented schlieren,” Appl. Opt. 58(28), 7795–7804 (2019).

27. H. Ohno and K. Toya, “Scalar potential reconstruction method of axisymmetric 3D refractive index fields with background-oriented schlieren,” Opt. Express 27(5), 5990–6002 (2019).

28. ZYGO Corporation, https://www.zygo.com/?/met/profilers/newview9000/

29. W. L. Howes, “Rainbow schlieren and its applications,” Appl. Opt. 23(14), 2449–2460 (1984).

30. H. Ohno, “One-shot color mapping imaging system of light direction extracted from a surface BRDF,” OSA Continuum 3(12), 3343–3350 (2020).

31. H. Ohno and H. Kano, “Depth reconstruction with coaxial multi-wavelength aperture telecentric optical system,” Opt. Express 26(20), 25880–25891 (2018).

32. H. Ohno and H. Kano, “Dual coaxial lens system for depth reconstruction,” Opt. Rev. 26(5), 500–506 (2019).

33. H. Ohno, A. Ohno, and H. Okano, “Imaging technology to immediately obtain clear images of microdefects,” Toshiba Review 76(6), 38–41 (2021).

34. H. Ohno, “Method for instant measurement of 3D microdefect shapes using optical imaging system capable of color mapping of light direction,” Toshiba Review 77, 44–47 (2022).

35. PyTorch, https://pytorch.org/

