
Multidimension-multiplexed full-phase-encoding holography


Abstract

I propose a multidimension-multiplexed imaging method with which multiple physical quantities of light are simultaneously obtained as interference fringe images. The varieties of light are distinguished by exploiting the proposed phase-encoding technique. Neither measurements of point spread functions in advance, nor iterative calculations to derive multidimensional information, nor a laser light source is required. Multidimensional imaging of an object and simultaneous three-dimensional image recording of self-luminous light and light transmitted from an object are experimentally demonstrated. A palm-sized interferometer based on the proposed holography is developed for the experiments to show its portability and physical-filter-free multidimensional imaging ability without an antivibration structure.

© 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Light contains multiple physical quantities. Physical quantities such as the amplitude, phase, wavelength, and polarization of light provide multidimensional information such as the transmittance and/or reflectance, three-dimensional (3D) position and shape, refractive index, and spectral and polarimetric properties of an object, phenomenon, or scene. Multidimensional information has been exploited to clarify a variety of characteristics of objects. For example, 3D information is essential for measuring and recognizing the structures of objects and is required in both science and industry, including microscopy [1], product inspection, measurement of a field, and machine vision. Wavelength information is important for identifying the compositions and conditions of objects. Multicolor staining of cells is frequently used in fluorescence microscopy and bright-field microscopy to observe and analyze molecules with different compositions. Even in our daily lives, we assess the physical condition of people from their complexion. Furthermore, the identification of a variety of light waves such as natural and artificial illumination light, self-luminous light, and nonlinear light stimulated by illumination light can be used in the advanced analysis of a measured object. Simultaneous measurement of fluorescence and phase information plays an important role in microscopy [2,3] for the understanding of the structures of specimens and the investigation of the correlation between the fluorescence and phase images of specimens.

Research on multidimensional imaging has been carried out to achieve the sensing of a wealth of physical information. 3D imaging has been conducted with tomography [1], a stereo method, a light-field method, point-spread-function (PSF) engineering with structured light illumination, a spatial light modulator (SLM) [4], an amplitude mask [5], a scattering medium [6,7], and interferometry [8–12]. Wavelength information is simultaneously acquired by adopting color filters [13], heterodyne techniques with temporal [14,15] and spatial [10,16] carriers, PSF engineering [17–19], and encoding with wavelength-dependent phase shifts [20]. Most multidimensional imaging techniques require color and polarization filters to record wavelength and polarimetric information, respectively. In these techniques, the recordable wavelength bands and the extinction ratio of the imaging systems depend on the optical filters. State-of-the-art manufacturing technologies are required to improve the specifications of these imaging techniques. In contrast, multidimension-multiplexed imaging, such as PSF engineering and interferometry, enables us to conduct physical-filter-free multidimensional imaging. PSF engineering modulates complex amplitude distributions by inserting optical element(s) between a light source and objects or between an image sensor and objects. A diffractive or scattering medium is frequently adopted as an optical element to introduce independent PSFs for different physical quantities. Interferometry exploits the modulation of the spatial or temporal frequency or the phase of an interference fringe image to enable multiplexed recording of multidimensional information. Using these multiplexed imaging techniques, multiwavelength 3D imaging [14,15,20], filter-free polarimetric imaging [21], multiframe imaging with a single recorded image [22], and simultaneous imaging of fluorescence light and phase information [2,23] have been performed.
Such multiplexed imaging indicates that a computational optical technique can avoid the high requirements for the specifications of optical filters. However, these multidimension-multiplexed imaging techniques have problems. PSF engineering generally requires the PSFs for 3D space and the other physical quantities a priori. Otherwise, iterative procedures with a high computational cost and long calculation times are required to retrieve multidimensional information. Moreover, it is difficult to distinguish the varieties of light unless each variety is tagged by a physical quantity such as wavelength. Therefore, multiplexed 3D recording of self-luminous light and light transmitted or reflected from objects has been difficult, particularly when the wavelength bands of the respective light overlap.

I propose a multidimension-multiplexed imaging technique in which neither measurements of PSFs in advance, nor iterative calculations to derive multidimensional information, nor a laser light source is required. I exploit self-interference incoherent digital holography (SIDH) [13,14,24–34] and phase-shifting interferometry (PSI) [8,20] to achieve multidimensional imaging even for spatially and temporally incoherent light. Multidimensional information of different varieties of light is fully encoded in multiplexed holograms by introducing different phase shifts for different physical quantities and varieties of light. Both a 3D image and other physical quantities are simultaneously obtained with incoherent light and a single image sensor, and 3D information of self-luminous light and light transmitted or reflected from objects is obtained with multiplexed images even when the same wavelength is contained in different varieties of light. Hereafter, I term the proposed technique multidimension-multiplexed full-phase-encoding holography (MPH). Furthermore, I develop a palm-sized interferometer based on MPH to show the portability of the optical system and multidimensional imaging ability without an antivibration table. Experiments are conducted on a wood table to show that the optical system based on MPH works well.

2. Multidimension-multiplexed full-phase-encoding holography (MPH)

Figure 1 shows the basic concept of MPH. Figures 1(a) and 1(b) illustrate the recording system and the image-reconstruction procedures, respectively. The main feature of MPH is that phase shifts are exploited to both extract different physical quantities and distinguish the varieties of light. A phase modulator is set in an SIDH system, as shown in Fig. 1(a), to encode multidimensional information of different varieties of light simultaneously in the recording of multiplexed holograms. PSI is utilized to separately extract multiple physical quantities and varieties of the light from the multiplexed images. As shown in Fig. 1(a), both reflection/transmission light coming from an object and self-luminous light are recorded as a multiplexed digital hologram. Spatially incoherent illumination light such as light-emitting diode (LED) light and natural light can be applied to obtain an object wave, which is generated from an object and contains a wealth of physical information of the object. The object waves containing multiple varieties of light propagate to an SIDH system. Both 3D information and other physical quantities of the object waves are converted to an interference fringe image termed a “hologram” by the SIDH system. The multiple physical quantities and varieties of light are multiplexed in the digital hologram. The phase shifts are introduced by the phase modulator, and multidimension-multiplexed phase-shifted self-interference holograms are recorded. Using PSI with the phase-shift information for the different physical quantities and varieties of light, multidimensional information in the object wave on the image sensor plane is separately extracted as illustrated in Fig. 1(b). Wave-propagation calculations such as diffraction integrals are applied to these images, and multidimensional images for the different varieties of light are reconstructed. 
To record quantitative phase information of an object, adopting self-reference holography [35] in MPH is considered as one approach.


Fig. 1. Basic concept of MPH. (a) Schematic of recording multidimensional information. (b) Flow of image-reconstruction procedures.


Figure 2 illustrates an optical implementation of MPH for multiwavelength 3D imaging of different varieties of light. The SIDH system shown in Fig. 2 adopts linear polarizers and birefringent materials to generate a self-interference hologram of spatially incoherent light. I constructed this compact optical system, which can record a digital hologram, as an implementation of MPH, although polarization information of an object wave is lost. Figure 3 illustrates the optical geometry of the SIDH system and the polarization transition of the object wave. An object is regarded as the summation of multiple object points in SIDH [13,14,24–34]. A spherical light wave diffracted from a point u(xo, yo, zo, k) of an object, where k = 2π/λ is the wavenumber and λ is the wavelength, passes through a polarizer and is then linearly polarized along the 45° direction. A birefringent lens located at z = zl generates two waves from the linearly polarized light according to the polarization direction. This is because the birefringent lens has different focal lengths f1 and f2 for the vertical and horizontal directions, respectively. After that, the phase modulator shifts the phase of one of the two light waves. The two light waves are collected before they diverge. The two collected light points ut1(x1, y1, zt1, k) and ut2(x2, y2, zt2, k) are written as

$${u_{t1}}({x_1},{y_1},{z_{t1}},k) = {c_1}{u_o}\left( { - \frac{{{b_1}}}{a}{x_o}, - \frac{{{b_1}}}{a}{y_o},a + {b_1},k} \right),$$
$${u_{t2}}({x_2},{y_2},{z_{t2}},k) = {c_2}{u_o}\left( { - \frac{{{b_2}}}{a}{x_o}, - \frac{{{b_2}}}{a}{y_o},a + {b_2},k} \right),$$
where
$$a = {z_l} - {z_o}\textrm{,}$$
$${b_j}\textrm{ = }\frac{{a{f_j}}}{{a - {f_j}}}\textrm{, }(j = 1,2),$$
and c1 and c2 are coefficients. These divergent waves generated from the two light points propagate to the birefringent plate, and the plate introduces a shift-invariant optical-path-length shift. The optical-path-length difference between the two waves, which is generated by the birefringent lens, is adjusted by the plate. This compensation is required when temporally incoherent light is recorded and the coherence length is severely limited. Here I assume that the central wavelength and the full width at half-maximum (FWHM) are 451 nm and 65 nm, respectively. Then, the coherence length is estimated as 3.11 µm. On the other hand, the optical-path-length difference for the ray on the optical axis is 36.4 µm for a birefringent lens whose thickness and refractive indices for the orthogonal polarizations are 4 mm, 1.5462, and 1.5553, respectively. Therefore, such a difference is adjusted using a plate of adequate thickness. The points are shifted along the depth direction by the plate and expressed as
$${u_1}({x_1},{y_1},{z_1},k) = {u_{t1}}({x_1},{y_1},{z_{t1}} + {z_{d1}},k),$$
$${u_2}({x_2},{y_2},{z_2},k) = {u_{t2}}({x_2},{y_2},{z_{t2}} + {z_{d2}},k),$$
where
$${z_{dj}}\textrm{ = }\frac{d}{{1 - (1/{n_j})}},$$
d is the thickness of the birefringent plate, θ is the maximum incident angle to the plate, which is limited by the lens, and nj is the refractive index of the plate for polarization direction j. Here, it should be noted that Eqs. (1)–(6) are described by the principle of paraxial imaging based on geometrical optics. The polarization directions of the two light waves are orthogonal and are aligned by the polarizer placed in front of the monochrome image sensor. These two waves interfere on an image sensor placed at depth z = zh, which records the interference fringe image. The hologram of the spatially incoherent point light source I(x,y) is expressed by the following equation, which considers the spectral bandwidth of light:
$$I(x,y)= c\int_{ - \infty }^\infty {|{u_{1zh}}(x,y;{x_1},{y_1},{z_1};k) + {u_{2zh}}(x,y;{x_2},{y_2},{z_2};k){|^2}S(k)dk,}$$
$$\begin{aligned}{u_{1zh}}(x,y;{x_1},{y_1},{z_1};k) &= {u_1}({x_1},{y_1},{z_1};k)\exp [ik{r_1}(x,y;{x_1},{y_1},{z_1})]\\ & = |{u_1}({x_1},{y_1},{z_1};k)| \exp [i{\phi _1}({x_1},{y_1},{z_1};k)]\exp [ik\sqrt {{{(x - {x_1})}^2} + {{(y - {y_1})}^2} + {{(z{}_h - {z_1})}^2}} ], \end{aligned}$$
$$\begin{aligned}{u_{2zh}}(x,y;{x_2},{y_2},{z_2};k) &= {u_2}({x_2},{y_2},{z_2};k)\exp [ik{r_2}(x,y;{x_2},{y_2},{z_2}) + i{\phi _{pm}}(k)]\\&= |{u_2}({x_2},{y_2},{z_2};k)| \exp [i{\phi _2}({x_2},{y_2},{z_2};k)]\exp [ik\sqrt {{{(x - {x_2})}^2} + {{(y - {y_2})}^2} + {{(z{}_h - {z_2})}^2}} + i{\phi _{pm}}(k)],\end{aligned}$$
where
$${r_j}(x,y;{x_j},{y_j},{z_j})\textrm{ = }\sqrt {{{(x - {x_j})}^2} + {{(y - {y_j})}^2} + {{({z_h} - {z_j})}^2}} ,\textrm{ }(j = 1,2),$$
u1zh(x,y;x1,y1,z1;k) and u2zh(x,y;x2,y2,z2;k) are the complex amplitude distributions of the two waves on the image sensor plane, S(k) is the spectral intensity, i is the imaginary unit, ϕ is the phase distribution of the wave, and ϕpm(k) is the phase shift introduced by the phase modulator. Using Eqs. (8)–(11), I obtain
$$\begin{aligned} I(x,y) & = c\int_{ - \infty }^\infty {[|{u_1}({x_1},{y_1},{z_1};k){|^2} + |{u_2}({x_2},{y_2},{z_2};k){|^2} + 2|{u_1}({x_1},{y_1},{z_1};k)||{u_2}({x_2},{y_2},{z_2};k)|} \\ & {\times } \cos \{{k[{r_1}(x,y;{x_1},{y_1},{z_1}) - {r_2}(x,y;{x_2},{y_2},{z_2})] + {\phi_1}({x_1},{y_1},{z_1};k) - {\phi_2}({x_2},{y_2},{z_2};k) - {\phi_{pm}}(k)} \}]S(k)dk. \end{aligned}$$
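As a numerical check of the temporal-coherence discussion above, the short sketch below reproduces the quoted coherence length (3.11 µm) and the on-axis optical-path-length difference of the 4 mm birefringent lens (36.4 µm). The product form λ1λ2/Δλ for the coherence length is my assumption, chosen because it matches the quoted value.

```python
# Coherence-length and path-difference check for the values quoted in the text.
# Assumption: coherence length estimated as lc = lam1 * lam2 / d_lam, with
# lam1 and lam2 the band edges; this form reproduces the quoted 3.11 um.
lam_c = 451e-9            # central wavelength [m]
d_lam = 65e-9             # FWHM spectral bandwidth [m]
lam1 = lam_c - d_lam / 2  # lower band edge
lam2 = lam_c + d_lam / 2  # upper band edge
coherence_len = lam1 * lam2 / d_lam          # ~3.11e-6 m

# On-axis optical-path-length difference introduced by the birefringent lens.
thickness = 4e-3          # lens thickness [m]
n_o, n_e = 1.5462, 1.5553 # refractive indices for the orthogonal polarizations
opd = thickness * (n_e - n_o)                # ~36.4e-6 m
```

The path difference exceeds the coherence length by an order of magnitude, which is why the compensating plate is indispensable here.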


Fig. 2. Optical implementation of MPH for multiwavelength 3D imaging and identification of the varieties of light.



Fig. 3. Optical implementation of MPH. (a) Geometry of the optical system. (b) Transitions of the polarizations for the two waves.


Here, I examine the case where the optical path length of u1zh(x,y;x1,y1,z1;k) is the same as that of u2zh(x,y;x2,y2,z2;k) on the optical axis when a light point is placed on the optical axis. The optical-path-length difference between u1zh(x,y;x1,y1,z1;k) and u2zh(x,y;x2,y2,z2;k), which is generated by the center of the birefringent lens, is adjusted to satisfy this condition using the phase shift ϕp0(k) generated by the phase modulator and the birefringent plate. In this case, Eq. (12) is rewritten as

$$\begin{aligned} I(x,y) & = c\int_{ - \infty }^\infty {[|{u_1}({x_1},{y_1},{z_1};k){|^2} + |{u_2}({x_2},{y_2},{z_2};k){|^2} + 2|{u_1}({x_1},{y_1},{z_1};k)||{u_2}({x_2},{y_2},{z_2};k)|} \\ & {\times} \cos \{{k[{{r_1}(x,y;{x_1},{y_1},{z_1}) - {r_2}(x,y;{x_2},{y_2},{z_2})} ]+ k({{z_1} - {z_2}} )- {\phi_p}(k)} \}]S(k)dk, \end{aligned}$$
where ϕp(k) = ϕpm(k) – ϕp0(k) is an additional phase shift generated by the phase modulator. When using a liquid crystal with thickness δ and refractive index change Δn to introduce an additional phase shift, I have
$${\phi _p}(k) = k\delta \Delta n.$$

Equation (13) is rewritten as

$$\begin{aligned} I(x,y) & = c\int_{ - \infty }^\infty {\{ |{u_1}({x_1},{y_1},{z_1};k){|^2} + |{u_2}({x_2},{y_2},{z_2};k){|^2} + 2|{u_1}({x_1},{y_1},{z_1};k)||{u_2}({x_2},{y_2},{z_2};k)|} \\ & {\times} \cos \left[ {k\left\{ {[{{r_1}(x,y;{x_1},{y_1},{z_1}) - {r_2}(x,y;{x_2},{y_2},{z_2})} ]+ ({{z_1} - {z_2}} )- \frac{{{\phi_p}(k)}}{k}} \right\}} \right]\} S(k)dk. \end{aligned} $$
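To make the fringe model concrete, the sketch below evaluates the cosine term of Eq. (15) for a single monochromatic on-axis point source. The geometry values (source, lens, and sensor positions, focal lengths) are illustrative assumptions, not the experimental setup, and the plate shift zdj is omitted for brevity.

```python
import numpy as np

# Fringe pattern of one on-axis point source, following Eqs. (1)-(4) and (13).
# All geometry values below are illustrative assumptions.
lam = 530e-9
k = 2 * np.pi / lam
z_o, z_l, z_h = 0.0, 0.10, 0.30          # point source, lens, sensor [m]
f1, f2 = 0.060, 0.070                    # two focal lengths of the lens

a = z_l - z_o                            # Eq. (3)
b1 = a * f1 / (a - f1)                   # Eq. (4), j = 1
b2 = a * f2 / (a - f2)                   # Eq. (4), j = 2
z1, z2 = z_l + b1, z_l + b2              # axial positions of the two points

npix, pitch = 256, 2.2e-6                # sensor sampling (assumed)
x = (np.arange(npix) - npix // 2) * pitch
X, Y = np.meshgrid(x, x)
r1 = np.sqrt(X**2 + Y**2 + (z_h - z1) ** 2)   # Eq. (11) with x1 = y1 = 0
r2 = np.sqrt(X**2 + Y**2 + (z_h - z2) ** 2)

phi_p = 0.0                              # additional phase shift
I = 1.0 + np.cos(k * (r1 - r2) + k * (z1 - z2) - phi_p)  # unit amplitudes
```

The result is the familiar Fresnel-zone-like self-interference pattern whose curvature encodes the axial position of the point source.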

Here I consider the spectral intensity S(k) and the spectral bandwidth of the light. I assume the case where S(k) is a constant within the bandwidth, its full width is 2Δk, the central wavenumber is kc, and |u1(x1,y1,z1;k)| and |u2(x2,y2,z2;k)| are independent of k in the full width. Then,

$$\begin{aligned} I(x,y) & = c^{\prime}\int_{{k_c} - \Delta k}^{{k_c} + \Delta k} {\{ |{u_{1}}({x_1},{y_1},{z_1}){|^2} + |{u_2}({x_2},{y_2},{z_2}){|^2} + 2|{u_1}({x_1},{y_1},{z_1})||{u_2}({x_2},{y_2},{z_2})|} \\ & {\times} \cos \left[ {k\left\{ {[{{r_1}(x,y;{x_1},{y_1},{z_1}) - {r_2}(x,y;{x_2},{y_2},{z_2})} ]+ ({{z_1} - {z_2}} )- \frac{{{\phi_p}(k)}}{k}} \right\}} \right]\} dk \\ & = c^{\prime}{I_0}(x,y) + c^{\prime}q(x,y,\Delta k,{\phi _p}){u_{zh}}(x,y)\exp [{ - i{\phi_p}({k_c})} ]\textrm{ } + C.C., \end{aligned}$$
where
$$q(x,y,\Delta k,{\phi _p}) = \textrm{sinc}\left[ {\Delta k\left\{ {[{{r_1}(x,y;{x_1},{y_1},{z_1}) - {r_2}(x,y;{x_2},{y_2},{z_2})} ]+ ({{z_1} - {z_2}} )- \frac{{{\phi_p}({k_c})}}{{{k_c}}}} \right\}} \right]\textrm{ ,}$$
$${u_{zh}}(x,y) = {I_1}(x,y)\exp [{i{k_c}\{{[{{r_1}(x,y;{x_1},{y_1},{z_1}) - {r_2}(x,y;{x_2},{y_2},{z_2})} ]+ ({{z_1} - {z_2}} )} \}} ].$$
I0(x, y) and I1(x, y) are the sums of the zeroth-order diffraction waves and the amplitudes of the interference fringes within the spectral bandwidth on the image sensor plane, respectively, and C.C. is the complex conjugate of the second term on the right-hand side of Eq. (16). The sinc function in Eq. (17) indicates that the visibility of the interference fringes depends on Δk. The visibility is also dependent on ϕp(kc)/kc, and therefore, the visibility is controlled by the additional phase shift ϕp(k). In the case where S(k) is not constant within the bandwidth but follows a Gaussian function, I have
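The dependence of the visibility factor q in Eq. (17) on the additional phase shift can be sketched numerically. The spectral values below are taken from the earlier example; the multi-order shift is an assumed illustration.

```python
import numpy as np

# Visibility factor of Eq. (17) for zero geometric path mismatch:
# q = sinc(dk * phi_p / k_c), with dk the half-width in wavenumber.
lam_c = 451e-9
k_c = 2 * np.pi / lam_c
d_lam = 65e-9                          # FWHM in wavelength
dk = np.pi * d_lam / lam_c**2          # half-width in wavenumber (2*dk = FWHM)

def q(phi_p):
    # np.sinc(x) = sin(pi x) / (pi x), so divide the argument by pi.
    return np.sinc(dk * (phi_p / k_c) / np.pi)

vis_small = q(np.pi / 2)   # ordinary PSI step: visibility stays near 1
vis_large = q(40 * np.pi)  # multi-order shift: visibility strongly reduced
```

An ordinary quarter-wave PSI step leaves the fringes essentially intact, whereas a multi-order shift pushes the path mismatch toward the coherence length and suppresses them.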
$$S(k) = \frac{{\sqrt {\ln 2} }}{{\Delta k\sqrt \pi }}\textrm{exp}\left\{ { - {{\left[ {\frac{{2\sqrt {\ln 2} (k - {k_c})}}{{2\Delta k}}} \right]}^2}} \right\}\textrm{,}$$
where 2Δk indicates the FWHM of the light, centered at wavenumber kc. I(x,y) in this case is expressed as
$$\begin{aligned} I(x,y) & = c\int_{ - \infty }^\infty {\{ |{u_1}({x_1},{y_1},{z_1};k){|^2} + |{u_2}({x_2},{y_2},{z_2};k){|^2} + 2|{u_1}({x_1},{y_1},{z_1};k)||{u_2}({x_2},{y_2},{z_2};k)|} \\ & {\times}\textrm{ }\cos \left[ {k\left\{ {[{{r_1}(x,y;{x_1},{y_1},{z_1}) - {r_2}(x,y;{x_2},{y_2},{z_2})} ]+ ({{z_1} - {z_2}} )- \frac{{{\phi_p}(k)}}{k}} \right\}} \right]\} S(k)dk \\ & = c^{\prime}{I_0}(x,y) + c^{\prime}q^{\prime}(x,y,\Delta k,{\phi _p}){u_{zh}}(x,y)\exp [{ - i{\phi_p}({k_c})} ]\textrm{ } + C.C., \end{aligned}$$
where
$$\displaystyle q^{\prime}(x,y,\Delta k,{\phi _p}) = \exp \left( { - {{\left[ {\frac{{2\Delta k}}{{2\sqrt {\ln 2} }}\left\{ {[{{r_1}(x,y;{x_1},{y_1},{z_1}) - {r_2}(x,y;{x_2},{y_2},{z_2})} ]+ ({{z_1} - {z_2}} )- \frac{{{\phi_p}({k_c})}}{{{k_c}}}} \right\}} \right]}^2}} \right)\textrm{ }.$$

Equations (17) and (21) indicate that the dependences on Δk, r1, r2, and ϕp(kc) are unchanged when S(k) is changed. I(x, y) is an interference fringe image generated from a point u(xo, yo, zo) on an object, and therefore, the hologram of an object h(x, y, Δk:ϕp(kc)) is expressed by

$$\begin{aligned} h(x,y,\Delta k:{\phi _p}({k_c})) & = \int\!\!\!\int\!\!\!\int {I(x,y)} \textrm{ }d{x_o}d{y_o}d{z_o}\\ \\ & = c^{\prime}\int\!\!\!\int\!\!\!\int {{I_0}(x,y)} \textrm{ }d{x_o}d{y_o}d{z_o} + c^{\prime}\int\!\!\!\int\!\!\!\int {q(x,y,\Delta k,{\phi _p}({k_c}))} \textrm{ }{u_{zh}}(x,y)\exp [{ - i{\phi_p}({k_c})} ]\textrm{ }d{x_o}d{y_o}d{z_o} + C.C.\textrm{,} \\ & = c^{\prime}\int\!\!\!\int\!\!\!\int {{I_0}(x,y)} \textrm{ }d{x_o}d{y_o}d{z_o} + U(x,y)\exp [{ - i{\phi_p}({k_c})} ]+ C.C.\textrm{,} \end{aligned}$$
$$U(x,y) = c^{\prime}\int\!\!\!\int\!\!\!\int {q(x,y,\Delta k,{\phi _p}({k_c}))\textrm{ }{u_{zh}}(x,y)} \textrm{ }d{x_o}d{y_o}d{z_o},$$
where U(x, y) contains 3D information of the object, based on the theory of SIDH, and denotes an object wave on the image sensor plane. I(x, y) is a function of (x1, y1, z1) and (x2, y2, z2) as described in Eq. (8). Both (x1, y1, z1) and (x2, y2, z2) are functions of (xo, yo, zo) as described in Eqs. (1) and (2). Therefore, I(x, y) is a function of (xo, yo, zo). The second term is extracted by PSI. In multidimension-multiplexed imaging, holograms obtained with different wavelengths and varieties of light are incoherently superimposed on the image sensor plane. Therefore, the incoherent summation of M holograms is expressed as ${H_N}(x,y) = \sum\limits_{m = 1}^M {{h_m}(x,y,\Delta {k_m}:{\phi _{p(N - 1)}}({k_m}))}$, where N denotes the number of exposures used to record the multiplexed holograms with phase shifts. The N phase-shifted multiplexed holograms are expressed using the following matrix equation:
$${\mathbf H} = \mathbf{\Phi} {\mathbf U},$$
$${\mathbf H} = \left( \begin{array}{c} {H_1}(x,y)\\ {H_2}(x,y)\\ {H_3}(x,y)\\ \cdot \\ \cdot \\ \cdot \\ {H_N}(x,y) \end{array} \right),$$
$${\mathbf \Phi } = \left( {\begin{array}{cccccccc} 1&1&0&{}&{}&{}&1&0\\ 1&{\cos [{{\phi_{p1}}({k_1})} ]}&{ - \sin [{{\phi_{p1}}({k_1})} ]}&\cdot &\cdot &\cdot &{\cos [{{\phi_{p1}}({k_M})} ]}&{ - \sin [{{\phi_{p1}}({k_M})} ]}\\ 1&{\cos [{{\phi_{p2}}({k_1})} ]}&{ - \sin [{{\phi_{p2}}({k_1})} ]}&{}&{}&{}&{\cos [{{\phi_{p2}}({k_M})} ]}&{ - \sin [{{\phi_{p2}}({k_M})} ]}\\ {}&\cdot &{}&\cdot &{}&{}&{}&\cdot \\ {}&\cdot &{}&{}&\cdot &{}&{}&\cdot \\ {}&\cdot &{}&{}&{}&\cdot &{}&\cdot \\ 1&{\cos [{{\phi_{p(N - 1)}}({k_1})} ]}&{ - \sin [{{\phi_{p(N - 1)}}({k_1})} ]}&\cdot &\cdot &\cdot &{\cos [{{\phi_{p(N - 1)}}({k_M})} ]}&{ - \sin [{{\phi_{p(N - 1)}}({k_M})} ]} \end{array}} \right),$$
$${\mathbf U} = \left( \begin{array}{c} {c_0}^{\prime}\int\!\!\!\int\!\!\!\int {{I_0}(x,y)d{x_o}d{y_o}d{z_o}} \\ Re \left[ {{c_1}^{\prime}\int\!\!\!\int\!\!\!\int {{q_1}(x,y,\Delta {k_1},{\phi_p}({k_1})){u_{zh1}}(x,y)d{x_o}d{y_o}d{z_o}} } \right]\\ {\mathop{\rm Im}\nolimits} \left[ {{c_1}^{\prime}\int\!\!\!\int\!\!\!\int {{q_1}(x,y,\Delta {k_1},{\phi_p}({k_1})){u_{zh1}}(x,y)d{x_o}d{y_o}d{z_o}} } \right]\\ \cdot \\ \cdot \\ \cdot \\ Re \left[ {{c_M}^{\prime}\int\!\!\!\int\!\!\!\int {{q_M}(x,y,\Delta {k_M},{\phi_p}({k_M})){u_{zhM}}(x,y)d{x_o}d{y_o}d{z_o}} } \right]\\ {\mathop{\rm Im}\nolimits} \left[ {{c_M}^{\prime}\int\!\!\!\int\!\!\!\int {{q_M}(x,y,\Delta {k_M},{\phi_p}({k_M})){u_{zhM}}(x,y)d{x_o}d{y_o}d{z_o}} } \right] \end{array} \right).$$

The matrix U indicates object waves containing different spectral bandwidths and/or varieties of light. N ≥ 2M + 1 is generally required owing to the number of variables. From the matrix equation, I find that there are two approaches to derive object waves selectively. In the first approach, I use the inverse matrix of Φ when different phase shifts can be set for the respective object waves and a nonsingular matrix Φ is obtained. The respective object waves are derived using the following matrix equation:

$${\mathbf U} = {{\mathbf \Phi }^{ - 1}}{\mathbf H}\textrm{.}$$

This approach is effective when a common U is obtained for the phase-shifted holograms, that is, when q(x,y,Δk,ϕp) is approximately constant during the phase shifts. Increasing the number of phase shifts is effective in reducing the error, as established in PSI research.
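A minimal numerical sketch of this first approach follows, with M = 2 synthetic object waves, N = 7 exposures, and assumed per-exposure phase steps; the holograms are generated directly from the matrix model H = ΦU and the waves recovered by a least-squares inverse.

```python
import numpy as np

# Sketch of the first approach (Eq. (28)): solve H = Phi U for M = 2 object
# waves from N = 7 phase-shifted holograms. Phase steps and the synthetic
# waves are illustrative assumptions.
rng = np.random.default_rng(0)
M, N, npx = 2, 7, 64
steps = np.array([2 * np.pi / 5, np.pi / 2])   # per-exposure step at k_1, k_2
phi = np.outer(np.arange(N), steps)            # phi[n, m]: n-th shift at k_m

# Phase-shift matrix of Eq. (26): columns [1, cos, -sin, cos, -sin].
Phi = np.ones((N, 2 * M + 1))
for m in range(M):
    Phi[:, 1 + 2 * m] = np.cos(phi[:, m])
    Phi[:, 2 + 2 * m] = -np.sin(phi[:, m])

# Stack the unknowns of Eq. (27): bias, then Re/Im of each object wave.
U_true = (rng.standard_normal((M, npx * npx))
          + 1j * rng.standard_normal((M, npx * npx)))
bias = rng.random(npx * npx) + 3.0
U_vec = np.vstack([bias, U_true[0].real, U_true[0].imag,
                   U_true[1].real, U_true[1].imag])
H = Phi @ U_vec                                # N multiplexed holograms

# Eq. (28) via least squares (pinv also covers the non-square case N > 2M+1).
U_hat = np.linalg.pinv(Phi) @ H
U1 = U_hat[1] + 1j * U_hat[2]
U2 = U_hat[3] + 1j * U_hat[4]
```

Because the two step sizes correspond to distinct fringe "frequencies" across the exposures, Φ has full column rank and the two object waves separate exactly in this noise-free sketch.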

As Δk increases and large phase shifts are set, it becomes difficult to obtain a common U for the phase-shifted holograms. This situation does not arise in laser digital holography [20] but is a problem in incoherent digital holography with temporally incoherent light. However, another approach is provided by exploiting the change of q(x,y,Δk,ϕp) during the phase shifts, which arises from the temporal coherency difference between different varieties of light. I consider the case where illumination light transmitted from an object and self-luminous light generated from another object contain the same wavelengths and their spectral bandwidths partly overlap. I exploit the temporal coherency difference between the waves of these objects in this case. The phase shifts and amplitudes of the interference fringes for different varieties of light waves are closely related, as indicated by Eqs. (17) and (21)–(23). The temporal coherency of reflection/transmission light is controlled by preparing a light source such as an LED with a bandpass filter. On the other hand, self-luminous light such as fluorescence light has a wider wavelength band and weaker intensity than reflection/transmission light obtained with band-limited illumination light. q(x,y,Δk,ϕp) represents the attenuation of an object wave owing to Δk and ϕp based on temporal coherency. Different object waves have different coherence lengths, and these waves are separated using the temporal coherency gate and the signal processing of PSI. I assume the case where two object waves U1(x, y) and U2(x, y) with different spectral bandwidths 2Δk1 and 2Δk2, respectively, form self-interference holograms, with Δk2 much larger than Δk1. Initially, I introduce a large phase shift ϕpL so that q2(x, y, Δk2, ϕpL) becomes approximately zero on the image sensor plane. Interference fringes generated by U1(x, y) still appear owing to the difference in temporal coherency.
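The gating condition can be illustrated numerically: with an assumed narrowband (filtered-LED-like) and broadband (fluorescence-like) spectrum, a sufficiently large multi-order phase shift suppresses the broadband visibility while the narrowband visibility remains close to unity. The bandwidths and the shift below are assumptions.

```python
import numpy as np

# Temporal-coherence gate: visibility of Eq. (17) at zero path mismatch for a
# narrowband and a broadband wave under a large phase shift. Assumed values.
lam_c = 530e-9
k_c = 2 * np.pi / lam_c
dk_narrow = np.pi * 5e-9 / lam_c**2     # 5 nm FWHM, e.g. filtered LED light
dk_broad = np.pi * 100e-9 / lam_c**2    # 100 nm FWHM, e.g. fluorescence light

def q(dk, phi_p):
    # np.sinc(x) = sin(pi x) / (pi x)
    return np.sinc(dk * (phi_p / k_c) / np.pi)

phi_pL = 40 * np.pi                     # large multi-order phase shift
Gamma1 = q(dk_narrow, phi_pL)           # narrowband fringes survive
Gamma0 = q(dk_broad, phi_pL)            # broadband fringes nearly vanish
```

In other words, the phase modulator alone acts as a coherence gate: one shift value yields Γ1 close to 1 for the narrowband wave while driving the broadband visibility toward zero.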
For simplicity, I consider the case with M = 2 varieties, N = 8 exposures, ϕpj < 2π for the four holograms H1, H2, H3, and H4, ϕpL ≤ ϕpj < ϕpL + 2π for the four holograms H5, H6, H7, and H8, q1(x, y, Δk1, ϕp < 2π) = 1, q2(x, y, Δk2, ϕp < 2π) = Γ2 (Γ2 ≤ 1), q1(x, y, Δk1, ϕpL ≤ ϕp < ϕpL + 2π) = Γ1, and q2(x, y, Δk2, ϕpL ≤ ϕp) = Γ0 = 0. Then, Eqs. (25)–(27) are rewritten as

$${\mathbf H} = \left( \begin{array}{c} {H_1}(x,y)\\ {H_2}(x,y)\\ {H_3}(x,y)\\ {H_4}(x,y)\\ {H_5}(x,y)\\ {H_6}(x,y)\\ {H_7}(x,y)\\ {H_8}(x,y) \end{array} \right),$$
$${\mathbf \Phi } = \left( {\begin{array}{ccccc} 1&1&0&1&0\\ 1&{\cos [{{\phi_{p1}}({k_1})} ]}&{ - \sin [{{\phi_{p1}}({k_1})} ]}&{{\varGamma_2}\cos [{{\phi_{p1}}({k_2})} ]}&{ - {\varGamma_2}\sin [{{\phi_{p1}}({k_2})} ]}\\ 1&{\cos [{{\phi_{p2}}({k_1})} ]}&{ - \sin [{{\phi_{p2}}({k_1})} ]}&{{\varGamma_2}\cos [{{\phi_{p2}}({k_2})} ]}&{ - {\varGamma_2}\sin [{{\phi_{p2}}({k_2})} ]}\\ 1&{\cos [{{\phi_{p3}}({k_1})} ]}&{ - \sin [{{\phi_{p3}}({k_1})} ]}&{{\varGamma_2}\cos [{{\phi_{p3}}({k_2})} ]}&{ - {\varGamma_2}\sin [{{\phi_{p3}}({k_2})} ]}\\ 1&{{\varGamma_1}\cos [{{\phi_{p4}}({k_1})} ]}&{ - {\varGamma_1}\sin [{{\phi_{p4}}({k_1})} ]}&\textrm{0}&0\\ 1&{{\varGamma_1}\cos [{{\phi_{p5}}({k_1})} ]}&{ - {\varGamma_1}\sin [{{\phi_{p5}}({k_1})} ]}&0&0\\ 1&{{\varGamma_1}\cos [{{\phi_{p6}}({k_1})} ]}&{ - {\varGamma_1}\sin [{{\phi_{p6}}({k_1})} ]}&0&0\\ 1&{{\varGamma_1}\cos [{{\phi_{p7}}({k_1})} ]}&{ - {\varGamma_1}\sin [{{\phi_{p7}}({k_1})} ]}&0&0 \end{array}} \right),$$
$${\mathbf U} = \left( \begin{array}{c} {c_0}^{\prime}\int\!\!\!\int\!\!\!\int {{I_0}(x,y)d{x_o}d{y_o}d{z_o}} \\ Re \left[ {{c_1}^{\prime}{{\int\!\!\!\int\!\!\!\int {{u_{zh1}}(x,y)d{x_o}d{y_o}dz} }_o}} \right]\\ {\mathop{\rm Im}\nolimits} \left[ {{c_1}^{\prime}\int\!\!\!\int\!\!\!\int {{u_{zh1}}(x,y)d{x_o}d{y_o}d{z_o}} } \right]\\ Re \left[ {{c_2}^{\prime}\int\!\!\!\int\!\!\!\int {{u_{zh2}}(x,y)d{x_o}d{y_o}d{z_o}} } \right]\\ {\mathop{\rm Im}\nolimits} \left[ {{c_2}^{\prime}\int\!\!\!\int\!\!\!\int {{u_{zh2}}(x,y)d{x_o}d{y_o}d{z_o}} } \right] \end{array} \right).$$

An object wave U1(x, y) on the image sensor plane is derived by solving Eq. (24) with Eqs. (29)–(31):

$$\begin{aligned} {U_1}^{\prime}(x,y) & = {\varGamma _1}\exp ( - i{\phi _{pL}})\left\{ {Re \left[ {{c_1}^{\prime}\int\!\!\!\int\!\!\!\int {{u_{zh1}}(x,y)d{x_o}d{y_o}d{z_o}} } \right] + i{\mathop{\rm Im}\nolimits} \left[ {{c_1}^{\prime}\int\!\!\!\int\!\!\!\int {{u_{zh1}}(x,y)d{x_o}d{y_o}d{z_o}} } \right]} \right\} \\ & = {H_5}(x,y) - {H_7}(x,y) + i[{{H_6}(x,y) - {H_8}(x,y)} ] \\ & = H(x,y:{\phi _{pL}}) - H(x,y:{\phi _{pL}} + \pi ) + i[{H(x,y:{\phi_{pL}} + \pi /2) - H(x,y:{\phi_{pL}} + 3\pi /2)} ]. \end{aligned}$$
U1(x, y) is numerically obtained from U1’(x, y) because U1’(x, y) is modulated by the known large phase shift ϕpL:
$${U_1}(x,y) = \frac{{{U_1}^{\prime}(x,y)}}{{{\varGamma _1}}}\exp (i{\phi _{pL}}).$$
Γ1 is derived in advance from the average of ${q_1}(x,y,\Delta {k_1},{\phi _{pL}})$ on the image sensor plane for a point object placed at a 3D position. Otherwise, Γ1 is obtained from a test object in advance. Finally, U2(x, y) is derived using PSI and numerically computed from U1(x, y) as follows:
$$\begin{aligned} {U_2}(x,y) & = ({H_1}(x,y) - {H_3}(x,y) + i[{{H_2}(x,y) - {H_4}(x,y)} ]\\ & - [{\{{1 - \cos [{{\phi_{p2}}({{k_1}} )} ]} \}+ i\{{\cos [{{\phi_{p1}}({{k_1}} )} ]- \cos [{{\phi_{p3}}({{k_1}} )} ]} \}} ]Re [{U_1}(x,y)]\\ & - [{\sin [{{\phi_{p2}}({{k_1}} )} ]+ i\{{\sin [{{\phi_{p3}}({{k_1}} )} ]- \sin [{{\phi_{p1}}({{k_1}} )} ]} \}} ]{\mathop{\rm Im}\nolimits} [{U_1}(x,y)])[{1/(2{\varGamma_2})} ] \\ & = (H(x,y:{\phi _p}({k_2}) = 0) - H(x,y:{\phi _p}({k_2}) = \pi ) + i[{H(x,y:{\phi_p}({k_2}) = \pi /2) - H(x,y:{\phi_p}({k_2}) = 3\pi /2)} ] \\ & - [{\{{1 - \cos [{{\phi_{p2}}({{k_1}} )} ]} \}+ i\{{\cos [{{\phi_{p1}}({{k_1}} )} ]- \cos [{{\phi_{p3}}({{k_1}} )} ]} \}} ]Re [{U_1}(x,y)] \\ & - [{\sin [{{\phi_{p2}}({{k_1}} )} ]+ i\{{\sin [{{\phi_{p3}}({{k_1}} )} ]- \sin [{{\phi_{p1}}({{k_1}} )} ]} \}} ]{\mathop{\rm Im}\nolimits} [{U_1}(x,y)])[{1/(2{\varGamma_2})} ]. \end{aligned}$$

In the special case where ${\phi _p}({k_1})$ is approximately equal to ${\phi _p}({k_2})$ because the two waves have nearly the same peak wavelength, U1(x, y) + Γ2U2(x, y) is obtained by PSI. When using four-step PSI,

$$\begin{aligned} {U_2}(x,y) & = \frac{{{H_1}(x,y) - {H_3}(x,y) + i[{{H_2}(x,y) - {H_4}(x,y)} ]- {U_1}(x,y)}}{{\textrm{2}{\varGamma _2}}} \\ & = \frac{{H(x,y:{\phi _p} = 0) - H(x,y:{\phi _p} = \pi ) + i[{H(x,y:{\phi_p} = \pi /2) - H(x,y:{\phi_p} = 3\pi /2)} ]- {U_1}(x,y)}}{{\textrm{2}{\varGamma _2}}}. \end{aligned}$$
Γ2 can be ignored or set as 1 because it is regarded as a constant coefficient. As a result, I can distinguish the varieties of light even when fluorescence light with the same wavelengths forms its hologram simultaneously. After that, numerical calculations of wave propagation are applied to U1(x, y) and U2(x, y), and then focused images of the 3D objects are reconstructed. The matrix Φ is designed by modulating Γ and ${\phi _p}({k_m})$. This algorithm can be applied to the case of Γ0 ≠ 0 because the matrix equation is solved by introducing different values for Γ0, Γ1, and Γ2. These modulations are realized simply by exploiting the phase shifts even when M is increased.
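The separation procedure of Eqs. (32)–(35) can be sketched end to end with synthetic data. The hologram model, the Γ values, and the four-step shifts below are my assumptions (the factors of 2 in the extraction follow from the simulated fringe model h = I0 + Re[U e^{-iϕ}]), so this is an illustration of the algorithm's structure rather than the exact experimental pipeline.

```python
import numpy as np

# Sketch of Eqs. (32)-(35): separate narrowband U1 and broadband U2 from eight
# phase-shifted multiplexed holograms using the temporal-coherence gate.
# Synthetic waves; Gamma1, Gamma2, and the shift values are assumed.
rng = np.random.default_rng(1)
npx = 64
U1_true = rng.standard_normal((npx, npx)) + 1j * rng.standard_normal((npx, npx))
U2_true = rng.standard_normal((npx, npx)) + 1j * rng.standard_normal((npx, npx))
I0, Gamma1, Gamma2 = 5.0, 0.8, 0.9
phi_L = 12.3 * np.pi                    # large shift: q2 -> 0, q1 -> Gamma1
quarter = (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)

# H1..H4: ordinary quarter-period shifts; both waves contribute (q2 = Gamma2).
H = [I0 + np.real((U1_true + Gamma2 * U2_true) * np.exp(-1j * s))
     for s in quarter]
# H5..H8: shifts on top of phi_L; only U1 survives, attenuated to Gamma1.
H += [I0 + Gamma1 * np.real(U1_true * np.exp(-1j * (phi_L + s)))
      for s in quarter]

# Eq. (32): four-step PSI on H5..H8; Eq. (33): undo Gamma1 and phi_L.
U1p = H[4] - H[6] + 1j * (H[5] - H[7])
U1 = U1p / (2 * Gamma1) * np.exp(1j * phi_L)
# Eq. (35): four-step PSI on H1..H4, then remove the U1 contribution.
U2 = (H[0] - H[2] + 1j * (H[1] - H[3]) - 2 * U1) / (2 * Gamma2)
```

Both synthetic waves are recovered exactly in this noise-free sketch, showing how the large shift first isolates U1 and ordinary PSI then yields U2 by subtraction.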

3. Experiments

I conducted experiments to show the validity of MPH. Initially, I demonstrate wavelength-multiplexed 3D imaging of a reflective color 3D object using the portable hologram recorder based on MPH. Figure 4 shows photographs of the developed palm-sized hologram recorder and the experimental setup. The whole interferometer was set on a wood table to show its color 3D imaging ability without an antivibration table. I used red and green LEDs with nominal wavelengths of 625 and 530 nm, respectively, as light sources mounted in a four-wavelength LED head (LED4D201, Thorlabs). A monochrome image sensor with 2592 × 1944 pixels, a pixel pitch of 2.2 µm, and 12-bit resolution was used. The dynamic range of the image sensor was not wide enough to record a multiplexed incoherent digital hologram of an object with a complex structure, so in the first experiment the summation of 32 images was used as the phase-shifted multiplexed incoherent digital hologram at each phase shift. A miniature model of a canard was set as a colored 3D object. This object was illuminated by red and green LED light simultaneously, and its diffraction light was introduced to the hologram recorder. Birefringent crystal lenses and plates and polarizers were set to generate a wavelength-multiplexed self-interference hologram. A multi-order liquid crystal phase modulator (LC-PM) (LCC2415-VIS, Thorlabs) was adopted for MPH. The LC-PM was used to shift the phases of the self-interference holograms. Using the LC-PM, wavelength-dependent phase shifts were successfully generated, and large phase shifts could be introduced. The amount of phase modulation needed for a 2π phase shift was calibrated in advance for each wavelength band of each LED.


Fig. 4. Experimental setup. (a) Developed palm-sized multidimension-multiplexed hologram recorder based on MPH. (b) Optical setup for MPH.


Regular phase shifts of 2π/5 and π/2 were introduced per exposure to the self-interference light generated from the red and green LEDs, respectively. Seven wavelength-multiplexed phase-shifted self-interference holograms were recorded; Φ was therefore a 7 × 5 matrix. A color 3D image of the object was reconstructed from the recorded holograms by MPH. Figure 5 shows the experimental results. Figures 5(a) and 5(b) are photographs taken with the red and green LEDs, respectively, and Fig. 5(c) is the color-synthesized image. Figure 5(d) shows one of the recorded wavelength-multiplexed self-interference holograms; the 3D and wavelength information of the object was recorded simultaneously. Figures 5(e) and 5(f) are the red- and green-channel images reconstructed by MPH, and Fig. 5(g) is the color-synthesized image. Figures 5(e)–5(g) indicate that focused images of the object were obtained in the different wavelength bands and that its wavelength characteristics were also successfully reconstructed. The recorder was then moved close to the object, so that high-spatial-frequency information of the object was recorded in the self-interference wavelength-multiplexed hologram shown in Fig. 5(h). The resolution was improved using such holograms, as shown in Fig. 5(i).
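After decoding, each complex hologram is refocused by numerical diffraction. A sketch of one standard choice, the angular-spectrum method, with illustrative parameters (530 nm wavelength, 2.2 µm pitch, 155 mm distance, mirroring but not reproducing the experiment; the placeholder field stands in for a decoded hologram):

```python
import numpy as np

def angular_spectrum(u, wavelength, pitch, z):
    """Propagate the complex field u by distance z using the
    angular-spectrum method; evanescent components are suppressed."""
    ny, nx = u.shape
    fx = np.fft.fftfreq(nx, d=pitch)
    fy = np.fft.fftfreq(ny, d=pitch)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2      # squared z-frequency
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    transfer = np.exp(1j * kz * z) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(u) * transfer)

# Refocus a decoded (here: placeholder) complex hologram by 155 mm.
u = np.ones((256, 256), dtype=complex)
focused = angular_spectrum(u, 530e-9, 2.2e-6, 0.155)
```

Sweeping `z` and inspecting the sharpness of `np.abs(focused)` is one way to locate the in-focus depth of each wavelength channel.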


Fig. 5. Experimental results for wavelength-multiplexed 3D imaging with the developed hologram recorder. Photographs of the colored object taken with (a) red LED and (b) green LED. (c) Color-synthesized image of (a) and (b). (d) One of the recorded holograms. Reconstructed images of the object illuminated by (e) red LED and (f) green LED. (g) Color-synthesized image of (e) and (f). The numerical propagation distance used to obtain the image of (g) was 155 mm. (h) One of the recorded holograms after moving the hologram recorder close to the object. (i) Color-synthesized image obtained from (h) and other recorded holograms. The numerical propagation distance used to obtain the image of (i) was 81 mm.


I conducted another experiment to show simultaneous 3D sensing of self-luminous light and the diffracted light of illumination light. Figure 6(a) illustrates the experimental setup. In this experiment, I examined the case where the wavelength bands of the two varieties of light overlap. A self-luminous object was illuminated by an ultraviolet LED with a central wavelength of 365 nm to generate blue fluorescence light, and a 1 mm aperture, used as the other object, was illuminated with a blue LED to obtain blue transmitted light. A block of tin halide perovskite nanocrystal containing metal complex molecules served as the blue luminescent material. The spectral intensity distribution of its fluorescence, measured in advance with a spectrometer, is shown in Fig. 6(b). The peak wavelength was 451 nm. The FWHM of the luminescence was 65 nm, and the luminescence ranged from 401 to 521 nm, the wavelengths at which the intensity fell to one-tenth of that at 451 nm, giving a luminescence wavelength width of 120 nm. A blue LED with a nominal wavelength of 455 nm, mounted in the four-wavelength LED head (LED4D201, Thorlabs), was used as the illumination source. The peak wavelength of the LED was between 450 and 455 nm, and its FWHM was 18 nm. A bandpass filter with a transmission band of 446–468 nm was inserted between the blue LED and the aperture to improve the temporal coherence of the illumination light. The two varieties of light thus contained the same wavelengths: the wavelength band of the LED light was fully overlapped by that of the fluorescence light, and the difference between their peak wavelengths was within 5 nm. The self-interference multiplexed hologram shown in Fig. 6(c) was recorded, and multiple phase shifts and recordings were repeated to obtain 3D information on each variety of light and to distinguish them.

Equations (32), (33), and (35) were used to distinguish the varieties of light because their peak wavelengths were approximately equal; accordingly, Φ was an 8 × 5 matrix. In this experiment, one exposure was taken per phase shift, and eight exposures were sufficient to perform MPH. Γ0, Γ1, Γ2, and ϕpL were set to 0, 0.65, 1, and 15.4π for λ = 445 nm, respectively. Figures 6(d)–6(g) show the experimental results. The 3D information of each variety of light was retrieved with the commonly used four-step PSI, as shown in Figs. 6(d) and 6(e); however, the different varieties of light could not be detected separately. In contrast, both 3D imaging and selective imaging of the varieties of light were performed using MPH, as shown in Figs. 6(f) and 6(g). Thus, MPH was experimentally demonstrated and successfully validated. In this experiment, the self-luminous object and the aperture were separated in both the in-plane and depth directions. However, even when two objects overlap in the in-plane direction, the varieties of light diffracted from them can in principle be detected selectively; in practice, the signal-to-noise ratio and dynamic range of the image sensor must be taken into account in such a case.
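The separation step amounts to simple arithmetic on the eight holograms. The sketch below follows the structure of Eqs. (32)–(35) but fixes its own sign and factor-of-two conventions through the Φ it assumes (Γ0 = 0; all names and values illustrative), so it is a consistency demonstration rather than the paper's exact formulas:

```python
import numpy as np

def separate_two_lights(H, gamma1, gamma2, phi_pL):
    """Separate two varieties of light from eight phase-shifted holograms.
    H[0..3]: shifts 0, pi/2, pi, 3pi/2 applied to both wavelength bands;
    H[4..7]: large shifts phi_pL + {0, pi/2, pi, 3pi/2}, at which the
    broadband (fluorescence) fringe contrast is assumed to vanish."""
    # Band 1 from the large-shift quadruplet (cf. Eqs. (32) and (33)).
    U1p = H[4] - H[6] + 1j * (H[5] - H[7])
    U1 = U1p * np.exp(1j * phi_pL) / (2.0 * gamma1)
    # Band 2 after removing the band-1 contribution (cf. Eq. (35)).
    U2 = (H[0] - H[2] + 1j * (H[1] - H[3]) - 2.0 * U1) / (2.0 * gamma2)
    return U1, U2
```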


Fig. 6. Experimental results for light-multiplexed 3D imaging with the developed hologram recorder. (a) Schematic of experimental setup. (b) Spectral intensity of the self-luminous object. (c) One of the recorded holograms. Left and right Gabor zone plate patterns were generated with LED light and fluorescence light, respectively. (d), (e) Images reconstructed by commonly applied four-step PSI. The depth difference between (d) and (e) corresponded to 35 mm in the object plane. (f) LED light and (g) fluorescence light images of the objects. The numerical propagation distances of (f) and (g) were the same as those of (d) and (e), respectively.


4. Discussions and conclusions

Multiwavelength 3D imaging and simultaneous 3D sensing of different varieties of light containing the same wavelengths have been experimentally demonstrated. MPH is also applicable to sensing other physical quantities of light by modifying the optical implementation. For simultaneous polarimetric imaging with MPH, a Sagnac interferometer with polarization-dependent phase shifts is a candidate implementation; polarimetric information is acquired when a common-path or two-arm self-interference holography system with a polarimetric phase shifter is used. Varieties of light are also selectively identified when their wavelengths or polarization states differ. Furthermore, simultaneous quantitative phase imaging will be achieved by adopting self-reference holography [35] for MPH: quantitative phase information can be obtained using an interferometer with LED light, so a fluorescence 3D image and a quantitative phase image could be obtained simultaneously with a single monochrome image sensor. Construction of the optical systems described above is left for future work.

Various types of multiwavelength SIDH techniques have been proposed to date: the use of Fourier spectroscopy with a Michelson [14,15,36] or Mach-Zehnder [37] interferometer; the use of multiple wavelength filters with sequential, separate recording of holograms in the respective wavelength bands [38–40]; the use of the differences between the PSFs in the respective wavelength bands [17,31,41]; and the use of a color image sensor with a two-arm [13] or single-path [42–44] interferometer. Hyperspectral holographic imaging has been achieved with Fourier spectroscopy [14,15,36,37]; however, a large number of multiplexed-hologram recordings has been required even when the number of wavelength bands is small. Temporally divided recording of wavelength information [31,38–40] is a straightforward method, and overlapping wavelength bands pose no problem; on the other hand, changing wavelength filters costs time. A diffractive lens such as a Fresnel phase lens is wavelength dependent, so wavelength filters have been required to avoid the generation of multi-order diffracted light. To solve this problem, simultaneous recording of multiple-wavelength information has been performed using the differences between the PSFs of a diffractive lens in the respective wavelength bands [17,31,41]; however, PSF libraries in 3D space are required for each wavelength band to separate the wavelengths. The use of a color image sensor is another straightforward method with which multiple-wavelength information is recorded simultaneously: full-color holographic 3D imaging of natural light [13] and full-color 3D motion-picture recording with white light [44] have been performed. However, the recordable space-bandwidth product is decreased by a color image sensor containing a color-filter array, so the resolution and/or the field of view is reduced in principle.
MPH can perform two-wavelength-band holographic imaging of incoherent light with fewer recordings than SIDH with Fourier spectroscopy; five exposures are sufficient in principle for two-color incoherent holographic imaging. Sequential changes of wavelength filters are not required, acquisition of PSF libraries in 3D space is not required, and the recordable space-bandwidth product is not sacrificed to record multiwavelength information. Moreover, for conventional SIDH techniques it is difficult to identify the varieties of light when the wavelength bands of illumination light and self-luminous or natural light severely overlap. MPH enables detection of the varieties of light without active modulation of the illumination light even when the wavelength bands overlap and the peak wavelengths are close to each other; the peak-wavelength difference was within 5 nm in the experimental demonstration. Indeed, the varieties of light can in principle be separated even when their peak wavelengths are identical.

MPH with the presented optical implementation requires multiple exposures. Five exposures are sufficient in each experiment under the condition that a common U is obtained for the phase-shifted holograms, because there are five unknowns in each hologram under that condition. In the same manner, N = 2M + 1 exposures is the theoretical minimum for M multiplexed holograms. However, there were error sources in PSI with incoherent light, and the numbers of phase shifts were set to seven and eight in the respective experiments to suppress the error. The error sources are considered to be related to slight changes of $q(x,y,\Delta k,{\phi _p})$ during the phase shifts: in the theory, Δn for each wavelength band 2Δk is assumed to be constant, so the dispersion of the LC-PM is considered an error source for PSI. Increasing the number of phase shifts was effective in suppressing the PSI error; on the other hand, the temporal information capacity is decreased to suppress the error. Detailed analyses of the noise sources and modifications of the optical setup to accelerate the measurement will be the next research themes.
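The exposure-count trade-off can be made concrete: 2M + 1 unknowns set the theoretical minimum, and redundant shifts turn the decoding into an overdetermined least-squares problem whose conditioning can be inspected directly. The shift sequences and function names below are illustrative:

```python
import numpy as np

def min_exposures(M):
    """Theoretical minimum N for M multiplexed complex holograms:
    2M real unknowns plus one shared bias term."""
    return 2 * M + 1

def encoding_matrix(shift_sequences):
    """Build Phi from per-band phase-shift sequences (one row per exposure)."""
    N = len(shift_sequences[0])
    cols = [np.ones(N)]
    for s in shift_sequences:
        s = np.asarray(s, dtype=float)
        cols += [np.cos(s), np.sin(s)]
    return np.column_stack(cols)

# Conditioning of Phi for the minimal and for redundant exposure counts
# (illustrative increments of 2*pi/5 and pi/2 for the two bands).
for N in (min_exposures(2), 7, 8):
    n = np.arange(N)
    Phi = encoding_matrix([2 * np.pi / 5 * n, np.pi / 2 * n])
    print(N, np.linalg.cond(Phi))
```

A better-conditioned Φ makes the per-pixel inversion less sensitive to the PSI error sources discussed above, at the cost of temporal information capacity.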

I have proposed multidimension-multiplexed holography exploiting encoding with phase shifts, termed MPH, to obtain multiple physical quantities and varieties of light. SIDH and PSI are used to implement MPH. A palm-sized multidimension-multiplexed hologram recorder based on MPH was constructed, and experimental demonstrations were performed with it. MPH enables the detection of 3D information on multiple physical quantities and varieties of light without an optical filter, and it will be useful for advanced analyses of specimens in microscopy, for holographic machine vision, and for healthcare in our daily lives.

Funding

Mitsubishi Foundation (202111007); Precursory Research for Embryonic Science and Technology (JPMJPR16P8); Japan Society for the Promotion of Science (JP18H01456); Cooperative Research Program of “Network Joint Research Center for Materials and Devices” (20211086).

Acknowledgments

The author sincerely thanks the reviewers, whose careful reading and kind advice improved the manuscript. The author also thanks Prof. Ayumi Ishii for providing the metal complex molecules of tin halide perovskite nanocrystal and Prof. Yuichi Kozawa for helpful discussions.

Disclosures

The author declares no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the author upon reasonable request.

References

1. S. Kawata, O. Nakamura, T. Noda, H. Ooki, K. Ogino, Y. Kuroiwa, and S. Minami, “Laser computed-tomography microscope,” Appl. Opt. 29(26), 3805–3809 (1990). [CrossRef]  

2. Y. K. Park, G. Popescu, K. Badizadegan, R. R. Dasari, and M. S. Feld, “Diffraction phase and fluorescence microscopy,” Opt. Express 14(18), 8263–8268 (2006). [CrossRef]  

3. N. Pavillon, A. Benke, D. Boss, C. Moratal, J. Kühn, P. Jourdain, C. Depeursinge, P. J. Magistretti, and P. Marquet, “Cell morphology and intracellular ionic homeostasis explored with a multimodal approach combining epifluorescence and digital holographic microscopy,” J. Biophotonics 3(7), 432–436 (2010). [CrossRef]  

4. A. Vijayakumar, Y. Kashter, R. Kelner, and J. Rosen, “Coded aperture correlation holography–a new type of incoherent digital holograms,” Opt. Express 24(11), 12430–12441 (2016). [CrossRef]  

5. J. Wu, H. Zhang, W. Zhang, G. Jin, L. Cao, and G. Barbastathis, “Single-shot lensless imaging with Fresnel zone aperture and incoherent illumination,” Light: Sci. Appl. 9(1), 53 (2020). [CrossRef]  

6. K. R. Lee and Y. K. Park, “Exploiting the speckle-correlation scattering matrix for a compact reference-free holographic image sensor,” Nat. Commun. 7(1), 13359 (2016). [CrossRef]  

7. N. Antipa, G. Kuo, R. Heckel, B. Mildenhall, E. Bostan, R. Ng, and L. Waller, “DiffuserCam: lensless single-exposure 3D imaging,” Optica 5(1), 1–9 (2018). [CrossRef]  

8. J. H. Bruning, D. R. Herriott, J. E. Gallagher, D. P. Rosenfeld, A. D. White, and D. J. Brangaccio, “Digital wavefront measuring interferometer for testing optical surfaces and lenses,” Appl. Opt. 13(11), 2693–2703 (1974). [CrossRef]  

9. M. Takeda, H. Ina, and S. Kobayashi, “Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry,” J. Opt. Soc. Am. 72(1), 156–160 (1982). [CrossRef]  

10. M. K. Kim, ed., Digital Holographic Microscopy: Principles, Techniques, and Applications (Springer, 2011).

11. P. Picart and J.-C. Li, eds., Digital Holography (Wiley, 2013).

12. T.-C. Poon and J.-P. Liu, eds., Introduction to Modern Digital Holography with MATLAB (Cambridge University, 2014).

13. M. K. Kim, “Full color natural light holographic camera,” Opt. Express 21(8), 9636–9642 (2013). [CrossRef]  

14. K. Yoshimori, “Interferometric spectral imaging for three-dimensional objects illuminated by a natural light source,” J. Opt. Soc. Am. A 18(4), 765–770 (2001). [CrossRef]  

15. M. Liebel, N. Pazos-Perez, N. F. van Hulst, and R. A. Alvarez-Puebla, “Surface-enhanced Raman scattering holography,” Nat. Nanotechnol. 15(12), 1005–1011 (2020). [CrossRef]  

16. J. Kühn, T. Colomb, F. Montfort, F. Charrière, Y. Emery, E. Cuche, P. Marquet, and C. Depeursinge, “Real-time dual-wavelength digital holographic microscopy with a single hologram acquisition,” Opt. Express 15(12), 7231–7242 (2007). [CrossRef]  

17. A. Vijayakumar and J. Rosen, “Spectrum and space resolved 4D imaging by coded aperture correlation holography (COACH) with diffractive objective lens,” Opt. Lett. 42(5), 947–950 (2017). [CrossRef]  

18. S. K. Sahoo, D. Tang, and C. Dang, “Single-shot multispectral imaging with a monochromatic camera,” Optica 4(10), 1209–1213 (2017). [CrossRef]  

19. X. Li, J. A. Greenberg, and M. E. Gehm, “Single-shot multispectral imaging through a thin scatterer,” Optica 6(7), 864–871 (2019). [CrossRef]  

20. T. Tahara, R. Mori, S. Kikunaga, Y. Arai, and Y. Takaki, “Dual-wavelength phase-shifting digital holography selectively extracting wavelength information from wavelength-multiplexed holograms,” Opt. Lett. 40(12), 2810–2813 (2015). [CrossRef]  

21. Y. Ohtsuka and K. Oka, “Contour mapping of the spatiotemporal state of polarization of light,” Appl. Opt. 33(13), 2633–2636 (1994). [CrossRef]  

22. M. Takeda and M. Kito, “Spatiotemporal frequency multiplex heterodyne interferometry,” J. Opt. Soc. Am. A 9(9), 1607–1614 (1992). [CrossRef]  

23. E. Shaffer, N. Pavillon, and C. Depeursinge, “Single-shot, simultaneous incoherent and holographic microscopy,” J. Microscopy 245(1), 49–62 (2012). [CrossRef]  

24. G. Sirat and D. Psaltis, “Conoscopic holography,” Opt. Lett. 10(1), 4–6 (1985). [CrossRef]  

25. L. Mugnier and G. Sirat, “On-axis conoscopic holography without a conjugate image,” Opt. Lett. 17(4), 294–296 (1992). [CrossRef]  

26. M. Takeda, W. Wang, Z. Duan, and Y. Miyamoto, “Coherence holography,” Opt. Express 13(23), 9629–9635 (2005). [CrossRef]  

27. J. Rosen and G. Brooker, “Digital spatially incoherent Fresnel holography,” Opt. Lett. 32(8), 912–914 (2007). [CrossRef]  

28. R. Kelner, B. Katz, and J. Rosen, “Optical sectioning using a digital Fresnel incoherent-holography-based confocal imaging system,” Optica 1(2), 70–74 (2014). [CrossRef]  

29. J.-P. Liu, T. Tahara, Y. Hayasaki, and T.-C. Poon, “Incoherent digital holography: a review,” Appl. Sci. 8(1), 143 (2018). [CrossRef]  

30. T. Nobukawa, Y. Katano, T. Muroi, N. Kinoshita, and N. Ishii, “Bimodal incoherent digital holography for both three-dimensional imaging and quasi-infinite-depth-of-field imaging,” Sci. Rep. 9(1), 3363 (2019). [CrossRef]  

31. J. Rosen, A. Vijayakumar, M. Kumar, M. R. Rai, R. Kelner, Y. Kashter, A. Bulbul, and S. Mukherjee, “Recent advances in self-interference incoherent digital holography,” Adv. Opt. Photonics 11(1), 1–66 (2019). [CrossRef]  

32. J. Rosen, S. Alford, A. Vijayakumar, et al., “Roadmap on recent progress in FINCH technology,” J. Imaging 7(10), 197 (2021). [CrossRef]  

33. T. Tahara, “Review of incoherent digital holography: applications to multidimensional incoherent digital holographic microscopy and palm-sized digital holographic recorder – Holosensor,” Front. Photonics 2, 829139 (2022). [CrossRef]  

34. T. Tahara, Y. Zhang, J. Rosen, A. Vijayakumar, L. Cao, J. Wu, T. Koujin, A. Matsuda, A. Ishii, Y. Kozawa, R. Okamoto, R. Oi, T. Nobukawa, K. Choi, M. Imbe, and T.-C. Poon, “Roadmap of incoherent digital holography,” Appl. Phys. B (submitted).

35. T. Tahara, Y. Kozawa, and R. Oi, “Single-path single-shot phase-shifting digital holographic microscopy without a laser light source,” Opt. Express 30(2), 1182–1194 (2022). [CrossRef]  

36. S. Teeranutranont and K. Yoshimori, “Digital holographic three-dimensional imaging spectrometry,” Appl. Opt. 52(1), A388–A396 (2013). [CrossRef]  

37. D. N. Naik, G. Pedrini, M. Takeda, and W. Osten, “Spectrally resolved incoherent holography: 3D spatial and spectral imaging using a Mach-Zehnder radial-shearing interferometer,” Opt. Lett. 39(7), 1857–1860 (2014). [CrossRef]  

38. J. Rosen and G. Brooker, “Fluorescence incoherent color holography,” Opt. Express 15(5), 2244–2250 (2007). [CrossRef]  

39. Y. Wan, T. Man, and D. Wang, “Incoherent off-axis Fourier triangular color holography,” Opt. Express 22(7), 8565–8573 (2014). [CrossRef]  

40. C. M. Nguyen, D. Muhammad, and H.-S. Kwon, “Spatially incoherent common-path off-axis color digital holography,” Appl. Opt. 57(6), 1504–1509 (2018). [CrossRef]  

41. V. Anand, S. H. Ng, J. Maksimovic, D. Linklater, T. Katkus, E. P. Ivanova, and S. Juodkazis, “Single shot multispectral multidimensional imaging using chaotic waves,” Sci. Rep. 10(1), 13902 (2020). [CrossRef]  

42. K. Choi, J. Yum, and S.-W. Min, “Achromatic phase shifting self-interference incoherent digital holography using linear polarizer and geometric phase lens,” Opt. Express 26(13), 16212–16225 (2018). [CrossRef]  

43. T. Tahara and I. Sato, “Single-shot color digital holographic microscopy with white light,” in Proceedings of 3D Image Conference 2019 (2019), pp. 1–4 (in Japanese).

44. T. Tahara, T. Ito, Y. Ichihashi, and R. Oi, “Single-shot incoherent color digital holographic microscopy system with static polarization-sensitive optical elements,” J. Opt. 22(10), 105702 (2020). [CrossRef]  




Figures (6)

Fig. 1.
Fig. 1. Basic concept of MPH. (a) Schematic of recording multidimensional information. (b) Flow of image-reconstruction procedures.
Fig. 2.
Fig. 2. Optical implementation of MPH for multiwavelength 3D imaging and identification of the varieties of light.
Fig. 3.
Fig. 3. Optical implementation of MPH. (a) Geometry of the optical system. (b) Transitions of the polarizations for the two waves.

Equations (35)


(1) \(u_{t1}(x_1, y_1, z_{t1}, k) = c_1\, u_o\!\left(\frac{b_1}{a} x_o, \frac{b_1}{a} y_o, a + b_1, k\right)\)

(2) \(u_{t2}(x_2, y_2, z_{t2}, k) = c_2\, u_o\!\left(\frac{b_2}{a} x_o, \frac{b_2}{a} y_o, a + b_2, k\right)\)

(3) \(a = z_l - z_o\)

(4) \(b_j = \dfrac{a f_j}{a - f_j} \quad (j = 1, 2)\)

(5) \(u_1(x_1, y_1, z_1, k) = u_{t1}(x_1, y_1, z_{t1} + z_{d1}, k)\)

(6) \(u_2(x_2, y_2, z_2, k) = u_{t2}(x_2, y_2, z_{t2} + z_{d2}, k)\)

(7) \(z_{dj} = d\,(1 - 1/n_j)\)

(8) \(I(x, y) = c \int \left| u_{1zh}(x, y; x_1, y_1, z_1; k) + u_{2zh}(x, y; x_2, y_2, z_2; k) \right|^2 S(k)\, dk\)

(9) \(u_{1zh}(x, y; x_1, y_1, z_1; k) = u_1(x_1, y_1, z_1; k) \exp[i k r_1(x, y; x_1, y_1, z_1)] = |u_1(x_1, y_1, z_1; k)| \exp[i \phi_1(x_1, y_1, z_1; k)] \exp\!\left[i k \sqrt{(x - x_1)^2 + (y - y_1)^2 + (z_h - z_1)^2}\right]\)

(10) \(u_{2zh}(x, y; x_2, y_2, z_2; k) = u_2(x_2, y_2, z_2; k) \exp[i k r_2(x, y; x_2, y_2, z_2) + i \phi_{pm}(k)] = |u_2(x_2, y_2, z_2; k)| \exp[i \phi_2(x_2, y_2, z_2; k)] \exp\!\left[i k \sqrt{(x - x_2)^2 + (y - y_2)^2 + (z_h - z_2)^2} + i \phi_{pm}(k)\right]\)

(11) \(r_j(x, y; x_j, y_j, z_j) = \sqrt{(x - x_j)^2 + (y - y_j)^2 + (z_h - z_j)^2} \quad (j = 1, 2)\)

(12) \(I(x, y) = c \int \big[ |u_1|^2 + |u_2|^2 + 2 |u_1||u_2| \cos\{ k [r_1(x, y; x_1, y_1, z_1) - r_2(x, y; x_2, y_2, z_2)] + \phi_1(x_1, y_1, z_1; k) - \phi_2(x_2, y_2, z_2; k) - \phi_{pm}(k) \} \big] S(k)\, dk\)

(13) \(I(x, y) = c \int \big[ |u_1|^2 + |u_2|^2 + 2 |u_1||u_2| \cos\{ k [r_1 - r_2] + k (z_1 - z_2) - \phi_p(k) \} \big] S(k)\, dk\)

(14) \(\phi_p(k) = k \delta \Delta n\)

(15) \(I(x, y) = c \int \big[ |u_1|^2 + |u_2|^2 + 2 |u_1||u_2| \cos\!\big[ k \{ [r_1 - r_2] + (z_1 - z_2) - \phi_p(k)/k \} \big] \big] S(k)\, dk\)

(16) \(I(x, y) = c \int_{k_c - \Delta k}^{k_c + \Delta k} \big[ |u_1|^2 + |u_2|^2 + 2 |u_1||u_2| \cos\!\big[ k \{ [r_1 - r_2] + (z_1 - z_2) - \phi_p(k)/k \} \big] \big]\, dk = c I_0(x, y) + c\, q(x, y, \Delta k, \phi_p)\, u_{zh}(x, y) \exp[-i \phi_p(k_c)] + \mathrm{C.C.}\)

(17) \(q(x, y, \Delta k, \phi_p) = \mathrm{sinc}\!\left[ \Delta k \left\{ [r_1 - r_2] + (z_1 - z_2) - \frac{\phi_p(k_c)}{k_c} \right\} \right]\)

(18) \(u_{zh}(x, y) = I_1(x, y) \exp\!\big[ i k_c \{ [r_1 - r_2] + (z_1 - z_2) \} \big]\)

(19) \(S(k) = \frac{\sqrt{\ln 2}}{\Delta k \sqrt{\pi}} \exp\!\left\{ -\left[ \frac{2 \sqrt{\ln 2}\,(k - k_c)}{2 \Delta k} \right]^2 \right\}\)

(20) \(I(x, y) = c \int \big[ |u_1|^2 + |u_2|^2 + 2 |u_1||u_2| \cos\!\big[ k \{ [r_1 - r_2] + (z_1 - z_2) - \phi_p(k)/k \} \big] \big] S(k)\, dk = c I_0(x, y) + c\, q(x, y, \Delta k, \phi_p)\, u_{zh}(x, y) \exp[-i \phi_p(k_c)] + \mathrm{C.C.}\)

(21) \(q(x, y, \Delta k, \phi_p) = \exp\!\left( -\left[ \frac{\Delta k}{2 \sqrt{\ln 2}} \left\{ [r_1 - r_2] + (z_1 - z_2) - \frac{\phi_p(k_c)}{k_c} \right\} \right]^2 \right)\)

(22) \(h(x, y, \Delta k : \phi_p(k_c)) = \iiint I(x, y)\, dx_o\, dy_o\, dz_o = c \iiint I_0(x, y)\, dx_o\, dy_o\, dz_o + c \iiint q(x, y, \Delta k, \phi_p(k_c))\, u_{zh}(x, y) \exp[-i \phi_p(k_c)]\, dx_o\, dy_o\, dz_o + \mathrm{C.C.} = c \iiint I_0(x, y)\, dx_o\, dy_o\, dz_o + U(x, y) \exp[-i \phi_p(k_c)] + \mathrm{C.C.}\)

(23) \(U(x, y) = c \iiint q(x, y, \Delta k, \phi_p(k_c))\, u_{zh}(x, y)\, dx_o\, dy_o\, dz_o\)

(24) \(\boldsymbol{H} = \boldsymbol{\Phi} \boldsymbol{U}\)

(25) \(\boldsymbol{H} = \big( H_1(x, y)\ \ H_2(x, y)\ \ H_3(x, y)\ \cdots\ H_N(x, y) \big)^{\mathrm{T}}\)

(26) \(\boldsymbol{\Phi} = \begin{pmatrix} 1 & 1 & 0 & \cdots & 1 & 0 \\ 1 & \cos[\phi_{p1}(k_1)] & \sin[\phi_{p1}(k_1)] & \cdots & \cos[\phi_{p1}(k_M)] & \sin[\phi_{p1}(k_M)] \\ 1 & \cos[\phi_{p2}(k_1)] & \sin[\phi_{p2}(k_1)] & \cdots & \cos[\phi_{p2}(k_M)] & \sin[\phi_{p2}(k_M)] \\ \vdots & \vdots & \vdots & & \vdots & \vdots \\ 1 & \cos[\phi_{p(N-1)}(k_1)] & \sin[\phi_{p(N-1)}(k_1)] & \cdots & \cos[\phi_{p(N-1)}(k_M)] & \sin[\phi_{p(N-1)}(k_M)] \end{pmatrix}\)

(27) \(\boldsymbol{U} = \Big( c_0 \iiint I_0(x, y)\, dx_o dy_o dz_o,\ \mathrm{Re}\big[ c_1 \iiint q_1(x, y, \Delta k_1, \phi_p(k_1))\, u_{zh1}(x, y)\, dx_o dy_o dz_o \big],\ \mathrm{Im}\big[ c_1 \iiint q_1 u_{zh1}\, dx_o dy_o dz_o \big],\ \cdots,\ \mathrm{Re}\big[ c_M \iiint q_M(x, y, \Delta k_M, \phi_p(k_M))\, u_{zhM}(x, y)\, dx_o dy_o dz_o \big],\ \mathrm{Im}\big[ c_M \iiint q_M u_{zhM}\, dx_o dy_o dz_o \big] \Big)^{\mathrm{T}}\)

(28) \(\boldsymbol{U} = \boldsymbol{\Phi}^{-1} \boldsymbol{H}\)

(29) \(\boldsymbol{H} = \big( H_1(x, y)\ \ H_2(x, y)\ \ H_3(x, y)\ \ H_4(x, y)\ \ H_5(x, y)\ \ H_6(x, y)\ \ H_7(x, y)\ \ H_8(x, y) \big)^{\mathrm{T}}\)

(30) \(\boldsymbol{\Phi} = \begin{pmatrix} 1 & 1 & 0 & 1 & 0 \\ 1 & \cos[\phi_{p1}(k_1)] & \sin[\phi_{p1}(k_1)] & \Gamma_2 \cos[\phi_{p1}(k_2)] & \Gamma_2 \sin[\phi_{p1}(k_2)] \\ 1 & \cos[\phi_{p2}(k_1)] & \sin[\phi_{p2}(k_1)] & \Gamma_2 \cos[\phi_{p2}(k_2)] & \Gamma_2 \sin[\phi_{p2}(k_2)] \\ 1 & \cos[\phi_{p3}(k_1)] & \sin[\phi_{p3}(k_1)] & \Gamma_2 \cos[\phi_{p3}(k_2)] & \Gamma_2 \sin[\phi_{p3}(k_2)] \\ 1 & \Gamma_1 \cos[\phi_{p4}(k_1)] & \Gamma_1 \sin[\phi_{p4}(k_1)] & 0 & 0 \\ 1 & \Gamma_1 \cos[\phi_{p5}(k_1)] & \Gamma_1 \sin[\phi_{p5}(k_1)] & 0 & 0 \\ 1 & \Gamma_1 \cos[\phi_{p6}(k_1)] & \Gamma_1 \sin[\phi_{p6}(k_1)] & 0 & 0 \\ 1 & \Gamma_1 \cos[\phi_{p7}(k_1)] & \Gamma_1 \sin[\phi_{p7}(k_1)] & 0 & 0 \end{pmatrix}\)

(31) \(\boldsymbol{U} = \Big( c_0 \iiint I_0(x, y)\, dx_o dy_o dz_o,\ \mathrm{Re}\big[ c_1 \iiint u_{zh1}(x, y)\, dx_o dy_o dz_o \big],\ \mathrm{Im}\big[ c_1 \iiint u_{zh1}(x, y)\, dx_o dy_o dz_o \big],\ \mathrm{Re}\big[ c_2 \iiint u_{zh2}(x, y)\, dx_o dy_o dz_o \big],\ \mathrm{Im}\big[ c_2 \iiint u_{zh2}(x, y)\, dx_o dy_o dz_o \big] \Big)^{\mathrm{T}}\)

(32) \(U_1'(x, y) = 2 \Gamma_1 \exp(-i \phi_{pL}) \left\{ \mathrm{Re}\!\left[ c_1 \iiint u_{zh1}(x, y)\, dx_o dy_o dz_o \right] + i\, \mathrm{Im}\!\left[ c_1 \iiint u_{zh1}(x, y)\, dx_o dy_o dz_o \right] \right\} = H_5(x, y) - H_7(x, y) + i [H_6(x, y) - H_8(x, y)] = H(x, y : \phi_{pL}) - H(x, y : \phi_{pL} + \pi) + i [H(x, y : \phi_{pL} + \pi/2) - H(x, y : \phi_{pL} + 3\pi/2)]\)

(33) \(U_1(x, y) = \dfrac{U_1'(x, y)}{2 \Gamma_1 \exp(-i \phi_{pL})}\)

(34) \(U_2(x, y) = \Big( H_1 - H_3 + i [H_2 - H_4] - \big[ \{ 1 - \cos[\phi_{p2}(k_1)] \} + i \{ \cos[\phi_{p1}(k_1)] - \cos[\phi_{p3}(k_1)] \} \big] \mathrm{Re}[U_1(x, y)] + \big[ \sin[\phi_{p2}(k_1)] + i \{ \sin[\phi_{p3}(k_1)] - \sin[\phi_{p1}(k_1)] \} \big] \mathrm{Im}[U_1(x, y)] \Big) \dfrac{1}{2 \Gamma_2} = \Big( H(x, y : \phi_p(k_2) = 0) - H(x, y : \phi_p(k_2) = \pi) + i [H(x, y : \phi_p(k_2) = \pi/2) - H(x, y : \phi_p(k_2) = 3\pi/2)] - \big[ \{ 1 - \cos[\phi_{p2}(k_1)] \} + i \{ \cos[\phi_{p1}(k_1)] - \cos[\phi_{p3}(k_1)] \} \big] \mathrm{Re}[U_1(x, y)] + \big[ \sin[\phi_{p2}(k_1)] + i \{ \sin[\phi_{p3}(k_1)] - \sin[\phi_{p1}(k_1)] \} \big] \mathrm{Im}[U_1(x, y)] \Big) \dfrac{1}{2 \Gamma_2}\)

(35) \(U_2(x, y) = \dfrac{H_1(x, y) - H_3(x, y) + i [H_2(x, y) - H_4(x, y)] - 2 U_1(x, y)}{2 \Gamma_2} = \dfrac{H(x, y : \phi_p = 0) - H(x, y : \phi_p = \pi) + i [H(x, y : \phi_p = \pi/2) - H(x, y : \phi_p = 3\pi/2)] - 2 U_1(x, y)}{2 \Gamma_2}\)