
Quantitative real-time phase microscopy for extended depth-of-field imaging based on the 3D single-shot differential phase contrast (ssDPC) imaging method

Open Access

Abstract

Optical diffraction tomography (ODT) is a promising label-free imaging method capable of quantitatively measuring the three-dimensional (3D) refractive index distribution of transparent samples. In recent years, partially coherent ODT (PC-ODT) has attracted increasing attention due to its system simplicity and absence of laser speckle noise. Quantitative phase imaging (QPI) technologies represented by Fourier ptychographic microscopy (FPM), differential phase contrast (DPC) imaging and intensity diffraction tomography (IDT) need to collect several or even hundreds of intensity images, which usually introduces motion artifacts when imaging fast-moving targets, leading to a decrease in image quality. Hence, quantitative real-time phase microscopy (qRPM) for extended depth of field (DOF) imaging based on a 3D single-shot differential phase contrast (ssDPC) imaging method is proposed in this study. qRPM incorporates a microlens array (MLA) to simultaneously collect spatial and angular information. In subsequent optical information processing, a deconvolution method is used to obtain intensity stacks under different illumination angles from a raw light field image. Importing the obtained intensity stacks into the 3D DPC imaging model finally yields the 3D refractive index distribution. The captured four-dimensional light field information enables the reconstruction of 3D information in a single snapshot and extends the DOF of qRPM. The imaging capability of the proposed qRPM system is experimentally verified on different samples, achieving single-exposure 3D label-free imaging with an extended DOF of 160 µm, nearly 30 times that of a traditional microscope system.

© 2024 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

A monochromatic coherent light field can be described by a two-dimensional complex amplitude function, which contains two components: amplitude and phase [1]. The high temporal frequency of visible light and the limited bandwidth of detectors make it impossible to record phase information directly [2]. Phase information plays an essential role in fields such as optical metrology, adaptive optics, biomedical sciences and materials science [3]. The cytoplasm of biological cells and most organelles absorb very weakly, resulting in low-contrast images under ordinary brightfield illumination conditions. To address this issue, researchers have invented various exogenous labeling methods. The most common method using exogenous contrast is fluorescence microscopy [4,5]. However, such methods require fluorescent dyes or fluorescent proteins as biomarkers, so they are not suitable for samples that are non-fluorescent or cannot easily be fluorescently tagged. In addition, photobleaching and phototoxicity of fluorescent agents can prevent long-term imaging of live cells [6]. Accordingly, researchers began to explore label-free imaging methods that reflect the structure of the sample through the phase delay it imposes on transmitted light. The earliest label-free imaging method was the phase-contrast microscope invented by Zernike [7]. After that, differential interference contrast (DIC) microscopy was developed [8], but these imaging methods can only observe phase information qualitatively rather than quantitatively.

Quantitative phase imaging (QPI) is a label-free imaging method that can obtain the refractive index information of the target, thus providing high imaging contrast [9,10]. There are two implementation approaches to QPI: interferometric methods [11,12] and non-interferometric methods [13–15]. By introducing the principle of interferometry, the phase and amplitude information of the object can be calculated simultaneously from the recorded interferogram. In order to obtain the three-dimensional (3D) refractive index of thick samples, a tomographic method is required: the two-dimensional complex amplitudes are synthesized in the 3D Fourier domain by changing the illumination angle or rotating the sample [16]. Optical diffraction tomography (ODT) is the most representative interference-based 3D QPI technique [17]. In recent years, ODT has developed greatly. The first line of development concerns illumination angle scanning methods, such as scanning with rotating wedge prisms [18], robotic arm scanning [19], spatial light modulator scanning [16], and galvanometer scanning [20]. Among them, galvanometers can achieve high-speed and stable scanning, making them the most commonly used illumination scanning method. The second concerns continuous improvements of the light field acquisition method. In 2016, Momey et al. proposed a lensless diffraction tomography technique [19], which expanded the field of view (FOV) of imaging to the order of millimeters. Off-axis holographic microscopy [21] and common-path holographic microscopy [22] have also been proposed recently. Besides, traditional ODT uses a holographic recording method under a coherent light source, which is easily affected by environmental vibration and laser speckle. Moreover, angular multiplexing leads to spectral aliasing, which limits the effective numerical aperture of the imaging system.

The sensitivity of ODT to the measurement environment and the speckle noise from coherent light sources can limit its applicability in certain scenarios. To address these limitations of ODT, intensity diffraction tomography (IDT) has been invented and developed in recent years, which can recover the 3D refractive index of a sample using only intensity information based on a non-interferometric setup [23]. In 2016, Michael et al. derived a linear 3D differential phase contrast model, which recovered the 3D refractive index distribution of the sample under asymmetric illumination conditions but required intensity stacks obtained by mechanical scanning [24]. Soto et al. extended the phase transfer function from paraxial to non-paraxial conditions and applied it to diffraction tomography [25]. Since the performance of a tomography system is determined by the phase transfer function, which is governed by the system illumination distribution, optimal illumination modes under asymmetric [26] and symmetric illumination conditions [27,28] have been proposed by analyzing the phase transfer function under different illumination conditions, achieving phase reconstruction with high signal-to-noise ratio and high sensitivity. To further improve the robustness and stability of the system, a non-mechanical method that only modifies the illumination angle has been developed [29], using the LEDs of a rectangular LED array to achieve illumination scanning; however, it requires a huge data set and tens of seconds of imaging time.

To achieve 3D dynamic imaging of living biological samples, enhancing imaging speed has become a key research focus in QPI. Wavelength multiplexing is the most commonly used method to improve imaging speed [30–32], but it introduces additional chromatic aberrations and asymmetric coverage of Fourier space. Moreover, imperfect filtering of the RGB Bayer filter in cameras and the wide bandwidth of RGB light sources can result in color crosstalk, degrading the quantitative ability of phase retrieval. The use of high-speed scanning elements to reduce the acquisition time of intensity images can also enhance imaging speed, but it requires expensive hardware [33,34]. Fourier ptychographic microscopy (FPM) and IDT usually require multi-angle illumination; turning on multiple LEDs at the same time, following the idea of multiplexing, can reduce the amount of data that needs to be collected by an order of magnitude without losing resolution [35–37]. Sun et al. introduced a phase-to-intensity conversion mechanism in FPM, requiring only sparse illumination on a ring that matches the numerical aperture of the objective to achieve complete Fourier spectrum coverage. This concept is applicable not only to FPM but also to Fourier ptychographic diffraction tomography (FPDT) and IDT, and it significantly improves imaging speed [38,39]. However, it still requires the acquisition of multiple intensity images to reconstruct one frame of the 3D refractive index distribution. The commonality among the above methods is the requirement of multi-frame acquisition to achieve 3D refractive index tomography of static targets. Obviously, these methods are not suitable for dynamic targets; attempting to achieve 3D refractive index tomography of dynamic targets by sacrificing image sensor sampling frequency will result in dynamic artifacts.

The limited depth of field (DOF) restricts the application of microscopes in large-scale imaging and 3D reconstruction. The most common solution for DOF extension is multiplane imaging, which allows a wider DOF and 3D information reconstruction without sacrificing axial resolution. However, it requires mechanical scanning of the sample or movement of the objective lens along the axial direction, significantly reducing system stability and imaging speed [40–42]. Subsequently, methods using non-mechanical devices to enhance imaging speed have been proposed, such as electrowetting lenses that modulate lens curvature through electric field manipulation [43,44] and liquid crystal lenses that change focal length by altering the refractive index [45–47]. However, these methods often fail to maintain a constant magnification during the zooming process. Aperture coding and wavefront coding can also achieve an extended DOF by modulating the light field through intensity or phase masks placed in the aperture of the optical system [48–51]. However, due to the high degree of information multiplexing, these approaches are susceptible to external noise and reconstruction noise [52,53]. Another approach introduces diffractive optical elements, or uses them instead of traditional lenses, to achieve an extended DOF [52]. Illumination from different angles leads to lateral shifts in the image; by establishing the relationship between lateral shift, defocus distance, and illumination angle, an extended DOF can be realized through digital refocusing based on a light transport model [54,55]. Recently, a method has been proposed for extending the DOF in holographic imaging and phase recovery by using a convolutional recurrent neural network (RNN), significantly improving inference speed [56,57]. Partially coherent illumination is widely used in QPI, but the above-mentioned DOF-extension methods are rarely applicable to partially coherent quantitative phase imaging (PC-QPI).

Based on the above literature research and analysis, 3D information is crucial in microscopy as it provides the depth and spatial structure of the sample, which is essential for understanding and analyzing complex organisms, materials, and other specimens. Yet the temporal resolution limitations associated with scanning hinder its application in different fields. In order to realize single-shot phase tomography, the main problem to be solved is how to obtain multi-angle illumination images in a single shot. Multi-angle illumination contains both phase contrast information and 3D information of the object. Hence, in the proposed method, a microlens array (MLA) is incorporated at a plane conjugate to the back focal plane of the objective lens [58]; the MLA segments the wavefront, transmits the spatial frequency (angular) information to the image sensor plane behind each lenslet, and forms images at the corresponding locations. The captured raw light field is reconstructed into a four-directional-illumination light field (FDI-LF), and intensity stacks at different illumination angles are obtained through deconvolution, which improves the slicing ability of the system and achieves an extended DOF. Spirogyra and Paramecia are used in the experimental study to validate the 3D label-free imaging capabilities of the proposed quantitative real-time phase microscopy (qRPM) system.

2. 3D single-shot differential phase contrast imaging method

QPI usually relies on highly coherent light sources (e.g., lasers) and relatively complex interferometric optical paths. In contrast, partially coherent illumination can achieve twice the lateral resolution and avoids laser speckle noise. Phase contrast can be generated by using an asymmetric illumination pattern [59], and only two images under complementary asymmetric illumination conditions are needed to obtain the phase gradient of the sample in DPC imaging. In order to reduce the missing frequencies caused by single-axis illumination, two axis-symmetric illumination directions are used in the proposed method, and the phase gradient can be calculated by Eq. (1) [60],

$${I_{DPC,n}} = \frac{{{I_{{D_{n,1}}}} - {I_{{D_{n,2}}}}}}{{{I_{{D_{n,1}}}} + {I_{{D_{n,2}}}}}},$$
where $n = 1,2$, $\{{{D_{1,1}},{D_{1,2}}} \}= \{{up,down} \}$, $\{{{D_{2,1}},{D_{2,2}}} \}= \{{left,right} \}$, and ${I_{DPC,n}}$ represents the DPC images along the two axes. Traditional DPC imaging obtains asymmetric illumination by controlling the illumination pattern [13], which requires at least four snapshots of the image sensor; this approach is not suitable for high-speed phase imaging. As shown in Fig. 1(a), an MLA is placed at the conjugate plane of the back focal plane of the objective lens in the proposed method, transmitting the angular and spatial information to the image sensor simultaneously. With this system configuration, the spatial and angular components can be recorded simultaneously in an uncompromised, alias-free manner. As shown in Fig. 1(b), each lenslet together with the objective lens constitutes a virtual microscopic imaging system that acquires an intensity image at a different illumination angle, and each elemental imaging system is linear and shift invariant, which allows computing depth slices directly with a deconvolution method that provides uniform resolution throughout the system DOF. The intensity difference calculated from images at complementary illumination angles is then used to generate a DPC image correlated with the phase gradient. The spectrum of the DPC image is deconvolved with the phase transfer function to obtain the quantitative phase of the sample, with a Tikhonov regularization parameter introduced to avoid singularities.
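Eq. (1) amounts to a pixel-wise normalized difference of two images captured under complementary half-space illumination. A minimal NumPy sketch (function and variable names are ours; a small `eps` guards the division):

```python
import numpy as np

def dpc_image(I_a, I_b, eps=1e-12):
    """Normalized DPC image from two intensity images captured under
    complementary half-space illumination, Eq. (1):
    (I_a - I_b) / (I_a + I_b)."""
    I_a = np.asarray(I_a, dtype=float)
    I_b = np.asarray(I_b, dtype=float)
    return (I_a - I_b) / (I_a + I_b + eps)

# Toy example: a pure-phase feature produces antisymmetric contrast,
# so the DPC signal changes sign while the intensity sum stays constant.
I_up   = np.array([[1.0, 1.2], [0.8, 1.0]])
I_down = np.array([[1.0, 0.8], [1.2, 1.0]])
I_dpc = dpc_image(I_up, I_down)
```

The background (flat) regions map to zero, so the DPC image directly encodes the phase gradient along the illumination split axis.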


Fig. 1. (a) A schematic of the experiment setup for quantitative real-time phase microscopy (qRPM). (b) Each lenslet and objective lens constitute a virtual microscopic imaging system to obtain intensity images at different illumination angles.


The complex amplitude of a target that satisfies the weak object approximation (WOA) condition can be expressed as $o(\mathbf{r} )\textrm{ = }\textrm{exp} [ - \mu (\mathbf{r}) + \textrm{i}\varphi (\mathbf{r})] \approx 1 - \mu (\mathbf{r} )+ i\varphi (\mathbf{r} )$, where $\mu$ is the absorption of the sample, $\varphi$ is the phase of the sample, $\mathbf{r} = ({x,y} )$ are the space coordinates. When $o(\mathbf{r} )$ is illuminated by illumination $E(\mathbf{r} )$ (the Fourier transform pair of $E(\mathbf{u} )$) and passed through the low-pass filtering of the microscope imaging system, the intensity distribution on the image sensor is [2]:

$$I(\mathbf{r} )\textrm{ = }{\int\!\!\!\int {|{{\mathrm{{\cal F}}^{ - 1}}\{{\mathrm{{\cal F}}[{E(\mathbf{r} )o(\mathbf{r} )} ]\cdot P(\mathbf{u} )} \}} |} ^2}{d^2}\mathbf{u},$$
where $\mathrm{{\cal F}}$ and ${\mathrm{{\cal F}}^{ - 1}}$ represent the Fourier transform and inverse Fourier transform, respectively, $P(\mathbf{u} )$ is the pupil function, and $\mathbf{u}$ are the spatial frequency coordinates. Under the WOA, the intensity spectrum distribution at the image sensor can be written as [13,61,62]:
$$\mathrm{{\cal F}}[{I(\mathbf{r} )} ]\textrm{ = }{I_0}\mathrm{{\cal F}}[{\delta (\mathbf{r} )} ]+ {H_{abs}}(\mathbf{u} )\mathrm{{\cal F}}[{\mu (\mathbf{r} )} ]+ {H_{phi}}(\mathbf{u} )\mathrm{{\cal F}}[{\varphi (\mathbf{r} )} ],$$
where ${H_{abs}}(\mathbf{u} )$ and ${H_{phi}}(\mathbf{u} )$ are the absorption transfer function and phase transfer function, respectively. The transfer function of the system can be described by the transmission cross coefficient (TCC) model proposed by Hopkins [63,64]. After introducing the WOA, the detected intensity has a linear relationship with phase and absorption. The linear part of the TCC is the weak object transfer function (WOTF), $WOTF = \int\!\!\!\int {E({\mathbf{u^{\prime}}} )P({\mathbf{u^{\prime} + u}} ){P^ \ast }({\mathbf{u^{\prime}}} )} {d^2}\mathbf{u^{\prime}}$, where $\mathbf{u^{\prime}}$ is the coordinate of the light source. Then the phase transfer function can be expressed as [13,65]:
$${H_{phi}} = WOT{F^\ast }(\mathbf{u} )- WOTF({ - \mathbf{u}} ).$$
To quantitatively recover the phase, deconvolution is performed with ${I_{DPC,n}}$ and the corresponding $H_{phi}^{DPC}(\mathbf{u} )$. The absorption in ${I_{DPC,n}}$ is assumed to be 0, and $H_{phi}^{DPC}(\mathbf{u} )= {{{H_{phi}}(\mathbf{u} )} / {{I_0}}}$, where ${I_0}$ is the background term.
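On a discrete frequency grid with a real source and pupil, the WOTF integral reduces to a cross-correlation that can be evaluated with FFTs. The sketch below is our own illustrative code (not the authors' implementation); it also checks the property motivating DPC: a source symmetric about the optical axis yields zero phase contrast in Eq. (4), so an asymmetric (half) source is required.

```python
import numpy as np

def wotf(source, pupil):
    """WOTF(u) = sum_{u'} E(u') P(u'+u) P*(u'), computed as a circular
    cross-correlation via FFTs (zero frequency at index [0, 0])."""
    g = source * np.conj(pupil)
    return np.fft.ifft2(np.conj(np.fft.fft2(g)) * np.fft.fft2(pupil))

def phase_otf(W):
    """H_phi(u) = WOTF*(u) - WOTF(-u), Eq. (4); WOTF(-u) is obtained by
    index reversal consistent with FFT frequency ordering."""
    W_neg = np.roll(W[::-1, ::-1], 1, axis=(0, 1))
    return np.conj(W) - W_neg

# Sanity check on a binary circular pupil of radius 0.2 (normalized freq.)
n = 64
fx = np.fft.fftfreq(n)[:, None]
fy = np.fft.fftfreq(n)[None, :]
pupil = ((fx**2 + fy**2) <= 0.2**2).astype(float)
W_sym = wotf(pupil, pupil)          # symmetric source matching the pupil
W_dpc = wotf(pupil * (fy > 0), pupil)  # upper-half source -> DPC contrast
```

For `W_sym` the phase transfer function vanishes identically, while the half-source `W_dpc` produces a nonzero, antisymmetric `phase_otf`.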

In order to obtain the 3D refractive index distribution, the stack of ${I_{DPC,n}}$ must be obtained first. An $N \times N$ (lenslet number) square MLA is used, since symmetric light field information needs to be acquired. The collected raw light field $LF({x,y,\upsilon ,\rho } )$, where $({\upsilon ,\rho } )$ are the angular coordinates of the light field, is reconstructed into the FDI-LF,

$$\left\{ {\begin{array}{{c}} {L{F_{up}}({x,y} )= \sum\limits_{\upsilon = 1}^N {\sum\limits_{\rho = 1}^{{N / 2}} {LF({x,y,\upsilon ,\rho } )} } }\\ {L{F_{down}}({x,y} )= \sum\limits_{\upsilon = 1}^N {\sum\limits_{\rho = {N / 2}}^N {LF({x,y,\upsilon ,\rho } )} } }\\ {L{F_{left}}({x,y} )= \sum\limits_{\upsilon = 1}^{{N / 2}} {\sum\limits_{\rho = 1}^N {LF({x,y,\upsilon ,\rho } )} } }\\ {L{F_{right}}({x,y} )= \sum\limits_{\upsilon = {N / 2}}^N {\sum\limits_{\rho = 1}^N {LF({x,y,\upsilon ,\rho } )} } } \end{array}}, \right.$$
where $L{F_\kappa }({x,y} )$ ($\kappa = \{{up,down,left,right} \}$) is the light field under four illumination directions.

According to Eq. (5), the FDI-LF can be obtained from a single-shot light field measurement, which is equivalent to the light fields collected in a wide-field microscope by illuminating the sample with partially coherent illumination from different directions.
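Eq. (5) amounts to summing the angular samples over half of the $(\upsilon ,\rho )$ domain. A minimal NumPy sketch (function name is ours; we use disjoint halves rather than the inclusive bounds of Eq. (5), a choice that avoids double-counting the central row and column, and assume an even lenslet count N):

```python
import numpy as np

def fdi_lf(LF):
    """Split a 4D light field LF[x, y, v, rho] into the four directional
    views of Eq. (5) by summing over half of the angular domain."""
    N = LF.shape[2]
    h = N // 2
    return {
        "up":    LF[:, :, :, :h].sum(axis=(2, 3)),
        "down":  LF[:, :, :, h:].sum(axis=(2, 3)),
        "left":  LF[:, :, :h, :].sum(axis=(2, 3)),
        "right": LF[:, :, h:, :].sum(axis=(2, 3)),
    }

# Demo on a small synthetic light field (4x4 spatial, 4x4 angular)
LF = np.arange(4 * 4 * 4 * 4, dtype=float).reshape(4, 4, 4, 4)
views = fdi_lf(LF)
```

With disjoint halves, each complementary pair of views sums to the full angular integral, i.e. the brightfield image.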

The elemental image recorded by the image sensor can be viewed as the convolution of the object with the 3D point spread function (PSF) [66]. Therefore, deconvolution based on the Richardson-Lucy iteration scheme is performed on $L{F_\kappa }({x,y} )$ with the corresponding 3D PSF to obtain the intensity stacks ${I_{{D_{n,1}}}}$ and ${I_{{D_{n,2}}}}$ under asymmetric illumination [67,68]. The absorption distribution under symmetric illumination conditions can be obtained by adding ${I_{{D_{n,1}}}}$ and ${I_{{D_{n,2}}}}$. In the proposed system, the 3D PSF is determined by [69]:

$$\begin{array}{l} {U_{sens}}({{x_s},{y_s};\chi } )= \\ {\mathrm{{\cal F}}^{ - 1}}\left\{ {\mathrm{{\cal F}}\left\{ {{U_{AS}}\left( {\frac{{{x_{mla}}}}{{{M_{relay}}}},\frac{{{y_{mla}}}}{{{M_{relay}}}};\chi } \right) \cdot T({{x_{mla}},{y_{mla}}} )} \right\} \cdot \textrm{exp} \left( {ik \cdot {f_{ml}}\sqrt {1 - {{({\lambda {f_X}} )}^2} - {{({\lambda {f_Y}} )}^2}} } \right)} \right\} \end{array},$$
where ${U_{sens}}({{x_s},{y_s};\chi } )$ is the field distribution on the image sensor of the object point $\chi$ on the object surface, T is the transmittance function of the MLA [70], ${U_{AS}}$ is the wavefront distribution at the back focal plane of the objective, and ${f_{ml}}$ is the focal length of the MLA. ${M_{relay}}$ is the relay magnification factor, ${M_{relay}} = {{{f_{fl}}} / {{f_{tube}}}}$, where ${f_{fl}}$ is the focal length of the Fourier lens (FL) and ${f_{tube}}$ is the focal length of the tube lens (TL). $({{x_s},{y_s}} )$ are the image sensor plane coordinates, $({{x_{mla}},{y_{mla}}} )$ are the MLA coordinates, $({{f_X},{f_Y}} )$ are the spatial frequencies at the image sensor plane [69], and $\lambda$ is the wavelength.
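The Richardson-Lucy scheme used to recover the intensity stacks can be sketched in 2D as follows; the paper applies it with the 3D PSF of Eq. (6), while this minimal version (our own code) assumes a known, shift-invariant 2D PSF and noiseless data:

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, n_iter=30, eps=1e-12):
    """Plain Richardson-Lucy iteration:
    o_{k+1} = o_k * [ (image / (o_k conv psf)) corr psf ]."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1, ::-1]          # correlation = conv with mirror
    estimate = np.full_like(image, image.mean())
    for _ in range(n_iter):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = image / (blurred + eps)   # eps guards empty regions
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Demo: blur a point source with a Gaussian PSF, then deconvolve.
x = np.zeros((32, 32)); x[16, 16] = 1.0
yy, xx = np.mgrid[-3:4, -3:4]
psf = np.exp(-(xx**2 + yy**2) / 4.0)
blurred = fftconvolve(x, psf / psf.sum(), mode="same")
restored = richardson_lucy(blurred, psf, n_iter=50)
```

The multiplicative update preserves non-negativity of the estimate, which is appropriate for intensity data.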

$H_{phi}^{DPC}(\mathbf{u} )$ establishes the relationship between the DPC image and the quantitative phase of the sample, characterizing the process by which the quantitative phase is converted into an intensity image. Accordingly, the quantitative phase of the sample can be obtained by a deconvolution process. Substituting the intensity distributions ${I_{{D_{n,1}}}}$ and ${I_{{D_{n,2}}}}$ under asymmetric illumination into Eq. (1), a set of two-axis DPC images ${I_{DPC,n}}$ can be obtained. Deconvolving the DPC images with the phase transfer functions in the Fourier domain, which is equivalent to Tikhonov regularization, the phase of the object can be expressed as [13,71]:

$${\varphi _{\textrm{tik}}}(\mathbf{r} )= {\mathrm{{\cal F}}^{ - 1}}\left\{ {\frac{{\sum\nolimits_j {H_{phi,j}^{DPC\ast }(\mathbf{u} )\cdot \mathrm{{\cal F}}[{{I_{DPC,n,j}}(\mathbf{r} )} ]} }}{{\sum\nolimits_j {{{|{H_{phi,j}^{DPC}(\mathbf{u} )} |}^2}} + \alpha }}} \right\},$$
where $\alpha$ is the regularization parameter, used to suppress the reconstruction error caused by excessive amplification of extremely small values of the phase transfer function during the inversion, and j is the index of the DPC measurements (left-right and up-down). The 3D phase distribution is then obtained by applying the process described in Eq. (7) to each group of intensities in the obtained intensity stack ${I_{DPC,n}}$.
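The Tikhonov inversion of Eq. (7) is a one-step Fourier-domain combination of the two-axis DPC measurements. A minimal sketch (our own code and names; the demo uses synthetic antisymmetric, purely imaginary transfer functions, a property real DPC phase transfer functions share):

```python
import numpy as np

def tikhonov_phase(dpc_images, H_list, alpha=1e-3):
    """Tikhonov-regularized DPC inversion, Eq. (7): combine DPC images
    with their phase transfer functions in the Fourier domain."""
    num = np.zeros_like(H_list[0])
    den = np.zeros(H_list[0].shape)
    for I_dpc, H in zip(dpc_images, H_list):
        num += np.conj(H) * np.fft.fft2(I_dpc)
        den += np.abs(H) ** 2
    return np.real(np.fft.ifft2(num / (den + alpha)))

# Round-trip demo: forward-simulate DPC images from a known phase,
# then recover it.
n = 32
fx = np.fft.fftfreq(n)
H1 = 1j * np.sin(2 * np.pi * fx)[:, None] * np.ones((1, n))  # axis-0 PTF
H2 = 1j * np.sin(2 * np.pi * fx)[None, :] * np.ones((n, 1))  # axis-1 PTF
phi_true = np.sin(2 * np.pi * np.arange(n) / n)[:, None] * np.ones((1, n))
I1 = np.real(np.fft.ifft2(H1 * np.fft.fft2(phi_true)))
I2 = np.real(np.fft.ifft2(H2 * np.fft.fft2(phi_true)))
phi_rec = tikhonov_phase([I1, I2], [H1, H2], alpha=1e-6)
```

With a small `alpha` and noiseless data the recovery is essentially exact at all frequencies covered by at least one transfer function; `alpha` trades this fidelity for noise suppression on real data.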

3. Experimental results and discussion

3.1 Experimental setup

To validate the accuracy of our proposed method, we set up a qRPM experimental system on a Nikon Ti2-U inverted fluorescence microscope, including a 3D motorized stage used to control the sample’s position. An iris (GCM-5702 M, Daheng Optics, China) was placed at the native image plane (NIP) of the microscope. The radius of the iris was adjusted to ensure that the sub-images of the recorded raw light field on the image sensor did not overlap. A Fourier lens (${f_{fl}} = 75\,mm$, Daheng Optics, China) and a tube lens formed a $4f$ imaging system, relaying the back focal plane of the objective to the back focal plane of the FL. A commercial microlens array (MLA, MRLQ(DD)-500-15, Chengdu Nano Elite Opto-tech Co. Ltd, ${f_{ml}} = 15.3\,mm$, pitch = 500 µm, ${r_{ml}} = 250\,\mu m$) was placed at the back focal plane of the FL to segment the wavefront. Finally, the raw light field images were recorded by an image sensor (HIKROBOT, MV-CH050-10UM, pixel size = 3.45 µm) placed at the back focal plane of the MLA. The system uses a 20×, 0.4 NA objective lens (Nikon CFI Plan Apochromat 20×). A narrow-band filter with a central wavelength of 520 nm and a bandwidth of 10 nm (MBF10-520-10, Shenzhen LUBON Technology Co. Ltd.) was placed behind the illumination source to generate quasi-monochromatic light.

3.2 System imaging capability and data processing flow

The qRPM system is developed by inserting an MLA at a plane conjugate to the back focal plane of the objective lens of a traditional microscope and recording the far-field perspective view under each lenslet. Since the back focal plane of the objective lens is usually inaccessible, the proposed qRPM system uses a telecentric relay consisting of the TL and FL to conjugate the back focal plane of the objective lens with the MLA plane. A magnified image of the iris is recorded at the image sensor plane behind each lenslet. The iris controls the diameter of the elemental image (EI) on the image sensor through [69]:

$${R_{EI}} = {r_{iris}}{{ \cdot {f_{ml}}} / {{f_{fl}}}},$$
where ${r_{iris}}$ is the iris radius and ${R_{EI}}$ is the radius of the EI. In order to make full use of the sensor pixels, the EIs need to be mutually tangent to each other, so the aperture radius needs to satisfy [69]:
$${r_{iris}} = {r_{ml}} \cdot {{{f_{fl}}} / {{f_{ml}}}}.$$
The FOV depends on ${r_{iris}}$, and can be obtained by solving the equation:
$${D_{FOV}} = 2 \cdot {r_{iris}} \cdot {{{f_{obj}}} / {{f_{tube}}}},$$
which is 123 µm. It is worth noting that, since we used a pre-machined MLA with fixed optical parameters, its focal length is not optimal and the FOV performance is not the best. Switching to a customized MLA would not only achieve a larger FOV, but also better exploit the optical performance potential of the system, reaching the upper limits of the system specifications. The lateral resolution at the NIP is determined by the diffraction limit of the MLA [66]; according to the Abbe diffraction limit,
$${R_{NIP}} = ({{\lambda / {2N{A_{ml}}}}} ),$$
where $N{A_{ml}}$ is the numerical aperture of the lenslet. The converted ${R_{NIP}}$ to object space allows us to obtain lateral resolution:
$${R_{xy}} = ({{\lambda / {2N{A_{ml}}}}} )\times ({{{{f_{fl}}} / {{f_{ml}}}}} )\times ({{1 / M}} ),$$
which is 3.9 µm, where M is the magnification of the microscope objective. Assuming that the change of the 3D PSF within the axial range of interest is negligible, the axial resolution can be obtained directly from the ray optics [66],
$${R_z} = ({{\lambda / {2N{A_{ml}}}}} )\times ({{{{f_{fl}}} / {{f_{ml}}}}} )\times ({{1 / {\tan ({\theta^{\prime}} )}}} )\times ({{1 / {{M^2}}}} )= 20.6\mu m,$$
where $\tan ({\theta ^{\prime}} )= {{\sqrt 2 {d_{mla}}} / {{f_{fl}}}}$ and ${d_{mla}}$ is the pitch of the MLA. When the light source gradually moves away from the focal plane of the objective lens, the intensity recorded by the image sensor becomes weaker and weaker until it falls below the dynamic range of the image sensor, so theoretically the DOF can be calculated from the detectable intensity range. The full-width at half-maximum (FWHM) of the PSF of the central lenslet is [66,72]:
$$({{{{M_T}^2\lambda } / {N{A^2}}}} )+ [{{{P{M_T}^2} / {({M \cdot NA} )}}} ],$$
which is 132.53 µm, where ${M_T} = ({{{2{f_{fl}}NA} / {{f_{tube}}}}} )\times ({{{{f_{obj}}} / {{d_{mla}}}}} )$ is the ratio of the image plane diameter to the EI diameter, ${f_{obj}}$ is the focal length of the objective lens, P is the image sensor pixel size, and NA is the numerical aperture of the objective lens. In this work, the deconvolution method is used to reconstruct the object space, which can recover diffraction information outside the Rayleigh range of the axial PSF, so the full width of the PSF is used to describe the DOF. The proposed system can be regarded as a microscope array composed of multiple low numerical aperture microscopes; it has a high angular sampling rate and uses deconvolution algorithms to reconstruct the object space, thereby extending the DOF accordingly. In addition, when the light source moves away from the focal plane of the objective lens, the light field image is enlarged laterally, which limits the DOF by the MLA pitch to ${d_{mla}} \times ({{{{f_{fl}}} / {{f_{ml}}}}} )\times ({{1 / {\tan (\theta )}}} )\times ({{1 / {{M^2}}}} )= 919.1\mu m$, where $\tan (\theta ) = {{{d_{mla}}} / {{f_{fl}}}}$; once the distance between the light source and the focal plane of the objective lens exceeds this range, the light source will no longer be completely imaged by a single lenslet. Hence, the final DOF is 265.06 µm (twice the FWHM), since information beyond the smaller of these two limits cannot be recorded.
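The reported numbers for Eqs. (8)–(14) can be reproduced from the parameters of Sec. 3.1. In the check below, $f_{tube} = 200$ mm (the standard Nikon tube lens) and $f_{obj} = f_{tube}/M$ are our assumptions, and $\tan \theta^{\prime} = \sqrt 2 {d_{mla}}/{f_{fl}}$ is used, which reproduces the reported 20.6 µm axial resolution:

```python
import math

# Parameters from Sec. 3.1 (mm unless noted); f_tube and f_obj assumed.
lam = 0.52e-3                           # wavelength, 520 nm
f_ml, f_fl, f_tube = 15.3, 75.0, 200.0  # MLA, Fourier lens, tube lens
d_mla, r_ml = 0.5, 0.25                 # MLA pitch and lenslet radius
M, NA, P = 20.0, 0.4, 3.45e-3           # magnification, obj. NA, pixel
f_obj = f_tube / M

r_iris = r_ml * f_fl / f_ml                       # Eq. (9): tangent EIs
D_FOV = 2 * r_iris * f_obj / f_tube               # Eq. (10): ~123 um
NA_ml = r_ml / f_ml
R_xy = lam / (2 * NA_ml) * (f_fl / f_ml) / M      # Eq. (12): ~3.9 um
tan_tp = math.sqrt(2) * d_mla / f_fl
R_z = lam / (2 * NA_ml) * (f_fl / f_ml) / tan_tp / M**2   # Eq. (13)
M_T = (2 * f_fl * NA / f_tube) * (f_obj / d_mla)
FWHM = M_T**2 * lam / NA**2 + P * M_T**2 / (M * NA)       # Eq. (14)
DOF = 2 * FWHM                                    # full axial PSF width
DOF_pitch = d_mla * (f_fl / f_ml) / (d_mla / f_fl) / M**2 # pitch limit
```

Running these lines gives the values quoted in the text: DOF ≈ 265 µm is the smaller of the two limits, so it sets the final DOF.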

To evaluate the accuracy, practical resolution, and DOF of the proposed method, we performed 30 repeated measurements on 13 µm diameter PMMA microspheres (refractive index = 1.49, immersed in a liquid with a refractive index of 1.518) and a USAF target as reference artifacts for system evaluation. The data processing flow is shown in Fig. 2. The raw light field image is reconstructed into the FDI-LF (Fig. 2(a)); the four colored boxes (red, yellow, blue and green) represent light fields in four different illumination directions, which are deconvolved with the corresponding 3D PSFs (Fig. 2(b)) to obtain the intensity stacks of the different illumination directions (Fig. 2(c)). The deconvolution algorithm endows the system with strong slicing ability. The axial spacing d between adjacent reconstructed intensity images is set to 1 µm in order to differentiate features between slices. Figure 2(d) shows the DPC images calculated according to Eq. (1). Equations (2)–(4) and Eq. (7) describe the process of obtaining the quantitative phase, from which the refractive index is finally calculated according to $\Delta n = {{\lambda \varphi } / {2\pi d}}$. The first column of Fig. 2(e) shows the refractive index slices of the microspheres along the x-y, x-z and y-z directions, and the second column shows the 3D refractive index visualization of the microspheres. The uneven distribution of the light source results in inaccurate phase calculation, so an image of a uniform plane is collected for calibration. Figure 2(f) shows the profiles of the PMMA microspheres in the x-, y- and z-directions corresponding to the dotted line positions in Fig. 2(e). The refractive index of the PMMA microspheres is smaller than that of the surrounding liquid, so to make the diagram more intuitive, we inverted the ordinate.
Since the microspheres have obvious absorption at their edges, the reconstructed phase distribution exhibits halo artifacts, as shown in the yellow dotted circle in Fig. 2(f). The axial resolution in the actual experiment was 21.96 µm, which is consistent with the theoretical calculation. The reconstructed refractive index lies between 1.485 and 1.50 in Fig. 2(e)-(f), which is consistent with the actual refractive index of PMMA; the error comes from the use of the WOA and from temperature changes.
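The phase-to-refractive-index conversion $\Delta n = {{\lambda \varphi } / {2\pi d}}$ used above is a one-liner; a sketch with the paper's 520 nm wavelength and 1 µm slice spacing as defaults (function name is ours):

```python
import numpy as np

def phase_to_dn(phi, lam=520e-9, d=1e-6):
    """Refractive index contrast from a reconstructed phase slice,
    Delta n = lam * phi / (2 * pi * d), with the 1 um slice spacing
    used in the paper."""
    return lam * phi / (2 * np.pi * d)
```

For example, a full 2π phase delay accumulated over one 1 µm slice corresponds to a Δn of λ/d = 0.52 at 520 nm.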


Fig. 2. Data processing flow. (a) The raw light field image is reconstructed into FDI-LF, and the four colored boxes of red, yellow, blue and green respectively represent light fields in four different illumination directions. (b) Simulated 3D PSF. (c) shows the intensity stack of four different illumination directions. (d) shows the DPC image calculated according to Eq. (1). (e) The first column of (e) shows the refractive index slices of the microspheres along the x-y direction, x-z direction and y-z direction. The second column of (e) shows the 3D refractive index visualization of microspheres. (f) The profiles of PMMA microspheres in the x-direction, y-direction and z-direction corresponding to the dotted line position in (e).


DOF is an important parameter of an imaging system. The proposed qRPM system can be regarded as multiple low numerical aperture microscope systems that simultaneously collect wide-field images from multiple angles (Fig. 1(b)), which increases the FWHM of the PSF of the imaging system and expands the system DOF. In order to verify these advantages, a USAF target was placed at the $Z ={-} 80\mu m$, $Z ={-} 60\mu m$, $Z ={-} 40\mu m$, $Z ={-} 20\mu m$, $Z = 0\mu m$ (focal plane of the objective lens), $Z = 20\mu m$, $Z = 40\mu m$, $Z = 60\mu m$, and $Z = 80\mu m$ planes (a motorized stage was used to move the USAF target to different z-positions), and the absorption of the target at different depths was reconstructed through the self-developed data processing method. In Fig. 3, the intensity images and cross-sectional grayscale profiles shown on the right demonstrate that the DOF of the proposed system exceeds 160 µm. The intensity images are the absorption reconstructions of group 7, elements 5 and 6 of the USAF target, with linewidths of 2.46 µm and 2.19 µm. This clearly proves that the proposed system achieves a full-pitch resolution of 4.38 µm at all positions within the DOF, which is consistent with the theoretical results. The far-left side of Fig. 3 shows the intensity images recorded at the $Z ={\pm} 10\mu m$ and $Z ={\pm} 5\mu m$ planes with the same 20× objective lens on a traditional brightfield microscope. When the object depth is greater than 10 µm, the target cannot be recorded by the traditional microscope due to its limited DOF. This comparison fully demonstrates the superiority of the DOF extension of the proposed qRPM system.


Fig. 3. Comparison of DOF between traditional brightfield microscopy and our proposed method. The blue bar indicates that the conventional brightfield microscope has a DOF from −10 µm to 10 µm; the corresponding intensity images are shown on the left. The green bar indicates that our proposed method has a DOF from −80 µm to 80 µm; the intensity images are displayed on the right, and sectional grayscale profiles are also provided to validate the full-pitch resolution of 4.38 µm at all positions within the DOF.


To further demonstrate the advantages of our proposed method in terms of imaging speed, we compare traditional 3D DPC [24], IDT [29,38], FPDT [39], intensity-based holographic imaging via space-domain Kramers–Kronig relations [10], and our proposed method in terms of the number of intensity images required to reconstruct a 3D refractive index volume. The results are shown in Table 1. Only our proposed method is capable of imaging at the maximum capture speed of the image sensor. This shows that the proposed method offers greater potential for high-speed imaging while simultaneously and significantly reducing motion artifacts.
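The speed advantage summarized in Table 1 translates directly into volume rate: a method needing k intensity images per reconstruction divides the camera frame rate by k. A minimal sketch using the 74 Hz camera rate reported later in this paper and a hypothetical 100-frame multi-shot method (Table 1's actual frame counts are not reproduced here):

```python
def volume_rate_hz(camera_fps, frames_per_volume):
    """Effective 3D volume rate when one reconstruction needs several frames."""
    return camera_fps / frames_per_volume

camera_fps = 74  # maximum camera frame rate reported in this work
# Hypothetical frame counts for illustration only:
frame_counts = {"ssDPC (proposed)": 1, "hypothetical multi-shot method": 100}
rates = {name: volume_rate_hz(camera_fps, k) for name, k in frame_counts.items()}
```

A single-shot method is the only one whose volume rate equals the sensor's native frame rate; any multi-frame method also accumulates motion blur across its capture window.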


Table 1. Comparison of the number of intensity images required to reconstruct a 3D refractive index volume

3.3 Static object reconstruction with qRPM

A static sample of Spirogyra placed in aqueous solution (refractive index = 1.33) was imaged with qRPM using the 3D single-shot differential phase contrast process. Figure 4(a) shows a 3D visualization of the refractive index of the Spirogyra. The chloroplasts of the Spirogyra are arranged in a zigzag pattern, and the highly transmissive outer shell is also clearly reconstructed. The obtained refractive index of the Spirogyra, between 1.35 and 1.41, matches well with previous experimental measurements [73]. Figure 4(b) shows the DPC images, phase distribution and refractive index distribution of the Spirogyra at different depths. In the blue dotted box of Fig. 4(b), the left-right DPC fails to exhibit the lateral features of the Spirogyra, while the up-down DPC fills in the missing frequencies. Reconstructing phase from a single-axis DPC image leads to missing-frequency problems; in the proposed ssDPC method, combining DPC images along both directions reduces the missing frequencies. Figure 4(c) shows the x-y plane slice at $Z = 0\ \mu m$ and orthogonal maximum-intensity projections of the Spirogyra. Figure 4(d1) shows the refractive index distribution along the green dotted line in Fig. 4(c), clearly illustrating that a structure as small as 4.04 µm can be resolved in the Spirogyra sample. Figure 4(d2) shows the axial refractive index distribution along the red dotted line in Fig. 4(c), demonstrating that the system can resolve detail with an axial extent of around 23 µm. Accordingly, in the Spirogyra experiments the actual resolution of the system is consistent with the theoretical resolution.
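The quantitative refractive index values quoted above follow from the recovered phase through the standard integral phase model $\Delta\varphi = (2\pi/\lambda)(n_s - n_m)t$. A minimal sketch of the inversion with hypothetical numbers (a 10 µm thick feature, 550 nm light, 6 rad of phase delay; none of these specific values are from the paper):

```python
import math

def refractive_index_from_phase(phase_rad, thickness_m, wavelength_m, n_medium=1.33):
    """Invert the integral phase model: delta_phi = (2*pi/lambda)*(n_s - n_m)*t."""
    return n_medium + phase_rad * wavelength_m / (2 * math.pi * thickness_m)

# Hypothetical example: 6 rad of phase across a 10 um feature at 550 nm,
# in the 1.33-index aqueous medium used for the Spirogyra sample.
n_sample = refractive_index_from_phase(6.0, 10e-6, 550e-9)
```

With these numbers the recovered index lands in the 1.35-1.41 range reported for Spirogyra, illustrating how phase, thickness and wavelength jointly determine the quantitative result.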


Fig. 4. (a) 3D visualization of the refractive index of Spirogyra. (b) DPC images, phase distribution and refractive index distribution of Spirogyra at different depths. (c) x-y plane slice at $Z = 0\ \mu m$ and orthogonal maximum-intensity projections of the Spirogyra. (d1) Refractive index distribution along the green dotted line in (c). (d2) Axial refractive index distribution along the red dotted line in (c).


3.4 Dynamic object reconstruction with qRPM

To further demonstrate the advantages of our method for label-free 3D imaging of dynamic objects, Fig. 5, Visualization 1 and Visualization 2 show 3D refractive index imaging of a live paramecium (immersed in a liquid with a refractive index of 1.33) using qRPM. Visualization 1 shows the dynamic refractive index distribution of the paramecium at different reconstruction depths. The proposed system has high phase sensitivity, and the change in refractive index around the paramecium caused by the beating of its cilia can be clearly identified (dotted box in Visualization 1). Visualization 2 demonstrates the dynamic 3D refractive index distribution of the locomotive morphology of the paramecium, which can support behavioral analysis of the organism. The entire movement of the paramecium within the FOV is recorded at an imaging speed of 74 Hz (the maximum frame rate of the camera). As shown in Fig. 5, a few specific moments are selected to demonstrate the locomotive morphology of the paramecium. Figure 5(a) is the 3D visualization of the refractive index of the paramecium. Figure 5(b) shows the refractive index information at different depths. From 0.45 s to 0.62 s, algae being swallowed by the paramecium can be clearly seen, highlighted in the yellow boxes. This is a vivid demonstration of the excellent temporal resolution of the proposed qRPM system. Slices in different axial planes, with an adjustable reconstruction step, demonstrate the good optical-sectioning ability of the system. Figures 5(c1) and (c2) are enlarged refractive index distributions at the $Z = 0\ \mu m$ position at $t = 0.45\ s$ and $t = 0.62\ s$, together with the x-z and y-z slices at the positions indicated by the white dotted lines. In the x-z and y-z slices of Figs. 5(c1) and (c2), the corresponding information for the algae can be clearly found (also indicated by yellow boxes), proving that the algae are located inside the paramecium and further confirming the interaction between the paramecium and the algae. These results vividly prove that qRPM is capable of observing and quantitatively analyzing dynamic organisms in a label-free manner and without motion artifacts. Figure 5(d1) shows the refractive index distribution along the green line (lateral plane) in Fig. 5(c1), and Fig. 5(d2) shows the refractive index distribution along the red line (axial direction) in Fig. 5(c2); these clearly show that the proposed system achieves a best full-pitch resolution of 3.33 µm and an axial resolution of 18.69 µm.
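The full-pitch resolution figures read off Figs. 5(d1) and (d2) amount to measuring peak-to-peak spacing along a line profile. A minimal sketch on a synthetic two-peak profile (the 0.37 µm/pixel sampling and the profile itself are made up for illustration, not taken from the system):

```python
import numpy as np

def peak_spacing_um(profile, pixel_um):
    """Estimate feature spacing (full pitch) from a 1-D profile by locating
    local maxima; a crude stand-in for the dashed-line measurements in Fig. 5."""
    p = np.asarray(profile, dtype=float)
    peaks = [i for i in range(1, len(p) - 1) if p[i] > p[i - 1] and p[i] >= p[i + 1]]
    if len(peaks) < 2:
        return None
    return float(np.mean(np.diff(peaks))) * pixel_um

# Synthetic profile: two Gaussian peaks 3.33 um apart, sampled at 0.37 um/pixel.
x = np.arange(40) * 0.37
profile = np.exp(-((x - 5.0) / 0.8) ** 2) + np.exp(-((x - 8.33) / 0.8) ** 2)
spacing = peak_spacing_um(profile, 0.37)  # close to the 3.33 um full pitch
```

On real data one would interpolate around the discrete maxima for sub-pixel accuracy; the discrete version here is within one pixel of the true spacing.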


Fig. 5. (a) 3D visualization of the refractive index of the paramecium. (b) Refractive index information at different depths and different points on the timeline. (c1)-(c2) Enlarged refractive index distributions at the $Z = 0\ \mu m$ position at $t = 0.45\ s$ and $t = 0.62\ s$, with x-z and y-z slices at the positions of the white dotted lines. (d1) and (d2) Refractive index distributions along the green and red lines in (c1) and (c2), respectively.


4. Conclusion

In summary, a quantitative real-time phase microscopy (qRPM) system is developed based on the proposed 3D single-shot differential phase contrast (ssDPC) imaging method. No complex optical path adjustment is needed: a scan-free process, without any movement of the sample or imaging module, yields the 3D refractive index reconstruction of the sample; only a novel imaging module consisting of an MLA and an image sensor needs to be added to a traditional microscope. Additionally, the proposed system does not reduce the sampling frequency of the image sensor and does not sacrifice the sensor's inherent acquisition capability in exchange for enhanced imaging capability. Moreover, our proposed method reconstructs the 3D refractive index distribution from only a single frame. To verify the quantitative refractive index capability of the proposed imaging method and its associated qRPM system, PMMA microspheres of known size and refractive index were used as artefacts in repeated measurements. Imaging a resolution target over a wide depth range shows that the proposed system extends the DOF of a 20×, NA = 0.4 microscope objective to 160 µm, nearly 30 times the 6 µm DOF of the traditional microscope system. Imaging the transparent shell of algae demonstrates the performance of the proposed system on real static samples. We have achieved 74 Hz high-speed imaging of a free-moving paramecium, proving the dynamic imaging capability of the system. As the need for label-free imaging grows, our approach could open up many biomedical imaging applications. Considering these advantages, it is promising to develop a modularized qRPM component that can be attached to an existing microscope to realize ssDPC imaging of live objects in the future.

In this research, an MLA is incorporated to obtain 4D light field data, which inevitably leads to a trade-off between spatial resolution and angular resolution. This prevents the system resolution from reaching the resolution limit of the microscope objective. Some recent works have achieved high-resolution 3D imaging by increasing the imaging throughput through high-speed vibration of the MLA and eliminating aberrations using digital adaptive optics [74]. Other research teams have overcome the requirement that the illumination match the numerical aperture of the objective lens in quantitative phase imaging by acquiring through-focus intensity images [75,76]. Through-focus intensity stacks can be easily obtained with light field microscopy, which offers a novel route around numerical aperture matching. These directions will be considered in our future work.

Funding

National Natural Science Foundation of China (52375549); Nankai University Eye Institute (NKYKD202204, NKYKK202209); China Postdoctoral Science Foundation (2022M721695).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. J. W. Goodman and P. Sutton, “Introduction to Fourier optics,” Quantum Semiclassical Opt.-J. Eur. Opt. Soc. Part B 8, 1095 (1996).

2. M. Born and E. Wolf, Principles of optics: electromagnetic theory of propagation, interference and diffraction of light (Elsevier, 2013).

3. Y. Park, C. Depeursinge, and G. Popescu, “Quantitative phase imaging in biomedicine,” Nat. Photonics 12(10), 578–589 (2018). [CrossRef]  

4. H. Giloh and J. W. Sedat, “Fluorescence microscopy: reduced photobleaching of rhodamine and fluorescein protein conjugates by n-propyl gallate,” Science 217(4566), 1252–1255 (1982). [CrossRef]  

5. J. Chen, H. Sasaki, H. Lai, et al., “Three-dimensional residual channel attention networks denoise and sharpen fluorescence microscopy image volumes,” Nat. Methods 18(6), 678–687 (2021). [CrossRef]  

6. D. J. Stephens and V. J. Allan, “Light microscopy techniques for live cell imaging,” Science 300(5616), 82–86 (2003). [CrossRef]  

7. F. Zernike, “Phase contrast, a new method for the microscopic observation of transparent objects,” Physica 9(7), 686–698 (1942). [CrossRef]  

8. G. Nomarski, “Differential microinterferometer with polarized waves,” J. Phys. Radium Paris 16, 9S (1955).

9. P. Ledwig and F. E. Robles, “Quantitative 3D refractive index tomography of opaque samples in epi-mode,” Optica 8(1), 6–14 (2021). [CrossRef]  

10. Y. Baek and Y. Park, “Intensity-based holographic imaging via space-domain Kramers–Kronig relations,” Nat. Photonics 15(5), 354–360 (2021). [CrossRef]  

11. K. Kim, Z. Yaqoob, K. Lee, et al., “Diffraction optical tomography using a quantitative phase imaging unit,” Opt. Lett. 39(24), 6935–6938 (2014). [CrossRef]  

12. Y. Sung, W. Choi, C. Fang-Yen, et al., “Optical diffraction tomography for high resolution live cell imaging,” Opt. Express 17(1), 266–277 (2009). [CrossRef]  

13. L. Tian and L. Waller, “Quantitative differential phase contrast imaging in an LED array microscope,” Opt. Express 23(9), 11394–11403 (2015). [CrossRef]  

14. C. Zuo, Q. Chen, W. Qu, et al., “High-speed transport-of-intensity phase microscopy with an electrically tunable lens,” Opt. Express 21(20), 24060–24075 (2013). [CrossRef]  

15. L. Lu, J. Li, Y. Shu, et al., “Hybrid brightfield and darkfield transport of intensity approach for high-throughput quantitative phase microscopy,” Adv. Photonics 4(05), 056002 (2022). [CrossRef]  

16. S. Chowdhury, W. J. Eldridge, A. Wax, et al., “Refractive index tomography with structured illumination,” Optica 4(5), 537–545 (2017). [CrossRef]  

17. W. Choi, C. Fang-Yen, K. Badizadegan, et al., “Tomographic phase microscopy,” Nat. Methods 4(9), 717–719 (2007). [CrossRef]  

18. Y. Cotte, F. Toy, P. Jourdain, et al., “Marker-free phase nanoscopy,” Nat. Photonics 7(2), 113–117 (2013). [CrossRef]  

19. F. Momey, A. Berdeu, T. Bordy, et al., “Lensfree diffractive tomography for the imaging of 3D cell cultures,” Biomed. Opt. Express 7(3), 949–962 (2016). [CrossRef]  

20. D. Dong, X. Huang, L. Li, et al., “Super-resolution fluorescence-assisted diffraction computational tomography reveals the three-dimensional landscape of the cellular organelle interactome,” Light: Sci. Appl. 9(1), 11 (2020). [CrossRef]  

21. K. Kim, J. Yoon, S. Shin, et al., “Optical diffraction tomography techniques for the study of cell pathophysiology,” J. Biomed. Photonics Eng. 2, 020201 (2016).

22. T. Kim, R. Zhou, M. Mir, et al., “White-light diffraction tomography of unlabelled live cells,” Nat. Photonics 8(3), 256–263 (2014). [CrossRef]  

23. G. Gbur and E. Wolf, “Diffraction tomography without phase information,” Opt. Lett. 27(21), 1890–1892 (2002). [CrossRef]  

24. M. Chen, L. Tian, and L. Waller, “3D differential phase contrast microscopy,” Biomed. Opt. Express 7(10), 3940–3950 (2016). [CrossRef]  

25. J. M. Soto, J. A. Rodrigo, and T. Alieva, “Optical diffraction tomography with fully and partially coherent illumination in high numerical aperture label-free microscopy,” Appl. Opt. 57(1), A205–A214 (2018). [CrossRef]  

26. Y. Fan, J. Sun, Q. Chen, et al., “Optimal illumination scheme for isotropic quantitative differential phase contrast microscopy,” Photonics Res. 7(8), 890–904 (2019). [CrossRef]  

27. J. Li, Q. Chen, J. Sun, et al., “Optimal illumination pattern for transport-of-intensity quantitative phase microscopy,” Opt. Express 26(21), 27599–27614 (2018). [CrossRef]  

28. J. Li, N. Zhou, Z. Bai, et al., “Optimization analysis of partially coherent illumination for refractive index tomographic microscopy,” Opt. Lasers Eng. 143, 106624 (2021). [CrossRef]  

29. R. Ling, W. Tahir, H.-Y. Lin, et al., “High-throughput intensity diffraction tomography with a computational microscope,” Biomed. Opt. Express 9(5), 2130–2141 (2018). [CrossRef]  

30. N. Zhou, J. Li, J. Sun, et al., “Single-exposure 3D label-free microscopy based on color-multiplexed intensity diffraction tomography,” Opt. Lett. 47(4), 969–972 (2022). [CrossRef]  

31. C. Lee, Y. Baek, H. Hugonnet, et al., “Single-shot wide-field topography measurement using spectrally multiplexed reflection intensity holography via space-domain Kramers–Kronig relations,” Opt. Lett. 47(5), 1025–1028 (2022). [CrossRef]  

32. Y.-J. Chen, Y.-Z. Lin, S. Vyas, et al., “Time-lapse imaging using dual-color coded quantitative differential phase contrast microscopy,” J. Biomed. Opt. 27(05), 056002 (2022). [CrossRef]  

33. G. Kim, M. Lee, S. Youn, et al., “Measurements of three-dimensional refractive index tomography and membrane deformability of live erythrocytes from Pelophylax nigromaculatus,” Sci. Rep. 8(1), 9192 (2018). [CrossRef]  

34. J. A. Rodrigo, J. M. Soto, and T. Alieva, “Fast label-free microscopy technique for 3D dynamic quantitative imaging of living cells,” Biomed. Opt. Express 8(12), 5507–5517 (2017). [CrossRef]  

35. A. Matlock and L. Tian, “High-throughput, volumetric quantitative phase imaging with multiplexed intensity diffraction tomography,” Biomed. Opt. Express 10(12), 6432–6448 (2019). [CrossRef]  

36. L. Tian, Z. Liu, L.-H. Yeh, et al., “Computational illumination for high-speed in vitro Fourier ptychographic microscopy,” Optica 2(10), 904–911 (2015). [CrossRef]  

37. L. Tian, X. Li, K. Ramchandran, et al., “Multiplexed coded illumination for Fourier Ptychography with an LED array microscope,” Biomed. Opt. Express 5(7), 2376–2389 (2014). [CrossRef]  

38. J. Li, A. Matlock, Y. Li, et al., “High-speed in vitro intensity diffraction tomography,” Adv. Photonics 1, 066004 (2019). [CrossRef]  

39. S. Zhou, J. Li, J. Sun, et al., “Accelerated Fourier ptychographic diffraction tomography with sparse annular LED illuminations,” J. Biophotonics 15(3), e202100272 (2022). [CrossRef]  

40. C.-W. Chen, M. Cho, Y.-P. Huang, et al., “Three-dimensional imaging with axially distributed sensing using electronically controlled liquid crystal lens,” Opt. Lett. 37(19), 4125–4127 (2012). [CrossRef]  

41. M. Martı, P.-Y. Hsieh, A. Doblas, et al., “Fast axial-scanning widefield microscopy with constant magnification and resolution,” J. Display Technol. 11(11), 913–920 (2015). [CrossRef]  

42. J. Li, Q. Chen, J. Sun, et al., “Three-dimensional tomographic microscopy technique with multi-frequency combination with partially coherent illuminations,” Biomed. Opt. Express 9(6), 2526–2542 (2018). [CrossRef]  

43. Y. Chen, H. Liu, Y. Zhou, et al., “Extended the depth of field and zoom microscope with varifocal lens,” Sci. Rep. 12(1), 11015 (2022). [CrossRef]  

44. G. Barbera, R. Jun, Y. Zhang, et al., “A miniature fluorescence microscope for multi-plane imaging,” Sci. Rep. 12(1), 16686 (2022). [CrossRef]  

45. P.-Y. Hsieh, P.-Y. Chou, H.-A. Lin, et al., “Long working range light field microscope with fast scanning multifocal liquid crystal microlens array,” Opt. Express 26(8), 10981–10996 (2018). [CrossRef]  

46. X. Shen, Y.-J. Wang, H.-S. Chen, et al., “Extended depth-of-focus 3D micro integral imaging display using a bifocal liquid crystal lens,” Opt. Lett. 40(4), 538–541 (2015). [CrossRef]  

47. Y.-H. Lin, M.-S. Chen, and H.-C. Lin, “An electrically tunable optical zoom system using two composite liquid crystal lenses with a large zoom ratio,” Opt. Express 19(5), 4714–4721 (2011). [CrossRef]  

48. M. Mino and Y. Okano, “Improvement in the OTF of a defocused optical system through the use of shaded apertures,” Appl. Opt. 10(10), 2219–2225 (1971). [CrossRef]  

49. U. Akpinar, E. Sahin, M. Meem, et al., “Learning wavefront coding for extended depth of field imaging,” IEEE Trans. on Image Process. 30, 3307–3320 (2021). [CrossRef]  

50. Y. Wu, V. Boominathan, H. Chen, et al., “Phasecam3d—learning phase masks for passive single view depth estimation,” in 2019 IEEE International Conference on Computational Photography (ICCP), (IEEE, 2019), 1–12.

51. O. E. Olarte, J. Andilla, D. Artigas, et al., “Decoupled illumination detection in light sheet microscopy for fast volumetric imaging,” Optica 2(8), 702–705 (2015). [CrossRef]  

52. J. Greene, Y. Xue, J. Alido, et al., “Pupil engineering for extended depth-of-field imaging in a fluorescence miniscope,” Neurophotonics 10(04), 044302 (2023). [CrossRef]  

53. S. Chen and Z. Fan, “Optimized asymmetrical tangent phase mask to obtain defocus invariant modulation transfer function in incoherent imaging systems,” Opt. Lett. 39(7), 2171–2174 (2014). [CrossRef]  

54. S. Zhang, G. Zhou, C. Zheng, et al., “Fast digital refocusing and depth of field extended Fourier ptychography microscopy,” Biomed. Opt. Express 12(9), 5544–5558 (2021). [CrossRef]  

55. L. Tian and L. Waller, “3D intensity and phase imaging from light field measurements in an LED array microscope,” Optica 2(2), 104–111 (2015). [CrossRef]  

56. Y. Wu, Y. Rivenson, Y. Zhang, et al., “Extended depth-of-field in holographic imaging using deep-learning-based autofocusing and phase recovery,” Optica 5(6), 704–710 (2018). [CrossRef]  

57. L. Huang, T. Liu, X. Yang, et al., “Holographic image reconstruction with phase recovery and autofocusing using recurrent neural networks,” ACS Photonics 8(6), 1763–1774 (2021). [CrossRef]  

58. A. Llavador, J. Sola-Pikabea, G. Saavedra, et al., “Resolution improvements in integral microscopy with Fourier plane recording,” Opt. Express 24(18), 20792–20798 (2016). [CrossRef]  

59. B. Kachar, “Asymmetric illumination contrast: a method of image formation for video light microscopy,” Science 227(4688), 766–768 (1985). [CrossRef]  

60. B. Xiong, X. Li, Y. Zhou, et al., “Snapshot Partially Coherent Diffraction Tomography,” Phys. Rev. Appl. 15(4), 044048 (2021). [CrossRef]  

61. D. K. Hamilton, C. J. R. Sheppard, and T. Wilson, “Improved imaging of phase gradients in scanning optical microscopy,” J. Microsc. 135(3), 275–286 (1984). [CrossRef]  

62. N. Streibl, “Three-dimensional imaging by a microscope,” J. Opt. Soc. Am. A 2(2), 121–127 (1985). [CrossRef]  

63. C. Sheppard and A. Choudhury, “Image formation in the scanning microscope,” Opt. Acta 24(10), 1051–1073 (1977). [CrossRef]  

64. H. H. Hopkins, “On the diffraction theory of optical images,” Proc. R. Soc. Lond. A 217(1130), 408–432 (1953). [CrossRef]  

65. C. Zuo, J. Li, J. Sun, et al., “Transport of intensity equation: a tutorial,” Opt. Lasers Eng. 135, 106187 (2020). [CrossRef]  

66. C. Guo, W. Liu, X. Hua, et al., “Fourier light-field microscopy,” Opt. Express 27(18), 25573–25594 (2019). [CrossRef]  

67. A. C. Kak and M. Slaney, “Algorithms for reconstruction with nondiffracting sources,” in Principles of Computerized Tomographic Imaging (SIAM, 2001), pp. 49–112.

68. F. Dell’Acqua, G. Rizzo, P. Scifo, et al., “A model-based deconvolution approach to solve fiber crossing in diffusion-weighted MR imaging,” IEEE Trans. Biomed. Eng. 54(3), 462–472 (2007). [CrossRef]  

69. A. Stefanoiu, G. Scrofani, G. Saavedra, et al., “What about computational super-resolution in fluorescence Fourier light field microscopy?” Opt. Express 28(11), 16554–16568 (2020). [CrossRef]  

70. A. Stefanoiu, J. Page, P. Symvoulidis, et al., “Artifact-free deconvolution in light field microscopy,” Opt. Express 27(22), 31644–31666 (2019). [CrossRef]  

71. M. Bertero, P. Boccacci, and C. De Mol, Introduction to inverse problems in imaging (CRC press, 2021).

72. L. Galdón, G. Saavedra, J. Garcia-Sucerquia, et al., “Fourier lightfield microscopy: a practical design guide,” Appl. Opt. 61(10), 2558–2564 (2022). [CrossRef]  

73. E. Aas, “Refractive index of phytoplankton derived from its metabolite composition,” J. Plankton Res. 18(12), 2223–2249 (1996). [CrossRef]  

74. J. Wu, Y. Guo, C. Deng, et al., “An integrated imaging sensor for aberration-corrected 3D photography,” Nature 612(7938), 62–71 (2022). [CrossRef]  

75. J. Li, N. Zhou, J. Sun, et al., “Transport of intensity diffraction tomography with non-interferometric synthetic aperture for three-dimensional label-free microscopy,” Light: Sci. Appl. 11(1), 154 (2022). [CrossRef]  

76. S. Zhou, J. Li, J. Sun, et al., “Transport-of-intensity Fourier ptychographic diffraction tomography: defying the matched illumination condition,” Optica 9(12), 1362–1373 (2022). [CrossRef]  

Supplementary Material (3)

Name       Description
Visualization 1       Shows the dynamic refractive index distribution of Paramecium at different reconstruction depths.
Visualization 2       Shows the dynamic 3D refractive index distribution of Paramecium.
Visualization 3       Shows the dynamic refractive index distribution of Paramecium at different reconstruction depths.


Fig. 1. (a) A schematic of the experimental setup for quantitative real-time phase microscopy (qRPM). (b) Each lenslet and the objective lens constitute a virtual microscopic imaging system to obtain intensity images at different illumination angles.


Equations (14)

$$I_{DPC,n} = \frac{I_{D_{n,1}} - I_{D_{n,2}}}{I_{D_{n,1}} + I_{D_{n,2}}},$$

$$I(\mathbf{r}) = \int \left| \mathcal{F}^{-1}\left\{ \mathcal{F}[E(\mathbf{r})o(\mathbf{r})]\, P(\mathbf{u}) \right\} \right|^2 d^2\mathbf{u},$$

$$\mathcal{F}[I(\mathbf{r})] = I_0\, \mathcal{F}[\delta(\mathbf{r})] + H_{abs}(\mathbf{u})\, \mathcal{F}[\mu(\mathbf{r})] + H_{phi}(\mathbf{u})\, \mathcal{F}[\varphi(\mathbf{r})],$$

$$H_{phi} = WOTF(\mathbf{u}) - WOTF^{*}(-\mathbf{u}).$$

$$\begin{cases} LF_{up}(x,y) = \sum_{\upsilon=1}^{N} \sum_{\rho=1}^{N/2} LF(x,y,\upsilon,\rho) \\ LF_{down}(x,y) = \sum_{\upsilon=1}^{N} \sum_{\rho=N/2}^{N} LF(x,y,\upsilon,\rho) \\ LF_{left}(x,y) = \sum_{\upsilon=1}^{N/2} \sum_{\rho=1}^{N} LF(x,y,\upsilon,\rho) \\ LF_{right}(x,y) = \sum_{\upsilon=N/2}^{N} \sum_{\rho=1}^{N} LF(x,y,\upsilon,\rho), \end{cases}$$

$$U_{sens}(x_s, y_s; \chi) = \mathcal{F}^{-1}\left\{ \mathcal{F}\left\{ U_{AS}\!\left(\frac{x_{mla}}{M_{relay}}, \frac{y_{mla}}{M_{relay}}; \chi\right) T(x_{mla}, y_{mla}) \right\} \exp\left( i k f_{ml} \sqrt{1 - (\lambda f_X)^2 - (\lambda f_Y)^2} \right) \right\},$$

$$\varphi_{tik}(\mathbf{r}) = \mathcal{F}^{-1}\left\{ \frac{\sum_j H_{phi,j}^{DPC*}(\mathbf{u})\, \mathcal{F}[I_{DPC,n,j}(\mathbf{r})]}{\sum_j \left| H_{phi,j}^{DPC}(\mathbf{u}) \right|^2 + \alpha} \right\},$$

$$R_{EI} = r_{iris}\, f_{ml} / f_{fl},$$

$$r_{iris} = r_{ml}\, f_{fl} / f_{ml}.$$

$$D_{FOV} = 2\, r_{iris}\, f_{obj} / f_{tube},$$

$$R_{NIP} = \lambda / (2\, NA_{ml}),$$

$$R_{xy} = \frac{\lambda}{2\, NA_{ml}} \times \frac{f_{fl}}{f_{ml}} \times \frac{1}{M},$$

$$R_z = \frac{\lambda}{2\, NA_{ml}} \times \frac{f_{fl}}{f_{ml}} \times \frac{1}{\tan\theta} \times \frac{1}{M^2} = 20.6\ \mu m,$$

$$\frac{M_T^2\, \lambda}{NA^2} + \frac{P\, M_T^2}{M \cdot NA},$$
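The DPC normalization of Eq. (1) and the Tikhonov-regularized phase inversion can be sketched numerically. A minimal 2-D illustration with synthetic, purely illustrative antisymmetric transfer functions (the system's real weak-object transfer functions are not reproduced here):

```python
import numpy as np

def dpc_image(i1, i2, eps=1e-12):
    """Normalized DPC signal from two complementary half-aperture images (Eq. (1))."""
    return (i1 - i2) / (i1 + i2 + eps)

def tikhonov_phase(dpc_list, ptf_list, alpha=1e-3):
    """Tikhonov-regularized inversion over several DPC axes:
    phi = F^-1{ sum_j conj(H_j) F[I_dpc_j] / (sum_j |H_j|^2 + alpha) }."""
    num = sum(np.conj(h) * np.fft.fft2(d) for d, h in zip(dpc_list, ptf_list))
    den = sum(np.abs(h) ** 2 for h in ptf_list) + alpha
    return np.real(np.fft.ifft2(num / den))

# Synthetic demo: a Gaussian phase object, two made-up antisymmetric phase
# transfer functions (NOT the paper's WOTFs), forward simulation, then inversion.
n = 64
fx = np.fft.fftfreq(n)[None, :]
fy = np.fft.fftfreq(n)[:, None]
h_lr = 1j * np.sin(2 * np.pi * fx)  # hypothetical left-right PTF
h_ud = 1j * np.sin(2 * np.pi * fy)  # hypothetical up-down PTF
yy, xx = np.mgrid[:n, :n]
phase = np.exp(-(((xx - 32.0) ** 2 + (yy - 32.0) ** 2) / 50.0))
raw_lr = np.real(np.fft.ifft2(h_lr * np.fft.fft2(phase)))
raw_ud = np.real(np.fft.ifft2(h_ud * np.fft.fft2(phase)))
# Build half-aperture image pairs so Eq. (1) applies, then invert:
dpc_lr = dpc_image(1.0 + 0.5 * raw_lr, 1.0 - 0.5 * raw_lr)  # ~raw_lr / 2
dpc_ud = dpc_image(1.0 + 0.5 * raw_ud, 1.0 - 0.5 * raw_ud)
phi = tikhonov_phase([dpc_lr, dpc_ud], [h_lr, h_ud])
```

Because both synthetic transfer functions vanish at zero frequency, the DC component of the phase is lost (as in real DPC), but the recovered map correlates strongly with the ground truth after mean subtraction; combining the two axes also fills each axis's missing frequencies, as the Spirogyra results illustrate.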
© Copyright 2024 | Optica Publishing Group. All rights reserved, including rights for text and data mining and training of artificial technologies or similar technologies.