
Computational hyperspectral light-sheet microscopy


Abstract

We describe a computational light-sheet microscope designed for hyperspectral acquisition at high spectral resolution. The fluorescence light emitted from the full field-of-view is focused along the entrance slit of an imaging spectrometer using a cylindrical lens. To acquire the spatial dimension orthogonal to the slit of the spectrometer, we propose to illuminate the specimen with a sequence of structured light patterns and to solve the resulting image reconstruction problem. Beam shaping is obtained simply using a digital micromirror device in conjunction with a traditional selective plane illumination microscopy setup. We demonstrate the feasibility of this method and report the first in vivo results in hydra specimens labeled with two fluorophores.

© 2022 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Optical imaging has become an invaluable tool in the life sciences [1–3]. Among the variety of techniques now available, selective plane illumination microscopy (SPIM) allows fast $(x,y,z)$ imaging of fluorescent samples with reduced photobleaching. SPIM directly acquires the $(x,y)$ slice corresponding to a thin light sheet that illuminates the sample [4], while the third spatial dimension is scanned. Promoted by the OpenSPIM project [5], many SPIM design variants are now available [6–9]. This enables the study of various samples, such as fly embryos [10], zebrafish embryos [11,12], and others [13]. Recent light-sheet developments have promoted structured illumination to reduce photobleaching [14–17] and/or to provide better resolution with non-Gaussian beams [3,18,19]. In all cases, SPIM exploits the fluorescence signal emitted by fluorophores that label specific structures in a specimen [20]. The study of multi-labeled specimens requires unmixing of the fluorophores, which usually relies on optical filters. Filter-based methods are simple and effective; however, the light outside their passband is lost, and fluorophores with overlapping spectra cannot be unmixed [21]. Therefore, there is a need for three-dimensional (3D) imagers with hyperspectral capabilities that can exploit the full emission spectrum of a fluorescent sample.

Only a few methods have been proposed to acquire the full spectrum over the whole field-of-view of a sample [22–24]. The first demonstration of hyperspectral SPIM relied on a pair of galvanometer mirrors that map an illumination line onto the entrance slit of a diffractive unit [22]. The hyperspectral slice is obtained by scanning the line within the detection plane. The technique can provide excellent spectral resolution, down to the nanometer level, which allows overlapping fluorophores to be resolved. Image mapping spectrometry (IMS) is an alternative to line scanning. IMS maps a hyperspectral slice directly onto a two-dimensional (2D) sensor using multiple prisms. The IMS-SPIM set-up proposed by [25] allowed for fast acquisitions (i.e., within seconds). However, it requires custom-designed optical elements (i.e., a prism-lenslet array) and suffers from cross-talk and leakage between adjacent pupils. Another disadvantage is the compromise between spectral and spatial resolution, which leads to limited spectral resolution (e.g., tens of channels). Another option to scan a hyperspectral cube is tunable hyperspectral fluorescence microscopy [24], based on a lenslet-array tunable snapshot imaging spectrometer. This solution also requires complex optics to achieve a hyperspectral image, and can suffer from cross-talk and leakage between adjacent pupils. Other spectral methods designed for Raman imaging have intrinsically good spatial resolution over a wide field-of-view, but they require multiple acquisitions and dedicated processing to access the spectral dimension [6,26]. For instance, Fourier-transform spectroscopy acquires the Fourier transform along the spectral dimension of a hypercube [6].

In this paper, we introduce the concept of computational light-sheet microscopy for high-spectral-resolution imaging. We propose to feed the fluorescence signal from the light sheet to the entrance slit of a spectrometer through a cylindrical lens. Therefore, the raw measurements benefit from the full spectral resolution of the spectrometer, and no moving parts are needed, contrary to [22]. To access the spatial dimension orthogonal to the slit, we exploit a sequence of spatially encoded light sheets from which we formulate an image reconstruction problem. We first provide an overview of the proposed system, describing the basic concept as well as the experimental set-up. Next, we describe how to measure the light patterns experimentally, and several strategies to incorporate this knowledge into the image reconstruction step. We finally characterize our system and provide reconstruction results from experimental data, including in vivo measurements.

2. System overview

2.1 Basic concept

We consider the acquisition of a four-dimensional (4D) hypercube $\Omega =(x,y,z,\lambda )$, where $(x,y,z)$ is the voxel position and $\lambda$ is the wavelength. Fluorescence light-sheet microscopy images the 3D volume $(x,y,z)$ by repeated acquisitions of 2D optical sections $(x,y)$ at multiple locations along the $z$-axis. In the remainder of the paper, we describe the acquisition of a single hyperspectral fluorescence section $f(x,y,\lambda )$, as the $z$-axis can be obtained by translating the sample.

In standard light-sheet microscopy, the light sheet is designed to be uniform in the $(x,y)$-plane. Here, we propose to modulate the illumination along the $x$-axis and to use a cylindrical lens to focus the light emitted by the sample onto the slit of a spectrometer, which is oriented along the $y$-axis (see Fig. 1). Measurements are repeated for a set of illumination patterns $\{p_k\}_{1\le k \le N_k}$, which leads to the set of measurements $\{m_k^{\ell }(y,\lambda )\}_{1\le k \le N_k}$ that can be modeled as

$$m_k(y,\lambda) = \int p_k(x)f(x,y,\lambda)\,{{\rm d}}x.$$

Fig. 1. Overview of the proposed computational hyperspectral light-sheet microscope. We consider the acquisition of the hyperspectral section $f(x,y,\lambda )$. We use multiple illumination patterns $\{p_k\}_{1\le k \le N_k}$ that are modulated along the $x$-axis. The fluorescence light emitted by the sample is focused onto the slit of a spectrometer, to provide the raw measurements $\{m_k^{\ell }(y,\lambda )\}_{1\le k \le N_k}$. Then, the hyperspectral section can be reconstructed numerically from the raw measurements.


In a discrete setting, we denote $\mathbf {m}_\lambda ^{y} \in \mathbb {R}^{N_k}$ as the measurements obtained for all illumination patterns, at vertical position $y$ in the spectral channel $\lambda$, where $N_k$ is the number of illumination patterns. The discrete forward model reads

$$\mathbf{m}_\lambda^{y} = \mathbf{P} \ \mathbf{f}_\lambda^{y},$$
where $\mathbf {P} \in \mathbb {R}^{N_k \times N_x}$ contains the illumination patterns, and $\mathbf {f}_\lambda ^{y} \in \mathbb {R}^{N_x}$ is the fluorescence profile at vertical position $y$ in the spectral channel $\lambda$. Note that the number of pixels along the (compressed) $x$-axis is denoted by $N_x$. When $N_k = N_x$, and assuming that $\mathbf {P}$ is an orthogonal matrix (e.g., Hadamard, Fourier, or wavelet patterns), the fluorescence profile can be reconstructed by
$$\mathbf{f}_\lambda^{y} = \mathbf{P}{{}^{\top}}\mathbf{m}_\lambda^{y}.$$

In the following, we consider the Hadamard matrix, which maximizes the signal-to-noise ratio of the reconstruction [27].
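
The forward model of Eq. (2) and its orthogonal inversion in Eq. (3) can be summarized in a few lines of code. The sketch below uses SciPy's `hadamard` routine and a synthetic fluorescence profile (both are illustrative choices, not the paper's data or code) to show how a normalized Hadamard matrix yields an exact reconstruction by simple transposition.

```python
import numpy as np
from scipy.linalg import hadamard

# Minimal sketch of Eqs. (2)-(3): simulate measurements of a fluorescence
# profile with orthogonal Hadamard patterns, then invert by transposition.
# Sizes and the toy profile are illustrative only.
N_x = 64                                  # pixels along the modulated x-axis
P = hadamard(N_x).astype(float)           # N_k = N_x Hadamard patterns
P /= np.sqrt(N_x)                         # normalize so that P.T @ P = I

f_true = np.exp(-0.5 * ((np.arange(N_x) - 20) / 4.0) ** 2)  # toy profile f_lambda^y

m = P @ f_true                            # forward model, Eq. (2)
f_rec = P.T @ m                           # reconstruction, Eq. (3)

print(np.allclose(f_rec, f_true))         # exact up to numerical precision
```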

2.2 Experimental setup

Our optical system is depicted in Fig. 2. It is fed by two continuous-wave lasers that emit at $\lambda$ = 473 nm (MBL-FN-473; CNI, 100 mW) and $\lambda$ = 532 nm (MBL-FN-532; CNI, 100 mW), which are combined using a 50:50 (R:T) cube beamsplitter (BS013; Thorlabs). The lasers illuminate a digital micromirror device (DMD; V-7001; ViALUX). The DMD comprises 1024$\times$768 micromirrors with a 13.7 $\mathrm{\mu}$m pitch that tilt by $\pm$ 12$^{\circ }$. The incident angle of the beam is fine-tuned to maximize the output power in the first diffraction order. To maximize the illumination of the active surface of the DMD, both beams are expanded fourfold using a two-lens telescope (LA1131-A, f = 50 mm; LA1708-A, f = 200 mm; Thorlabs). Finally, the beam reflected by the DMD is compressed by a factor of two using another telescope (LA1509-A, f = 100 mm; LA1131-A, f = 50 mm; Thorlabs). As in Lorente Mur et al. [28], Hadamard patterns were chosen for the illumination. As these patterns have positive and negative values, we use the differential approach described in [29]: we split each pattern into its positive and negative parts, measure them sequentially, and subtract the negative measurement from the positive one.
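
As an illustration of this differential acquisition strategy, the following sketch splits a signed Hadamard pattern into its positive and negative parts and subtracts the two corresponding measurements. The `acquire` function is a hypothetical stand-in for one camera exposure under a given illumination pattern; the actual acquisition software is not described in the paper.

```python
import numpy as np
from scipy.linalg import hadamard

def split_pattern(p):
    """Split a signed pattern into non-negative parts, p = p_pos - p_neg,
    so that each part can be displayed on the DMD."""
    return np.clip(p, 0, None), np.clip(-p, 0, None)

def differential_measurement(p, acquire):
    """Acquire the positive and negative parts sequentially and subtract.
    `acquire` is a hypothetical stand-in for one exposure under a pattern."""
    p_pos, p_neg = split_pattern(p)
    return acquire(p_pos) - acquire(p_neg)

# Example with a toy linear, noiseless acquisition model (illustrative only).
N_x = 32
f = np.random.rand(N_x)                       # unknown profile
acquire = lambda p: p @ f                     # ideal measurement of one pattern
H = hadamard(N_x).astype(float)
m = np.array([differential_measurement(H[k], acquire) for k in range(N_x)])
```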

Fig. 2. (a) Schematic representation of the structured light-sheet illumination. (b) Scheme of the compressive hyperspectral selective plane illumination microscopy (SPIM) demonstrator (note: the SPIM set-up in the red box does not have the same orientation as the other components). The beam from the lasers impinges on a digital micromirror device that allows spatial light modulation. One diffraction order of the reflected beam is collected and resized, and the beam is then focused along one direction into the specimen by the SPIM set-up. The fluorescence signal is collected by an objective and refocused onto the slit of the spectrometer using the cylindrical lens. It is also possible to perform classical imaging by refocusing the light directly onto another camera.


The modulated beam is fed to a modified OpenSPIM [5]. The main differences from the OpenSPIM set-up are the removal of the first beam expander and the use of different objectives. Here, we use an N20X-PF Olympus objective (0.5 NA) for excitation of the sample, and an RMS4X Olympus objective (0.1 NA) for collection of the fluorescence light. To remove any diffuse laser light and collect only the fluorescence light from the sample, two notch filters (ZET473NF, ZET532NF; Chroma) are placed after the collection objective. The fluorescence light can then be directed towards either a direct imaging arm or a spectroscopic arm.

The direct imaging arm acquires grayscale fluorescence images with a camera (Zyla-5.5; Andor), which can be modeled by

$$g_k(x,y) = \int \eta(\lambda)p_k(x)f(x,y,\lambda) \,{{\rm d}}\lambda,$$
where $\eta (\lambda )$ represents the spectral response of the imaging sensor. The imaging arm serves two purposes. First, the direct image obtained in the absence of light modulation, i.e., when $p_k(x)=1$, corresponds to the traditional grayscale light-sheet image. Second, for a homogeneous sample, the direct images map the illumination patterns, since $g_k(x,y) \propto p_k(x)$.

The spectroscopic arm includes an achromatic cylindrical lens (ACY254-075-A; Thorlabs) that focuses the light-sheet plane onto the slit of a Czerny-Turner imaging spectrometer (Shamrock 500i; Andor) equipped with a 300 lines/mm grating (SR5-GRT-0300-0422; Andor) and followed by another camera (Zyla-5.5; Andor), resulting in a measured bandwidth of 108 nm. The Czerny-Turner spectrometer does not induce aberration or magnification along the $y$-axis. The focal length of the cylindrical lens was chosen to avoid clipping of the beam by the first toroidal mirror of the Czerny-Turner spectrometer. We set the entrance slit of the spectrometer to its maximum aperture to maximize the signal, which results in a spectral resolution of 2 nm. This choice also ensures that the slit does not clip the field-of-view. The two arms of the set-up are calibrated to observe the same field-of-view.

3. Methods

3.1 Experimental forward models

The actual light patterns can differ significantly from the target patterns that are uploaded onto the digital micromirror device (compare Fig. 3(a), (e) to Fig. 3(b), (f), respectively). Therefore, we introduce a more realistic forward operator $\mathbf {P}^{y} \in \mathbb {R}^{N_k\times Q_x}$ that can be built from experimental measurements as

$$\mathbf{P}^{y} = [\mathbf{g}^{y}_1 \ldots \mathbf{g}^{y}_{N_k}]{{}^{\top}},$$
where $\mathbf {g}^{y}_k\in \mathbb {R}^{Q_x}$ is the row at position $y$ in the direct image obtained by illuminating a homogeneous fluorescence solution with the $k$th pattern. As the modulation profiles vary slowly across the $y$-axis, we also consider the average forward model
$$\bar{\mathbf{P}} = \frac{1}{N_y}\sum_{y=1}^{N_y} \frac{1}{\beta^{y}}\mathbf{P}^{y},$$
where $\beta ^{y} = \max {(|\mathbf {P}^{y}|)}$ is a normalization factor. In both cases, the experimental forward matrix is not orthogonal, and the reconstruction as given by Eq. (3) does not apply.
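
Assuming that the measured pattern images are available as an array `g` of shape $(N_k, N_y, Q_x)$, Eqs. (5) and (6) amount to a transposition, a per-row normalization, and an average. The sketch below is one possible implementation under that assumed array layout; it is not the authors' code.

```python
import numpy as np

def experimental_forward_models(g):
    """Build the experimental forward operators of Eqs. (5)-(6).

    g : array of shape (N_k, N_y, Q_x), direct images of the N_k patterns
        measured in a homogeneous fluorescent solution.
    Returns P_y of shape (N_y, N_k, Q_x), one forward matrix per row y,
    and the normalized average P_bar of shape (N_k, Q_x)."""
    N_k, N_y, Q_x = g.shape
    P_y = np.transpose(g, (1, 0, 2))                          # P^y = [g_1^y ... g_Nk^y]^T
    beta = np.max(np.abs(P_y), axis=(1, 2), keepdims=True)    # one factor beta^y per row
    P_bar = np.mean(P_y / beta, axis=0)                       # Eq. (6)
    return P_y, P_bar
```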

Fig. 3. Images measured by the SPIM arm for two Hadamard patterns using a blue laser (473 nm). The scale bar represents 100 ${\mu}$m and the intensity is normalized. (a, e) Target patterns uploaded onto the DMD. (b, f) Differential experimental patterns measured in a coumarin solution. (c, g) Positive part of the experimental patterns. (d, h) Negative part of the experimental patterns. (i) Target forward model $\mathbf {P}$. (j) Mean experimental model $\bar {\mathbf {P}}$ as defined in Eq. (6).


3.2 Regularized image reconstruction

We reconstruct the fluorescence profiles at different positions $y$ and spectral channel $\lambda$ independently, by solving the Tikhonov problem

$$\min_{\mathbf{f}_\lambda^{y}}{\|\mathbf{m}_\lambda^{y} -\mathbf{P}\mathbf{f}_\lambda^{y}\|^{2} + \alpha \|\mathbf{f}_\lambda^{y}\|^{2} },$$
where $\alpha$ is the regularization parameter that sets the trade-off between data fidelity and prior information, and the forward matrix $\mathbf {P}\in \mathbb {R}^{N_k\times Q_x}$ is given by either Eq. (5) or Eq. (6). This problem provides the following analytical solution
$$\mathbf{f}_\lambda^{y} = \left(\mathbf{P}{{}^{\top}}\mathbf{P} + \alpha\,\mathbf{I}\right)^{{-}1}\mathbf{P}{{}^{\top}} \mathbf{m}_\lambda^{y}.$$

The raw data acquired through the spectral arm extend over a spectral range with $N_\lambda$ = 2,560 spectral channels corresponding to the number of pixels in our sensor. For visualization or data analysis, it may be convenient to plot the results in some particular spectral bins. After acquisition, it is possible to reconstruct the binned image $\mathbf {f}^{y}_\Lambda = \sum _{\lambda \in \Lambda } \mathbf {f}_\lambda ^{y}$ directly from the binned measurements $\mathbf {m}^{y}_\Lambda = \sum _{\lambda \in \Lambda } \mathbf {m}_\lambda ^{y}$, where $\Lambda$ represents the set of spectral channels that are merged into the same spectral bin. By the linearity of Eq. (8) with respect to $\lambda$, the binned image is also given as $\mathbf {f}_\Lambda ^{y} = \left (\mathbf {P}{{}^{\top }}\mathbf {P} + \alpha \,\mathbf {I}\right )^{-1} \mathbf {P}{{}^{\top }} \mathbf {m}_\Lambda ^{y}$. The case where the spectral bin is chosen to cover the full spectral range, i.e., $\Lambda = \{\lambda _1, \ldots, \lambda _{N_\lambda }\}$, corresponds to grayscale imaging.
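
The Tikhonov solution of Eq. (8) reduces to a single regularized linear solve per position $y$, and, by linearity, the same operator can be applied to spectrally binned measurements. The following sketch is a minimal implementation under these assumptions; the regularization value and the commented indices are placeholders.

```python
import numpy as np

def tikhonov_reconstruct(P, m, alpha=1e-2):
    """Solve Eq. (8): f = (P^T P + alpha I)^(-1) P^T m.

    P : (N_k, Q_x) forward matrix (target or experimental).
    m : (N_k,) or (N_k, N_lambda) measurements; by linearity the same
        operator applies per channel or to spectrally binned data.
    alpha : regularization weight (placeholder value)."""
    Q_x = P.shape[1]
    A = P.T @ P + alpha * np.eye(Q_x)
    return np.linalg.solve(A, P.T @ m)

# Binned reconstruction: sum the raw measurements over a set of channels
# Lambda before inverting (illustrative indices, not the paper's bins).
# m_bin = m_all[:, 100:150].sum(axis=1)          # m_all: (N_k, N_lambda)
# f_bin = tikhonov_reconstruct(P_bar, m_bin)
```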

3.3 Samples

We consider three different samples. The first sample is a homogeneous coumarin solution that is used to measure the light patterns. The second sample is a bead solution that is used to evaluate the spatial resolution of our device. We consider red fluorescent microspheres (FluoSpheres) with a diameter of 300 nm and yellow-green fluorescent microspheres (FluoSpheres) with a diameter of 300 nm, both from Thermofisher. Finally, we consider a hydra labeled with two fluorophores. The shell of the hydra is labeled with Superfolder green fluorescent protein (GFP) ($\lambda ^{\rm ex} = 485$ nm, $\lambda ^{\rm em} = 510$ nm), and the inner tissue is labeled with DsRed2 ($\lambda ^{\rm ex} = 561$ nm, $\lambda ^{\rm em} = 587$ nm). Both fluorophores are short-lived biolabels that are naturally expressed by the specimens. The hydra specimens are anesthetized and embedded in agarose, so that fluorophore production continues long enough for imaging even though the fluorescence yield decreases under irradiation. The acquisition times of the samples are listed in Table 1.

Table 1. Acquisition parameters of the samples, for $N_k = 64$

4. Results and discussion

4.1 Beam shaping and modulation patterns

To evaluate the quality of the patterns, we imaged a fluorescent dye (coumarin 314; Sigma-Aldrich) in solution with the direct imaging arm. Figure 3 shows some of the acquired images. In particular, Fig. 3(a), (e) show two target Hadamard patterns uploaded onto the DMD, while Fig. 3(b), (f) show the resulting images obtained in the coumarin solution. The light distribution within the illumination sheet differs significantly from the target Hadamard patterns uploaded onto the DMD (compare Fig. 3(a) to Fig. 3(b), and Fig. 3(e) to Fig. 3(f)). This can be attributed to the long light propagation path ($\sim$80 cm) between the DMD and the light sheet.

We also note that the actual patterns vary along the $y$-axis. First, the light intensity decreases from top to bottom. As the top of the image corresponds to the entrance plane of the laser in the coumarin cuvette, this decrease can be attributed to light attenuation (i.e., Beer-Lambert exponential decay). Second, the modulation patterns diverge slightly, i.e., the modulation spreads over a wider range on the exit face than on the entrance face of the cuvette. While light attenuation can easily be compensated for by fitting or simply normalizing the patterns, the divergence and tilt of the patterns cannot, which motivates the use of the more accurate model given by Eq. (5) rather than approximate models such as Eq. (6).
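
One simple way to implement such an attenuation compensation is to fit an exponential decay to the mean row intensity of each measured pattern image and divide it out. The sketch below is only an assumed normalization strategy; the paper does not specify the exact fitting procedure used.

```python
import numpy as np

def compensate_attenuation(g_k):
    """Compensate a Beer-Lambert-like decay in one measured pattern image.

    g_k : (N_y, Q_x) direct image of pattern k in the homogeneous solution,
          with y increasing away from the cuvette entrance face.
    Fits log(mean row intensity) linearly in y and divides each row by the
    fitted exponential decay (one plausible normalization, assumed here)."""
    N_y = g_k.shape[0]
    y = np.arange(N_y)
    row_mean = g_k.mean(axis=1)
    slope, intercept = np.polyfit(y, np.log(np.maximum(row_mean, 1e-12)), 1)
    decay = np.exp(intercept + slope * y)      # fitted attenuation profile
    return g_k / decay[:, None]
```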

4.2 Influence of the forward model

We evaluate here the image reconstruction that results from the use of three forward models. In Fig. 4, we show the images obtained using the target Hadamard matrix and the experimental forward models given by Eqs. (5) and (6). As we focus on the spatial resolution of the reconstructions, we integrate the measurements across the spectral axis, i.e., $\mathbf {m} = \sum _\lambda \mathbf {m}_\lambda$, which leads to grayscale images $\mathbf {f} = \sum _\lambda \mathbf {f}_\lambda$ that can be compared to the image taken from the direct imaging arm that we consider as the ground-truth.

Fig. 4. Reconstructions using different forward models. The scale bar represents 100 ${\mu}$m and the intensity is normalized. (a) Ground-truth image from the direct imaging arm. Reconstruction using the target Hadamard patterns (b), the mean experimental patterns of Eq. (6) (c), and the experimental patterns of Eq. (5) (d). The reconstructions are obtained from Eq. (8) with $\alpha = 10^{-2}$. For all of the images, we set all negative values to 0, which removes most of the background noise and some of the reconstruction artifacts. For visualization, we also sum the raw measurements across the spectral axis (i.e., $\Lambda = \{\lambda _1, \ldots, \lambda _{N_\lambda }\}$; see Section 3.2).


The image reconstructed using the target Hadamard patterns, displayed in Fig. 4(b), suffers from strong vertical artifacts and severe blurring, which makes it difficult to identify the shape of the object. Comparing Fig. 4(b) to Fig. 4(c), we observe that the mean experimental model of Eq. (6) significantly improves the reconstruction quality; the shape of the specimen is well recovered, and the vertical artifacts are removed. While the previous models assumed that the light patterns remain unchanged while propagating along the $y$-axis, the last experimental model takes into account the variations discussed in Section 4.1. The reconstruction in Fig. 4(d), which relies on the experimental patterns of Eq. (5), provides the best visual result, with sharper details than in Fig. 4(c). Increasing the regularization parameter reduces noise but also smooths out fine structures. We selected $\alpha = 10^{-2}$ by visual inspection, as a good compromise between noise and resolution.

4.3 Spatial resolution

To characterize the spatial resolution of our device, we imaged the bead solutions described in Section 3.3. The image obtained using the direct imaging arm of our device is shown in Fig. 5(a), and the one obtained using the hyperspectral arm in Fig. 5(b). As in Section 4.2, we integrated the measurements over the full spectral range, i.e., $\mathbf {m} = \sum _\lambda \mathbf {m}_\lambda$. Several beads are visible in both the direct SPIM image and the hyperspectral image. We selected two spots among those with the lowest spatial extent, and indicate their correspondence in both images with arrows. In Fig. 5(c), (d), we pick the center of each spot and plot the profiles to estimate the $x$-axis and $y$-axis resolutions. We evaluate the spatial resolution of the system by measuring the full width at half maximum of the profiles. In the direct SPIM image, the point spread functions are well represented by a 2D isotropic Gaussian function. We measure $\sigma _x^{\rm A} = 4$ ${\mu}$m, $\sigma _y^{\rm A} = 4$ ${\mu}$m, $\sigma _x^{\rm B} = 4$ ${\mu}$m, and $\sigma _y^{\rm B} = 4$ ${\mu}$m. The point spread function of the hyperspectral device is more elongated along the $x$-axis than along the $y$-axis. Along the $y$-axis, we observe a Gaussian-like point spread function, while along the $x$-axis the point spread function presents side lobes that take positive and negative values (see Fig. 5(c), (d)). We measure the following spatial resolutions: $\sigma _x^{\rm A} = 52$ ${\mu}$m, $\sigma _y^{\rm A} = 4$ ${\mu}$m, $\sigma _x^{\rm B} = 63$ ${\mu}$m, and $\sigma _y^{\rm B} = 4$ ${\mu}$m. While the spatial resolutions of the hyperspectral and direct arms are similar along the $y$-axis, the resolution of the hyperspectral arm is reduced more than 10-fold along the $x$-axis. This limited $x$-axis resolution is due to the number of patterns: only $N_k = 128$ patterns are acquired on the hyperspectral arm, compared to the $Q_x = 2560$ pixels on the direct arm. Higher spatial resolution can be achieved by increasing the number of patterns. Note that the hyperspectral arm acquires the full fluorescence spectrum, while the direct arm only acquires a grayscale image. In Table 2, we evaluate the spatial resolution of our method for different numbers of patterns $N_k$. In theory, the resolution is proportional to the number of patterns, and we indeed observe that the resolution improves when the number of patterns increases from 8 to 64; however, we obtain the same resolution with 64 and 128 patterns.
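
For reference, full width at half maximum values such as those in Table 2 can be estimated by fitting a Gaussian to each bead profile, as in the sketch below. The authors' exact fitting procedure is not given, so this is only one plausible implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

def fwhm_gaussian(x_um, profile):
    """Estimate the full width at half maximum of a bead intensity profile
    by fitting a 1D Gaussian plus offset (illustrative analysis only).

    x_um : positions along the profile in micrometers.
    profile : measured intensities at those positions."""
    gauss = lambda x, A, x0, sigma, c: A * np.exp(-0.5 * ((x - x0) / sigma) ** 2) + c
    p0 = [profile.max() - profile.min(), x_um[np.argmax(profile)], 5.0, profile.min()]
    (A, x0, sigma, c), _ = curve_fit(gauss, x_um, profile, p0=p0)
    return 2.0 * np.sqrt(2.0 * np.log(2.0)) * abs(sigma)   # FWHM = 2.355 * sigma
```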

Fig. 5. Imaging of fluorescent beads to evaluate the spatial resolution. The scale bar represents 100 ${\mu}$m. (a) Image acquired using the direct imaging arm (traditional SPIM). (b) Image reconstructed using the hyperspectral arm (hyperspectral SPIM) and summed across the spectral axis. The correspondence of two spots is indicated by arrows; the red and green boxes show spots A and B magnified three times. (c) Bead intensity profiles along the $x$-axis and $y$-axis obtained with the direct imaging arm. (d) Bead intensity profiles along the $x$-axis and $y$-axis obtained with the hyperspectral arm.


Table 2. Full width at half maximum of bead "A" along the $x$- and $y$-axis for different numbers of patterns

4.4 Living organism experiment

Finally, we image the tail of a transgenic hydra (see Fig. 6(a)). The skin of the hydra expresses Superfolder GFP ($\lambda ^{\rm ex} = 485$ nm, $\lambda ^{\rm em} = 510$ nm), and the inner tissues, DsRed2 ($\lambda ^{\rm ex} = 561$ nm, $\lambda ^{\rm em} = 587$ nm). With the direct imaging arm, we measure the traditional SPIM image shown in Fig. 6(b). To obtain the bicolor image of Fig. 6(d), we superimpose the images obtained in the ranges $\Lambda _{\rm green} = [493,527]$ nm and $\Lambda _{\rm red} = [576,601]$ nm, which are centered on the emission wavelengths of the two fluorophores, as shown in Fig. 6(c). As described in Section 3.3, the skin of the hydra is labeled with GFP and the inner tissues are labeled with DsRed2, which is also revealed by the bicolor image that we produce from the hyperspectral measurement. Despite the limited resolution along the $x$-axis, we are able to differentiate the inner tissue from the skin through the information contained in the spectral dimension. To the best of our knowledge, this is the first hyperspectral SPIM image of a living organism obtained with a computational method. Note that the fluorescence signal from DsRed2 is dimmer than that from GFP. This is why the inner tissues are less visible in the traditional SPIM image than in the hyperspectral SPIM image, where the DsRed2 and GFP channels are normalized separately.
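
The bicolor overlay of Fig. 6(d) can, in principle, be reproduced by summing the reconstructed hypercube over the two spectral bins and normalizing each channel separately, as in the sketch below. The spectral ranges are taken from the text; the $(N_x, N_y, N_\lambda)$ array layout and the calibrated wavelength axis are illustrative assumptions.

```python
import numpy as np

def bicolor_overlay(f_cube, wavelengths, green_nm=(493, 527), red_nm=(576, 601)):
    """Compose a two-color image from a reconstructed hyperspectral section.

    f_cube : (N_x, N_y, N_lambda) reconstructed section f(x, y, lambda).
    wavelengths : (N_lambda,) spectral calibration in nm.
    Default ranges correspond to the GFP and DsRed2 bins used in Fig. 6;
    each channel is normalized separately, as stated in the text."""
    def band(lo, hi):
        sel = (wavelengths >= lo) & (wavelengths <= hi)
        img = f_cube[:, :, sel].sum(axis=2)
        return img / max(img.max(), 1e-12)          # per-channel normalization
    rgb = np.zeros(f_cube.shape[:2] + (3,))
    rgb[..., 0] = band(*red_nm)                     # red channel: DsRed2 bin
    rgb[..., 1] = band(*green_nm)                   # green channel: GFP bin
    return rgb
```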

Fig. 6. Comparison of conventional and hyperspectral imaging of a two-color hydra specimen labeled with Superfolder GFP ($\lambda ^{\rm ex} = 485$ nm, $\lambda ^{\rm em} = 510$ nm) and DsRed2 ($\lambda ^{\rm ex} = 561$ nm, $\lambda ^{\rm em} = 587$ nm). (a) Image of the sample; the cyan box indicates the tail region that was studied. Scale bar, 1 mm. (b) Traditional SPIM image from the direct imaging arm. Scale bar, 100 ${\mu}$m. (c) Spectra of one element of the shell of the hydra (zone 1) and of one element of the inside of the hydra (zone 2). The raw measurements were binned to obtain 128 channels. (d) Hyperspectral SPIM (green channel range: $\Lambda _{\rm green} = [493,527]$ nm; red channel range: $\Lambda _{\rm red} = [576,601]$ nm). Scale bar, 100 ${\mu}$m.


4.5 Limitations of the study

The spectral bandwidth of a single acquisition is limited by the grating of the spectrometer and the size of the detector (i.e., around 108 nm in this set-up with a 300 lines/mm grating). Thus, to unmix more fluorophores, the grating of the spectrometer may need to be changed to increase the spectral bandwidth. Sequential acquisition of several spectral bands at different grating positions is possible, but it would increase the acquisition time, which is undesirable. One limitation of the current work is the long acquisition time, as shown in Table 1, which can mainly be attributed to the low power of the excitation beam. Another limitation is the resolution along one of the two spatial axes. While higher spatial resolution requires the number of patterns to be increased, this also leads to longer acquisition times. Moreover, as shown in Fig. 3(b), the experimental patterns deviate from the target Hadamard patterns, so the spatial resolution does not increase linearly with the number of patterns, as would be expected in theory. Therefore, the generation of the patterns (i.e., the condition number of the forward operator) must be improved to increase the spatial resolution. Our future work will focus on this issue. In addition, we considered a straightforward reconstruction approach based on Tikhonov regularization. While our first results are promising, the problem might benefit from recent advances in image reconstruction based on deep learning. In particular, this could include prior knowledge about the solution, such as spatial redundancy along the $y$-axis. Finally, acquisition from living organisms can be subject to motion artifacts. Indeed, one of the downsides of the fixation method used is that the hydra might move slightly during the acquisition, which can blur the reconstructed images. This can be improved by reducing the acquisition time and/or accounting for motion during the reconstruction.

5. Conclusion

We have described a computational hyperspectral selective plane illumination microscope. Our system relies on a traditional selective plane illumination microscope coupled with a digital micromirror device to obtain structured light within the illumination sheet. The fluorescence signal from the light sheet is fed to the entrance slit of a spectrometer through a cylindrical lens. Only one of the two spatial dimensions of the hypercube under acquisition is encoded, so the raw measurements benefit from the full spectral resolution of the spectrometer. Provided that experimental measurements of the light patterns are available, we demonstrate that a simple reconstruction algorithm can recover the encoded spatial dimension. We apply the methodology to a hydra labeled with two fluorophores. From the full spectrum obtained in each pixel of the slice, we can easily distinguish the two structures labeled with the two fluorophores. The main limitation is the reduced spatial resolution along one axis. In future work, we will address this issue by improving the pattern generation and by considering deep-learning-based reconstruction algorithms.

Funding

Agence Nationale de la Recherche (ANR-17-CE19-0003); LABEX PRIMES (ANR-11-LABX-0063).

Acknowledgments

The authors thank Dr. O. Cochet-Escartin (ILM, UCB Lyon) for providing the hydra specimens. We thank an anonymous reviewer for critically reading the manuscript and suggesting substantial improvements.

Disclosures

The authors declare that there are no conflicts of interest related to this article.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. S. G. Megason and S. E. Fraser, “Digitizing life at the level of the cell: high-performance laser-scanning microscopy and image analysis for in toto imaging of development,” Mech. Dev. 120(11), 1407–1420 (2003). [CrossRef]  

2. V. Ntziachristos, “Going deeper than microscopy: the optical imaging frontier in biology,” Nat. Methods 7(8), 603–614 (2010). [CrossRef]  

3. J. Mertz, “Optical sectioning microscopy with planar or structured illumination,” Nat. Methods 8(10), 811–819 (2011). [CrossRef]  

4. O. E. Olarte, J. Andilla, E. J. Gualda, and P. Loza-Alvarez, “Light-sheet microscopy: a tutorial,” Adv. Opt. Photonics 10(1), 111–179 (2018). [CrossRef]  

5. P. Pitrone, J. Schindelin, L. Stuyvenberg, S. Preibisch, M. Weber, K. Eliceiri, J. Huisken, and P. Tomancak, “OpenSPIM: an open-access light-sheet microscopy platform,” Nat. Methods 10(7), 598–599 (2013). [CrossRef]  

6. W. Müller, M. Kielhorn, M. Schmitt, J. Popp, and R. Heintzmann, “Light sheet Raman micro-spectroscopy,” Optica 3(4), 452–457 (2016). [CrossRef]  

7. L. Gao, “Extend the field of view of selective plan illumination microscopy by tiling the excitation light sheet,” Opt. Express 23(5), 6102–6111 (2015). [CrossRef]  

8. Q. Fu, B. L. Martin, D. Q. Matus, and L. Gao, “Imaging multicellular specimens with real-time optimized tiling light-sheet selective plane illumination microscopy,” Nat. Commun. 7(1), 11088 (2016). [CrossRef]  

9. D. Xu, W. Zhou, and L. Peng, “Three-dimensional live multi-label light-sheet imaging with synchronous excitation-multiplexed structured illumination,” Opt. Express 25(25), 31159–31173 (2017). [CrossRef]  

10. J. Huisken, J. Swoger, F. Del Bene, J. Wittbrodt, and E. H. K. Stelzer, “Optical sectioning deep inside live embryos by selective plane illumination microscopy,” Science 305(5686), 1007–1009 (2004). [CrossRef]  

11. P. A. Santi, “Light sheet fluorescence microscopy: A review,” J. Histochem. Cytochem. 59(2), 129–138 (2011). [CrossRef]  

12. M. Weber and J. Huisken, “Light sheet microscopy for real-time developmental biology,” Curr. Opin. Genet. Dev. 21(5), 566–572 (2011). [CrossRef]  

13. P. G. Pitrone, J. Schindelin, K. W. Eliceiri, and P. Tomancak, “OpenSPIM: A do-it-yourself open access light sheet fluorescence microscope,” Microscopy and Analysis 26, 7–11 (2015).

14. R. A. Hoebe, C. H. Van Oven, T. W. J. Gadella, P. B. Dhonukshe, C. J. F. Van Noorden, and E. M. M. Manders, “Controlled light-exposure microscopy reduces photobleaching and phototoxicity in fluorescence live-cell imaging,” Nat. Biotechnol. 25(2), 249–253 (2007). [CrossRef]  

15. N. Chakrova, A. S. Canton, C. Danelon, S. Stallinga, and B. Rieger, “Adaptive illumination reduces photobleaching in structured illumination microscopy,” Biomed. Opt. Express 7(10), 4263–4274 (2016). [CrossRef]  

16. G. Calisesi, M. Castriotta, A. Candeo, A. Pistocchi, C. D’Andrea, G. Valentini, A. Farina, and A. Bassi, “Spatially modulated illumination allows for light sheet fluorescence microscopy with an incoherent source and compressive sensing,” Biomed. Opt. Express 10(11), 5776–5788 (2019). [CrossRef]  

17. R. M. Power and J. Huisken, “Adaptable, illumination patterning light sheet microscopy,” Sci. Rep. 8(1), 9615 (2018). [CrossRef]  

18. R. Li, X. Zhou, D. Wu, T. Peng, Y. Yang, M. Lei, X. Yu, B. Yao, and T. Ye, “Selective plane illumination microscopy with structured illumination based on spatial light modulators,” in SPIE BiOS, (International Society for Optics and Photonics, 2014), pp. 89491S–89491S–5.

19. M. Aakhte, E. A. Akhlaghi, and H.-A. J. Müller, “SSPIM: a beam shaping toolbox for structured selective plane illumination microscopy,” Sci. Rep. 8(1), 10067 (2018). [CrossRef]  

20. P. Mahou, J. Vermot, E. Beaurepaire, and W. Supatto, “Multicolor two-photon light-sheet microscopy,” Nat. Methods 11(6), 600–601 (2014). [CrossRef]  

21. M. Dickinson, G. Bearman, S. Tille, R. Lansford, and S. Fraser, “Multi-spectral imaging and linear unmixing add a whole new dimension to laser scanning fluorescence microscopy,” BioTechniques 31(6), 1272–1278 (2001). [CrossRef]  

22. W. Jahr, B. Schmid, C. Schmied, F. Fahrbach, and J. Huisken, “Hyperspectral light sheet microscopy,” Nat. Commun. 6(1), 7990 (2015). [CrossRef]  

23. Z. Lavagnino, J. Dwight, A. Ustione, T.-U. Nguyen, T. Tkaczyk, and D. Piston, “Snapshot hyperspectral light-sheet imaging of signal transduction in live pancreatic islets,” Biophys. J. 111(2), 409–417 (2016). [CrossRef]  

24. J. G. Dwight and T. S. Tkaczyk, “Lenslet array tunable snapshot imaging spectrometer (LATIS) for hyperspectral fluorescence microscopy,” Biomed. Opt. Express 8(3), 1950–1964 (2017). [CrossRef]  

25. Z. Lavagnino, G. Sancataldo, M. d’Amora, P. Follert, D. De Pietri Tonelli, A. Diaspro, and F. Cella Zanacchi, “4d (x-y-z-t) imaging of thick biological samples by means of two-photon inverted selective plane illumination microscopy (2pe-iSPIM),” Sci. Rep. 6(1), 23923 (2016). [CrossRef]  

26. I. Rocha-Mendoza, J. Licea-Rodriguez, M. Marro, O. E. Olarte, M. Plata-Sanchez, and P. Loza-Alvarez, “Rapid spontaneous Raman light sheet microscopy using CW lasers and tunable filters,” Biomed. Opt. Express 6(9), 3449–3461 (2015). [CrossRef]  

27. E. D. Nelson and M. L. Fredman, “Hadamard Spectroscopy,” J. Opt. Soc. Am. 60(12), 1664–1669 (1970). [CrossRef]  

28. A. Lorente Mur, M. Ochoa, J. E. Cohen, X. Intes, and N. Ducros, “Handling negative patterns for fast single-pixel lifetime imaging,” in Molecular-Guided Surgery: Molecules, Devices, and Applications V, vol. 10862 (International Society for Optics and Photonics, 2019), p. 108620A.

29. A. Lorente Mur, F. Peyrin, and N. Ducros, “Recurrent Neural Networks for Compressive Video Reconstruction,” in 2020 IEEE 17th International Symposium on Biomedical Imaging (ISBI), (2020), pp. 1651–1654.
