High-resolution 3D optical microscopy inside the beating zebrafish heart using prospective optical gating

Abstract

3D fluorescence imaging is a fundamental tool in the study of functional and developmental biology, but effective imaging is particularly difficult in moving structures such as the beating heart. We have developed a non-invasive real-time optical gating system that is able to exploit the periodic nature of the motion to acquire high resolution 3D images of the normally-beating zebrafish heart without any unnecessary exposure of the sample to harmful excitation light. In order for the image stack to be artefact-free, it is essential to use a synchronization source that is invariant as the sample is scanned in 3D. We therefore describe a scheme whereby fluorescence image slices are scanned through the sample while a brightfield camera sharing the same objective lens is maintained at a fixed focus, with correction of sample drift also included. This enables us to maintain, throughout an extended 3D volume, the same standard of synchronization we have previously demonstrated in and near a single 2D plane. Thus we are able to image the complete beating zebrafish heart exactly as if the heart had been artificially stopped, but sidestepping this undesirable interference with the heart and instead allowing the heart to beat as normal.

© 2012 Optical Society of America

1. Introduction

Fluorescence imaging is a widely adopted technique for imaging the development of the heart in animal models such as zebrafish (Danio rerio), and more recently this technique has been used in high-framerate imaging of cardiac dynamics in the living, beating heart [1, 2]. This methodology has enabled new insights into the growth, structure, development and dynamics of the heart and other organs [2–5]. 3D fluorescence imaging is particularly challenging in moving structures such as the beating heart: it is, at best, only possible to acquire a single 2D image section at a time, so multiple images are required to build up the full 3D image, and some methods such as confocal imaging require point or line scans to build up the 3D image. While some imaging modalities such as optical projection tomography (OPT) or magnetic resonance imaging (MRI) may use projections of the 3D dataset that are more complex than simple xy planes, they still require the acquisition of multiple planes in order to reconstruct the desired volumetric dataset.

In order to acquire a consistent 3D image, all these 2D images must be acquired at the same point in the periodic cycle of the heartbeat. This is an issue that has attracted attention in a wide range of imaging modalities, including magnetic resonance imaging [6–8], confocal microscopy [9–12], optical coherence tomography (OCT) [13] and selective plane illumination microscopy (SPIM) [14]. Two categories of technique can be used to address this problem: retrospective gating and prospective gating.

In retrospective gating a large number of images are acquired at high speed for each 2D plane, over the course of multiple heart cycles. Computationally demanding post-acquisition analysis is then required to annotate each 2D image with a phase (timepoint in the heartbeat). It is then necessary to identify the relative temporal shifts between adjacent 2D planes [9], although in some imaging modalities it may be possible to identify a feature that is invariant between all the 2D planes (but which varies periodically in time), as has been described for OCT imaging in [13]. Throughout this image acquisition the tissue is exposed to the illumination light even though most of the images will ultimately be rejected, and thus photobleaching and phototoxic events take place without any relevant data being collected.

In prospective gating a reference signal, which could for example be an electrocardiogram (ECG), is used to identify exactly when the heart will be in a desired position in its cycle [12]. Real-time analysis of this signal allows image acquisition to be triggered when the heart is in that desired position, with no unnecessary images acquired and no excess illumination of the sample. In our case this signal is obtained from real-time computer analysis of brightfield images of the heart, a concept that we first introduced in [14]. We will present a refocusing scheme which keeps this reference signal invariant as the sample is scanned in z, ensuring consistent and artefact-free synchronization in three dimensions.

Here we will describe the application of our non-invasive optical gating technique to 3D imaging of the beating zebrafish heart, resulting in a novel imaging modality that is eminently suitable for high-throughput imaging and developmental analyses. We demonstrate that we can obtain high resolution 3D images of the inside of a naturally-beating zebrafish heart, images obtained with a combined spatial and temporal resolution that is unique to our technique. We discuss quantitatively the effects of synchronization accuracy on effective spatial resolution of the resultant datasets.

2. Method

For fluorescence imaging we use a SPIM microscope based on the designs in [15,16]. SPIM is an attractive technique for in vivo imaging due to its widefield image acquisition, inherent depth sectioning capability and low sample exposure to laser excitation light [4]. We have previously reported our real-time heart synchronization algorithm in the context of synchronized 2D image acquisition in [14]. This uses real-time analysis of brightfield images to achieve prospective gating of fluorescence images.

Briefly, the synchronization algorithm operates as follows. As introduced in [14], we define a metric value ∂fg representing the result of a comparison between the most recently received frame f and another frame g. We use the simple sum of absolute differences (SAD) metric, which represents a good combination of discriminatory power and speed of execution. At the start of the experiment the period of the heartbeat is established, and one period’s worth of frames are retained as reference frames. Every subsequently-acquired frame f is compared against the set of reference frames, yielding a sequence of metric values ∂fg. The minimum value in this sequence is interpolated to sub-frame accuracy in order to record a phase for frame f. The phase history is then extrapolated forwards in time in order to predict when the heart will next be in the desired position. An electrical trigger signal will be sent at this predicted time. Figure 1 shows a schematic diagram of the complete system.
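To make the pipeline concrete, the following is a minimal sketch of the phase recovery and forward prediction steps (our illustration, not the authors' published code). It assumes brightfield frames arrive as 8-bit NumPy arrays and that the reference frames span exactly one heartbeat; parabolic interpolation of the SAD minimum is one common choice for sub-frame accuracy, which the text does not specify.

```python
import numpy as np

def sad(frame, ref):
    """Sum-of-absolute-differences metric between two equally sized frames."""
    return int(np.abs(frame.astype(np.int32) - ref.astype(np.int32)).sum())

def recover_phase(frame, reference_frames):
    """Assign a sub-frame phase in [0, 1) heartbeats by finding the
    best-matching reference frame and interpolating around the minimum."""
    scores = np.array([sad(frame, g) for g in reference_frames], dtype=float)
    i = int(np.argmin(scores))
    # Three-point parabolic interpolation; indices wrap because the
    # reference frames cover exactly one heartbeat period.
    lo = scores[(i - 1) % len(scores)]
    hi = scores[(i + 1) % len(scores)]
    denom = lo - 2 * scores[i] + hi
    offset = 0.5 * (lo - hi) / denom if denom != 0 else 0.0
    return ((i + offset) % len(scores)) / len(scores)

def predict_trigger_time(times, phases, target_phase):
    """Extrapolate the recent phase history linearly to the next time at
    which the heart will reach target_phase (all phases in heartbeats)."""
    unwrapped = np.unwrap(2 * np.pi * np.asarray(phases)) / (2 * np.pi)
    rate, intercept = np.polyfit(times, unwrapped, 1)  # heartbeats per second
    # Smallest integer k such that target_phase + k lies in the future:
    k = np.ceil(rate * times[-1] + intercept - target_phase)
    return (target_phase + k - intercept) / rate
```

In the real system this prediction is refreshed with every new brightfield frame, and the chosen time is programmed into a hardware timing controller rather than being acted on in software.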


Fig. 1 Schematic diagram of synchronization system. Brightfield images are compared against a set of previously-acquired reference frames, and a phase assigned to each received image (“phase recovery”). A fit is then performed on this data in order to predict the time at which the heart will be in the particular desired position (“forward prediction”). This predicted time is transmitted to the timing controller which generates electrical trigger signals for the fluorescence camera and the excitation laser.


This technique forms the foundation of the approach and results described in the present paper; in the following section we will describe the specific practical refinements required for robust 3D image acquisition, which we have not previously reported.

The optics used in our system are shown in Fig. 2. The optical design of the laser illumination is based on that described in [16]. The laser is incident vertically onto a resonant mirror (RM) which is conjugated to the image plane of the illumination objective (Nikon CFI Plan Fluor 10×W). By activating this resonant mirror it is possible to rapidly modulate the direction of propagation of the light in the illumination sheet, greatly reducing shadowing effects that would otherwise be visible [16]. The other mirror (M) is conjugated to the pupil plane of the illumination objective and can be used to adjust the z coordinate of the light sheet in order to bring it into focus for imaging. For detailed discussion of the design of SPIM illumination optics, see [15, 16].


Fig. 2 Optical configuration of the microscope. A free-running camera (CCD1, Prosilica GS650) acquires brightfield images continuously for real-time analysis, and a second fluorescence camera (CCD2, QImaging QIClick) is triggered to acquire gated frames only at the appropriate calculated times. As the sample is moved to change the focal depth for sectioned fluorescence imaging, the f = 150 mm tube lens is moved proportionally in order to keep CCD1 focused at a constant depth and provide a fixed reference signal.


The imaging optics are based around a Nikon CFI75 16×W water dipping lens. A dichroic mirror is used to separate red transmitted light (incident onto CCD1) from green fluorescence emission light (incident onto CCD2). A motorized tube lens is used in conjunction with CCD1, for reasons that will be discussed later.

2.1. Drift correction

In a zebrafish embryo mounted for imaging there are two main timescales of motion. The first is the heartbeat, with a period of approximately 300 ms. It is this periodic motion that our synchronization system is designed to correct for. There will also be undesirable motion on much slower timescales, which may occur due to slight shifting of the sample within the chamber, as an artefact of the z scanning process (as discussed in detail in the next section), or ultimately even due to growth of the embryo.

In order to maintain good synchronization over a timescale of several minutes, it is important to correct for this translational motion of the sample in the xy plane. We find that the SAD metric described earlier provides good discrimination in the face of a few pixels of drift, but larger translational shifts will reduce the global similarity and hence degrade the performance of the algorithm.

In order to perform this correction our algorithm maintains an integer coordinate offset (dx,dy) to be applied to frame f prior to comparison with one of the reference frames, in order to ensure that the two images are correctly co-registered. When synchronization begins this offset is initialized to (0,0), in other words no drift correction is initially applied. This coordinate offset is used when calculating the metric ∂fg via pixel-by-pixel comparisons between frame f and each reference frame g.

Other than this coordinate offset, the phase recovery and forward prediction steps on receipt of a frame f are performed in exactly the same way as we previously described in [14]. Immediately after this processing the coordinate offset is updated as follows. Frame f is compared against the reference frame it was found to be most similar to, using five different candidate coordinate shifts: (dx,dy) and the four nearest-neighbor shifts (dx + 1,dy), (dx − 1,dy), (dx,dy + 1) and (dx,dy − 1). The shift that results in the smallest value of ∂fg is used as the shift (dx′,dy′) that will be applied to the next received frame.
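A minimal sketch of this update step follows (our illustration, assuming frames are 2D NumPy arrays; note that we normalize by the overlap area, a detail the text does not specify, so that candidate shifts with slightly different overlaps remain comparable).

```python
import numpy as np

def shifted_sad(frame, ref, dx, dy):
    """SAD between ref and frame translated by (dx, dy), evaluated over
    the overlapping region only."""
    h, w = frame.shape
    fx = slice(max(dx, 0), w + min(dx, 0))
    fy = slice(max(dy, 0), h + min(dy, 0))
    rx = slice(max(-dx, 0), w + min(-dx, 0))
    ry = slice(max(-dy, 0), h + min(-dy, 0))
    a = frame[fy, fx].astype(np.int32)
    b = ref[ry, rx].astype(np.int32)
    return np.abs(a - b).mean()  # mean, so overlap size does not bias the score

def update_offset(frame, best_ref, dx, dy):
    """Return the candidate offset with the lowest score: the current offset
    or one of its four nearest neighbors. The offset can therefore track
    drift of up to one pixel per frame."""
    candidates = [(dx, dy), (dx + 1, dy), (dx - 1, dy),
                  (dx, dy + 1), (dx, dy - 1)]
    return min(candidates, key=lambda c: shifted_sad(frame, best_ref, *c))
```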

In this way the coordinate offset is continuously updated. Figure 3(b) shows an example plot of (dx,dy) over time for a video of an embryonic zebrafish heart. The embryo was deliberately immobilised rather poorly, and in addition we introduced a manual reset of the software’s stored coordinate offset to (dx,dy) = (0,0) at about 5.5 seconds. This serves to demonstrate the stability of the drift correction algorithm: the fact that the offset recovers rapidly to the value prior to the reset illustrates the robustness of this simple nearest-neighbor search in the face of abrupt changes in sample position. The algorithm will accurately track motion of up to one pixel per frame, and this test confirms that it will successfully recover should more rapid motion unexpectedly occur (with the correction converging at a rate of one pixel per frame).


Fig. 3 (a) Ray diagram used in magnification calculation. Two ray paths are shown, for an object at the origin (black rays) and at a shifted location (green rays). (b) Corrective translation (in pixels) determined by our algorithm, showing recovery following manual reset to illustrate robustness of algorithm. Note that the sample was deliberately mounted poorly to induce a faster than normal drift, and the offset was also manually reset at around 5.5 seconds in order to demonstrate the recovery of the software from a large and instantaneous perturbation. (c) Magnification M as a function of tube lens offset (Δz) for our experimental configuration (where the tube lens focal length f2 = 125 mm and the objective focal length f1 = 12.5 mm) showing, for three candidate values of D0, the change in magnification over a substantial range of 200 μm of focal depth.


Although this algorithm does not address sample motion in z, or rotation and other more complex deformations, the main source of motion in our mounting system is gravity, which acts along the y axis; such minor motions of a few pixels are therefore corrected. For simplicity we have introduced this concept of drift correction in the context of sample motion, but in the next section we will see that it has a second, more important purpose in the context of image translation introduced during focus correction.

2.2. Focus correction

A significant challenge during acquisition of a 3D dataset stems from the fact that a dataset is acquired by scanning the sample through the (non-moving) light sheet. This ensures that the imaging objective (and hence the fluorescence camera) remains focused on the plane of the light sheet as the sample is scanned. However this means that the focal plane of the brightfield camera within the sample will also change unless specific remedial action is taken. Although the synchronization algorithm can tolerate changes in focus of the order of 50 μm in our particular optical configuration, the brightfield focus should ideally be maintained at a constant depth in the sample – especially when acquiring 3D datasets over several hundred μm – in order to maintain good synchronization. We achieve this using a motorized tube lens as shown in Figs. 2 and 3(a). This maintains a constant focus, but introduces a change in magnification and a translational shift in the brightfield images.

The ratio between the distance Δz the sample is moved and the compensating distance Δz′ that the tube lens must be moved can be calculated using ray optics and the thin-lens imaging equation, as shown in Fig. 3(a), expanding for small Δz:

$$\frac{1}{f_1} = \frac{1}{f_1 - \Delta z/n} - \frac{1}{v_z} \quad\Longrightarrow\quad v_z = \frac{n f_1^2}{\Delta z} + O(\Delta z^2) \tag{1}$$
$$\frac{1}{f_2} = \frac{1}{f_2 + \Delta z'} + \frac{1}{v_z + D_0 - \Delta z'} \quad\Longrightarrow\quad R = \frac{\Delta z'}{\Delta z} = \frac{f_2^2}{n f_1^2} + O(\Delta z). \tag{2}$$
where n is the refractive index of water and O is Landau’s symbol; see Fig. 3 for definitions of other quantities. We can calculate the change in magnification M as follows, using equivalent notation for a lateral object displacement Δy:
$$v_y = \frac{\Delta y \, v_z}{f_1} \quad\Longrightarrow\quad v_y = \frac{n f_1 \, \Delta y}{\Delta z} + O(\Delta z^2) \tag{3}$$
$$\frac{\Delta y'}{v_y} = \frac{f_2 + \Delta z'}{v_z + D_0 - \Delta z'} \quad\Longrightarrow\quad M = \frac{\Delta y'}{\Delta y} = \frac{f_2}{f_1}\left(1 + \frac{f_2 - D_0}{n f_1^2}\,\Delta z\right) + O(\Delta z^2). \tag{4}$$

Thus by ensuring the distance D0 is equal to f2 we can eliminate changes in magnification for small Δz, leaving just a residual translational shift of the images. The correction for sample drift described in the previous section will automatically correct for this minor shift: for this purpose a true drift in sample position is indistinguishable from the apparent motion due to adjustment of the tube lens.

We have calculated this condition for small Δz, but in practice we require effective correction for larger focal shifts of the order of 100 μm. Figure 3(c) plots the magnification change as a function of Δz for three different distances D0. The optimum for small Δz is D0 = f2 = 125 mm, as noted above, and the other values heuristically reduce the variation in magnification over complete scan distances of 0–100 and 0–200 μm respectively. However, it should be noted that a change in magnification factor as high as 0.06 still only results in a shift of 1 pixel for features at the edge of a 250×250 pixel image such as that used in our system. We selected a distance D0 = 130 mm for our optical design in order to optimize for a scan range of 0–100 μm.
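As a worked check of these numbers, the short calculation below evaluates Eqs. (2) and (4) for the configuration of Fig. 3(c); f1, f2, n and the candidate values D0 = 125 and 130 mm come from the text, while D0 = 140 mm is a hypothetical third candidate included purely for illustration.

```python
# Evaluate Eqs. (2) and (4) for the configuration described in Fig. 3(c).
n, f1, f2 = 1.33, 12.5, 125.0            # water index; focal lengths in mm

R = f2**2 / (n * f1**2)                  # Eq. (2): tube lens / sample speed ratio
print(f"R = {R:.1f}")

M0 = f2 / f1                             # nominal magnification (10x)
for D0 in (125.0, 130.0, 140.0):         # candidate tube lens spacings, mm
    for dz_um in (100, 200):             # scan depth, in um
        dz = dz_um * 1e-3                # convert to mm
        M = M0 * (1 + (f2 - D0) * dz / (n * f1**2))       # Eq. (4)
        edge_px = 125 * abs(M - M0) / M0  # shift at edge of a 250x250 px image
        print(f"D0 = {D0:5.1f} mm, dz = {dz_um:3d} um: "
              f"M = {M:6.3f}, edge shift = {edge_px:.2f} px")
```

For D0 = f2 the first-order magnification change vanishes, as stated above, while the non-optimal spacings trade a small residual change at one scan depth against another.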

2.3. Gated 3D image acquisition

In preparation for acquiring a dataset, a zebrafish embryo, lightly anaesthetised using Tricaine methanesulfonate, was drawn into a length of fluorinated ethylene propylene (FEP) tubing, and placed in the imaging chamber. The microscope was adjusted such that a focused image of the heart was visible on the brightfield camera, which had a region of interest (RoI) applied to crop the output image to the size of the heart and was acquiring images at 80 fps. At the same time the microscope was arranged so that the fluorescence camera was acquiring sectioned images at a plane at the very top of the heart. The synchronization software was then activated. This initially identifies the period of the heartbeat and acquires one complete heartbeat’s worth of reference brightfield images; all subsequent images are then compared against this dataset during the phase recovery process.

To acquire the 3D image stack, motor-driven translation stages moved the sample at a constant speed vz, and the tube lens at a constant speed R × vz. We note that our use of a z-invariant synchronization signal means that temporal registration of 2D slices from different z coordinates is trivial. This eliminates any risk of accumulated errors leading to an erroneous 3D reconstruction that does not represent a true snapshot at a fixed time point, an important issue highlighted in [13].
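Putting the pieces together, the acquisition loop can be sketched as follows; the hardware interface here (stage.move_at, gating.wait_for_gated_frame and so on) is a hypothetical stand-in for illustration, not a real instrument SDK.

```python
def acquire_gated_stack(stage, tube_lens, gating, n_slices, v_z, ratio_R):
    """Scan the sample through the light sheet at constant speed v_z while the
    brightfield tube lens moves at R * v_z (Eq. (2)) to hold a fixed reference
    focus; the gating system triggers one fluorescence slice per heartbeat."""
    stage.move_at(v_z)                  # constant-velocity sample scan
    tube_lens.move_at(ratio_R * v_z)    # coupled motion for fixed brightfield focus
    slices = []
    for _ in range(n_slices):
        # Block until the prospective gating software predicts the target
        # phase and the timing controller fires the fluorescence camera.
        slices.append(gating.wait_for_gated_frame())
    stage.stop()
    tube_lens.stop()
    return slices
```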

3. Results and discussion

Figure 4 shows raw data obtained using our synchronization system. The left-hand images in Fig. 4(a) show the continuous sequence of brightfield images acquired at 80 fps. These were analyzed in real-time using our algorithm, which was used to generate trigger signals for the fluorescence camera (images shown on the right). One image is acquired per heartbeat, at a fixed user-specified point in the heart cycle. The sample was translated at a constant velocity of 1.2 μm/s in order to acquire a series of z-sections at intervals of approximately 0.65 μm. A fixed focus was maintained for the brightfield images by translating the corrective tube lens for that camera at a rate of 16 μm/s (as determined by Eq. (2)). The 160 z-sections in the complete dataset were acquired at a rate of one per heartbeat, implying a total acquisition time of approximately 1.5 minutes. This acquisition time is directly determined by the desired number of z-sections, which depends on the size of the feature being imaged and the required z resolution. The total exposure of each part of the sample to excitation laser light, even allowing for oversampling in z, was only around 0.1 μJ/μm2. Figure 5 and the associated movie illustrate some details of the analysis pipeline which determines when the trigger signals are sent to the fluorescence camera.
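These quoted parameters are mutually consistent: with one slice per heartbeat, the slice spacing is simply the scan speed multiplied by the beat period, $\Delta z_{\mathrm{slice}} = v_z T$. The stated $0.65\ \mu\mathrm{m} \approx 1.2\ \mu\mathrm{m/s} \times T$ implies a period $T \approx 0.54$ s for this particular embryo, and $160 \times 0.54\ \mathrm{s} \approx 87\ \mathrm{s}$, in agreement with the quoted total acquisition time of approximately 1.5 minutes.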


Fig. 4 (a) Raw data from brightfield and fluorescence cameras (scale bar 20 μm). The fluorescence camera is only triggered once per heartbeat, taking a single optically sectioned image slice on each heartbeat, scanning down through the ventricle of the heart. (b) Selected fluorescence image slices acquired as part of the image stack (see also Media 1).



Fig. 5 Still frame from Media 2 illustrating some aspects of the operation of the synchronization algorithm. The movie shows a 100 ms time window around the trigger firing time, slowed down by a factor of 100. The data flow is highlighted in the system diagram as successive brightfield frames are received. Phase recovery and forward prediction are performed for every frame. When the forward prediction indicates that a trigger signal will soon be required (within the next 40 ms), the timing controller is programmed with the required information. This then generates the electrical signal to trigger acquisition of the fluorescence image. At the bottom of the movie two event timelines are shown. The upper timeline shows the brightfield frames being exposed (black) and processed in software (gray). The lower timeline shows the results of the forward prediction analysis (red circle showing predicted time to trigger fluorescence camera). This prediction is improved every time a new brightfield image is received; the marker changes to blue when the algorithm “commits” to a time and programs it into the timing controller, and changes to black when the electrical trigger is actually sent to the fluorescence camera.


Figure 6 shows this same volumetric dataset reconstructed in false color to visualize the complex internal structure of the ventricle. The only processing performed on the raw data was a linear rescaling of pixel intensity as a function of image y coordinate in order to correct for intensity variation along the vertical axis of the light sheet. The image stack was then rendered in false color using the VolView software package to show an intensity isosurface representing the boundary of the cardiac myocyte tissue labelled with green fluorescent protein (transgenic strain cmlc2:gfp). Part of the dataset was masked out in order to show a “cutaway view” of the heart so that the internal structure can be seen, resulting in the images shown here. The colors used are purely intended as aids to visualization, and do not have biological meaning. Finally, the same raw data is shown again in Fig. 7, sliced along xy and yz planes to illustrate the quality of the synchronization.


Fig. 6 Left: reconstructed false color image showing a cutaway of a 3D volumetric reconstruction of the ventricle at approximately 5 days after fertilization. The dataset from Fig. 4 was reconstructed in the VolView software package, using intensity-based segmentation to identify the cardiac myocyte tissue labelled with green fluorescent protein (transgenic strain cmlc2:gfp). A cutaway showing the structure of the trabeculae on the inside wall of the ventricle was then rendered in false color. The ventricle long axis is approximately 140 μm across (see also Media 3). Right: a second reconstruction of a ventricle earlier in the development process, at approximately 4 days after fertilization. Both datasets were taken with the ventricle at end-systole.



Fig. 7 Image of a single xy plane (left) and two reconstructed slices in the yz plane (center and right) taken from the dataset shown in Figs. 4 and 6 to allow the consistency of the synchronization to be judged visually. The vertical lines indicate the positions where the various perpendicular slices shown intersect each other. Our refocusing scheme maintains a fixed brightfield reference focus, eliminating registration artefacts and systematic bias that might be present in post-processed datasets (see [11, 13]); any residual artefacts in the yz reconstructions are due to slight inaccuracies in the timing of the trigger signals. We note that the spatial resolution of the images degrades gradually with increasing z and x due to scattering and aberrations caused by imaging deeper inside the sample.


The lateral resolution of our SPIM microscope is ultimately limited by the numerical aperture of the imaging objective to around 0.5 μm. We observe peak wall velocities in the embryonic zebrafish heart to be around 160 μm/s. From this we can calculate that a temporal accuracy of 2 ms is sufficient to ensure that the resultant position errors are smaller than the lateral resolution, an accuracy that we have shown our system can surpass [14]. For comparison, an approach based on post-acquisition analysis as opposed to our real-time synchronization system would require approximately 150 fluorescence images to be acquired at every z coordinate in order to be able to achieve the same temporal accuracy.
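As a worked version of this estimate, using only the figures quoted above: the positional error due to gating jitter is bounded by
$$\delta x \le v_{\mathrm{wall}}\,\Delta t = 160\ \mu\mathrm{m/s} \times 2\ \mathrm{ms} = 0.32\ \mu\mathrm{m} < 0.5\ \mu\mathrm{m},$$
and sampling a heart period of order 300 ms into 2 ms bins requires of order $300/2 = 150$ frames per z plane, which is the comparison figure given above for a retrospective approach.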

This condition is sufficient to ensure that, for our chosen imaging modality, our synchronization scheme is able to acquire a 3D dataset with positional resolution equivalent to that which would be achieved were the heart hypothetically to be “frozen in time” for scanned 3D imaging. We note that it would in fact be extremely challenging to acquire an equivalent dataset in a fixed specimen: in a dead organism the heart will tend to collapse into a shape different to its natural shape when beating, reducing the biological usefulness of the images for functional analysis. In the (larger) chick heart, polymer perfusion has been shown to be a viable method for micro-computed tomographic imaging of the fixed heart at end-systole without shrinkage; in the zebrafish it is currently only through the use of gating techniques that meaningful images like the ones we present here can be obtained.

The temporal accuracy will depend on the point in the heart cycle that the system is synchronized to. We find it to be particularly challenging to synchronize to phases at the very start of atrial or ventricular contraction. This is partly due to the slower motion of image features at these phases in the cycle, which reduces the contrast in the signal we are measuring. However it is also due to genuine variability in the heart’s rhythm, which we find is often manifest as variable PR or ST intervals. This makes it harder to predict exactly where the heart will be even as little as 10–20 ms into the future, as required by the latency in our system [14]. Lee et al. have remarked on similar effects when imaging mouse hearts [12].

The results presented in this paper show a high resolution 3D image at a single user-selected point in the heart cycle. We are currently working on extending our approach to acquire a movie of the complete heart cycle (a “4D” dataset [9]). In principle this simply requires us to define n equally-spaced phases in the heart cycle at which we require trigger signals to be sent, in order to acquire an n-frame movie, instead of the one single phase we currently synchronize to. In practice prediction difficulties at certain phases, as described in the previous paragraph, present challenges that will need to be overcome with further refinements to the algorithm. Clearly the use of n separate imaging phases increases the laser exposure by a factor of n, whereas post-processing approaches can extract frames at any number of points in the cycle at no additional cost. However many applications only require a relatively small number of phases n to be acquired, for example n = 1 to 20, meaning that our method still offers significant gains in terms of phototoxicity and acquisition time.
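A minimal sketch of this extension, assuming (hypothetically) that the gating software can be handed a list of target phases rather than the single phase used in the present work:

```python
# n equally-spaced target phases (in units of heartbeats) for an n-frame
# "4D" movie; n = 12 is purely an illustrative choice.
n = 12
target_phases = [k / n for k in range(n)]  # 0, 1/12, ..., 11/12
# gating.set_target_phases(target_phases)  # hypothetical API call
```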

4. Conclusions

We have shown that we can obtain high resolution 3D images of the beating heart through the use of real-time non-invasive gating. The ability to acquire high quality 3D datasets of a normally-beating zebrafish heart within a timescale of the order of one minute opens up the possibility of performing studies into organ development and injury response, imaging a single specimen over an extended period of time in a minimally-invasive manner. The high resolution of the resultant images also makes the technique suitable for sub-cellular imaging. For good synchronization it is important to have a reference signal that is unchanged while scanning through the 3D imaging volume. We achieve this by means of a focus correction system which maintains a fixed focus for brightfield imaging while the focus for fluorescence imaging is scanned through the sample.

Each point in the volume of the sample is only exposed once during acquisition of an entire 3D dataset, thanks to the combination of SPIM and our real-time optical gating; this hugely reduces the photobleaching and phototoxic stress on the sample, without the need for high speed high sensitivity cameras. Our technique also has a number of wider applications beyond fluorescence imaging: we have demonstrated the use of the technique for synchronized laser intervention within the beating heart as part of tissue healing studies (manuscript in preparation), and optogenetic experiments are equally possible. Furthermore, we believe that more detailed analysis of the heart motion data obtained in the course of the phase recovery algorithm will provide insights into the detailed physiology of the heart, with relevance for example to studies into the impact of specific pharmacological drugs on the heart.

Overall, our robust technique for real-time optical gating in the beating heart has opened up the ability to perform a wide range of optical imaging and intervention procedures on the beating zebrafish heart exactly as if the heart had been artificially stopped, but sidestepping this undesirable interference with the heart and instead allowing the heart to beat as normal.

Acknowledgments

We acknowledge funding from: the Engineering and Physical Science Research Council; the British Heart Foundation, in particular a Research Excellence Award; and Durham and Edinburgh Universities. We thank John Mullins, Kim de Mora, Sebastian Pieperhoff and Gianfranco Matrone at Edinburgh University for supplying samples for imaging. Animals were maintained according to the Animals (Scientific Procedures) Act 1986, United Kingdom.

References and links

1. A. S. Forouhar, M. Liebling, A. Hickerson, A. Nasiraei-Moghaddam, H.-J. Tsai, J. R. Hove, S. E. Fraser, M. E. Dickinson, and M. Gharib, “The embryonic vertebrate heart tube is a dynamic suction pump,” Science 312, 751–753 (2006).

2. P. J. Scherz, J. Huisken, P. Sahai-Hernandez, and D. Y. R. Stainier, “High-speed imaging of developing heart valves reveals interplay of morphogenesis and function,” Development 135, 1179–1187 (2008).

3. C.-J. Huang, C.-T. Tu, C.-D. Hsiao, F.-J. Hsieh, and H.-J. Tsai, “Germ-line transmission of a myocardium-specific GFP transgene reveals critical regulatory elements in the cardiac myosin light chain 2 promoter of zebrafish,” Dev. Dyn. 228, 30–40 (2003).

4. J. Huisken and D. Y. R. Stainier, “Selective plane illumination microscopy techniques in developmental biology,” Development 136, 1963–1975 (2009).

5. J. Swoger, M. Muzzopappa, H. López-Schier, and J. Sharpe, “4D retrospective lineage tracing using SPIM for zebrafish organogenesis studies,” J. Biophotonics 4, 122–134 (2011).

6. A. C. S. Brau, C. T. Wheeler, L. W. Hedlund, and G. A. Johnson, “Fiber-optic stethoscope: a cardiac monitoring and gating system for magnetic resonance microscopy,” Magn. Reson. Med. 47, 314–321 (2002).

7. B. Hiba, N. Richard, H. Thibault, and M. Janier, “Cardiac and respiratory self-gated cine MRI in the mouse: comparison between radial and rectilinear techniques at 7T,” Magn. Reson. Med. 58, 745–753 (2007).

8. M. Buehrer, J. Curcic, P. Boesiger, and S. Kozerke, “Prospective self-gating for simultaneous compensation of cardiac and respiratory motion,” Magn. Reson. Med. 60, 683–690 (2008).

9. M. Liebling, A. S. Forouhar, M. Gharib, S. E. Fraser, and M. E. Dickinson, “Four-dimensional cardiac imaging in living embryos via postacquisition synchronization of nongated slice sequences,” J. Biomed. Opt. 10, 054001 (2005).

10. J. J. Schoenebeck and D. Yelon, “Illuminating cardiac development: advances in imaging add new dimensions to the utility of zebrafish genetics,” Semin. Cell Dev. Biol. 18, 27–35 (2007).

11. J. Vermot, S. E. Fraser, and M. Liebling, “Fast fluorescence microscopy for imaging the dynamics of embryonic development,” HFSP J. 2, 143–155 (2008).

12. S. Lee, C. Vinegoni, P. F. Feruglio, L. Fexon, R. Gorbatov, M. Pivoravov, A. Sbarbati, M. Nahrendorf, and R. Weissleder, “Real-time in vivo imaging of the beating mouse heart at microscopic resolution,” Nat. Commun. 3, 1054 (2012).

13. I. V. Larina, K. V. Larin, M. E. Dickinson, and M. Liebling, “Sequential turning acquisition and reconstruction (STAR) method for four-dimensional imaging of cyclically moving structures,” Biomed. Opt. Express 3, 650–660 (2012).

14. J. M. Taylor, C. D. Saunter, G. D. Love, J. M. Girkin, D. J. Henderson, and B. Chaudhry, “Real-time optical gating for three-dimensional beating heart imaging,” J. Biomed. Opt. 16, 116021 (2011).

15. K. Greger, J. Swoger, and E. H. K. Stelzer, “Basic building units and properties of a fluorescence single plane illumination microscope,” Rev. Sci. Instrum. 78, 023705 (2007).

16. J. Huisken and D. Y. R. Stainier, “Even fluorescence excitation by multidirectional selective plane illumination microscopy (mSPIM),” Opt. Lett. 32, 2608–2610 (2007).

Supplementary Material (3)

Media 1: MOV (2943 KB)
Media 2: MOV (746 KB)
Media 3: MOV (3091 KB)
