
Synchronized strobed phase contrast and fluorescence microscopy: the interlaced standard reimagined

Open Access

Abstract

We propose a simple, cost-effective method for synchronized phase contrast and fluorescence video acquisition in live samples. Counter-phased pulses of phase contrast illumination and fluorescence excitation light are synchronized with the exposure of the two fields of an interlaced camera sensor. This results in a video sequence in which each frame contains both exposure modes, each in half of its pixels. The method allows real-time acquisition and display of synchronized and spatially aligned phase contrast and fluorescence image sequences that can be separated by de-interlacing into two independent videos. The method can be implemented on any fluorescence microscope with a camera port without needing to modify the optical path.

© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Fluorescence microscopy is widely used to localize or co-localize [1,2] fluorescently labeled molecules or molecular assemblies in living cells. In practice, fluorescence imaging is often used in conjunction with a form of transmitted light microscopy, such as phase contrast, that provides spatial context by visualizing non-labeled structures. This is crucial when assessing the relative movements of fluorescently labeled and non-labeled structures in cells in studies of cell motility, intracellular transport, and so on. To do this, both imaging capabilities must be implemented on one microscope with shared optical paths.

A typical phase contrast/fluorescence setup is sketched in Fig. 1(a). A phase contrast light source, typically a tungsten light bulb or a light emitting diode (LED), illuminates the sample. Transmitted light is collected by the objective and directed towards the eyepieces or towards a camera port for image acquisition, typically by sliding a mirror (M) in or out of the optical path. For fluorescence microscopy, excitation light (typically from a mercury or xenon arc lamp, a high-power LED or a laser) is reflected by a long-pass dichroic mirror (D) and directed through the objective towards the sample to excite fluorophores. Fluorescence emission of longer wavelengths is collected by the objective, passes through the long-pass dichroic mirror D, and is then directed towards the eyepieces or the camera port. Excitation and emission filters are used to select the proper absorption and emission wavelengths for the particular fluorophore. Ingenious dichroic-free excitation methods [3] have been proposed. Shutters are used to selectively block the phase contrast illumination and the fluorescence excitation light (S1 and S2 in Fig. 1(a), respectively).


Fig. 1. (a) Simplified phase contrast/fluorescence inverted microscope. S1: transmission light shutter; S2: fluorescence excitation shutter; Ex: excitation filter; D: long pass dichroic mirror; Em: emission filter; M: mirror. The phase rings and the condenser for phase illumination are omitted, for simplicity. We chose to represent an inverted microscope, more popular in biological laboratories; upright microscopes are upside down but the optical paths are similar. (b) Phase contrast/fluorescence imaging using two dedicated cameras. A long pass filter (LP) placed between the phase contrast light source and the sample selects wavelengths longer than the cut-on wavelength of the long-pass dichroic mirror D1 and which also do not overlap with the emission spectrum of the fluorophore. A secondary long pass dichroic mirror (D2) reflects the fluorescence emission light towards the fluorescence camera port and transmits the phase contrast light towards the phase contrast camera. (c) Single camera strobed interlaced phase contrast/fluorescence imaging. The phase and fluorescence light sources are strobed in counter-phase with each other and in phase with the corresponding field exposures of an interlaced camera. The strobe pulses are triggered by the vertical field synchronization (Vsync) pulses extracted from the video signal at the output of the camera at the time of acquisition (see text for details).


It is straightforward to obtain static images of fixed or slow-changing live samples by switching between the two microscopy modes (closing and opening the appropriate shutters) and manually triggering the camera acquisition. Investigating dynamic processes that occur at fast time scales in living specimens requires simultaneous, spatially aligned and synchronized video acquisition in both microscopy modes. This can be achieved in different ways.

Single camera, simultaneous imaging. In the basic setup in Fig. 1(a), simultaneous phase contrast and fluorescence video imaging can be achieved by opening shutters S1 and S2 at the same time. The intensity of the two light sources is empirically adjusted to create the desired levels of transmitted and emitted light within the dynamic range of the camera. This strategy provides synchronized and aligned images in the two modes. The major limitation is that the two imaging modes cannot be reliably separated after acquisition. The high-contrast advantage of fluorescence microscopy is diminished by its overlap with the phase contrast image, and the qualitative assessment of fluorophore distribution and localization is impaired. The wide spectrum of the phase contrast light source, narrowed by the dichroic mirror D and the emission filter Em, overlaps with the spectrum of fluorescence emission, making it hard to discern what is fluorescently labeled and what is a phase contrast representation of non-labeled structures, especially on monochrome cameras that are preferred for their higher low-light sensitivity. Quantification of the fluorescence emission is also less accurate. Nevertheless, this cost-effective method is widely used, and has allowed successful manual tracking of non-labeled E. coli cell bodies with fluorescently labeled flagellar filaments in bacterial swarms [4].

Two cameras, simultaneous imaging. Another approach uses two cameras for combined phase contrast/fluorescence imaging, as depicted in Fig. 1(b). This method requires a phase contrast illumination spectrum that does not overlap with the fluorescence emission. For experiments involving fluorescence imaging in living cells, red or near-infrared light is preferred for phase illumination, as the blue end of the visible spectrum induces unwanted physiological responses [5–7]. One advantage of this approach is that the camera acquisition can be optimized independently for each microscopy mode, yielding independent, high-contrast fluorescence and phase contrast images. This approach is considerably costlier. Besides an extra camera, it requires a more expensive microscope frame with an extra camera port and two extra optical filters (in the geometry depicted in Fig. 1(b), a long-pass phase illumination filter LP and a long-pass dichroic D2 with cut-on wavelengths longer than the right tail of the emission spectrum of the fluorophore to be observed). Optical alignment is more complicated: the two cameras (and the image in the eyepieces) have to be made parfocal, and the two fields of view (which depend on the physical size of the camera sensors) have to overlap significantly, which might require extension tubes and/or extra relay lenses and adjustable mounts. A synchronization method has to be implemented to allow synergic operation (simultaneous acquisition or time stamping). The usage is less ergonomic, with the operator having to observe two non-overlaid video sequences on two different monitors (or windows) simultaneously. Image analysis can be complicated by the need to compare images from cameras with different sensor sizes or pixel densities. In order to obtain overlapping and perfectly aligned fields of view, one of the sequences might have to be resampled or scaled, rotated and translated to compensate for the inherent misalignment of the two cameras. Nevertheless, this is the method of choice when quantitative fluorescence microscopy is desired in experiments that require simultaneous and rapid phase contrast microscopy. Such a setup was used to quantify the binding of fluorescently labeled signaling CheY-P molecules to the flagellar motor as a function of its direction of rotation in tethered E. coli cells, assessed in phase contrast mode [8].

2. Method

Single camera, strobed interlaced phase/fluorescence imaging. We propose a simple and cost-effective method that allows simultaneous, synchronized and independent phase contrast and fluorescence video sequence acquisition. To do this, we combined interlaced video acquisition with pulsed illumination/excitation (Fig. 1(c)). Our technique can be used on any fluorescence microscope with a camera port, does not require alterations of the optical paths and is relatively inexpensive.

The method is inspired by an inherent property of the interlaced video standard: two successive exposures are carried out for two subsets of pixels (“fields”) in each acquired video frame, one for the odd and another for the even horizontal lines of pixels in an image. In its intended application (analog television [TV] transmission, standardized in the 1940s), the two fields are combined (interlaced) to produce full frames of 640 × 480 or 1920 × 1080 pixels for standard or high-definition (in the parlance of our times) TV, respectively, at a fixed “video” rate of about 30 frames/s (fps), corresponding to the acquisition of 60 interlaced fields/s. In ordinary video, combining two interlaced fields results in smooth transitions between frames. Interlacing can also create artifacts if the content of the frame changes significantly between the two field exposures (as exemplified in Fig. 2(a)). See Supplement 1 for a more thorough description of the interlaced standard and its evolution.


Fig. 2. Interlacing artifacts in a phase contrast-only acquisition, corrected by de-interlacing and interpolation. Fluid tracking particle (pointed by the white arrow) moved at a speed of about 90 µm/s by an E. coli cell (rod-shaped structure above the particle) attached to an agar substrate. (a) The odd and even fields (pseudo-colored in red and green, respectively) are exposed 1/60 s apart, during which the particle traveled about 1.5 µm. The two fields are separated by de-interlacing (b) and (c) and the missing lines in each field are reconstituted by interpolation (e) and (f). The overlap of the de-interlaced and interpolated fields (d) shows the spherical particle as ovoid, with a green leading and a red trailing edge (still frame from an interlaced video sequence published in [15]). (See also Visualization 1 and Visualization 2 for the original and processed video sequences.)


Here, we take advantage of the interlaced standard to enable two distinct exposures for each acquired frame. We perform phase contrast microscopy with one field and fluorescence microscopy with the other field of the same video frame. To do this, we strobe two light sources in counter-phase (an LED for phase contrast/transmitted light illumination and a laser or another LED for fluorescence excitation), in synchrony with the exposure of the odd (field 1) and even (field 2) lines of pixels during acquisition. The result is a combined phase contrast/fluorescence video sequence in which each camera frame contains information for each microscopy mode in the corresponding field. The output can be visualized directly in real time as overlapping, aligned, synchronized phase contrast/fluorescence sequences (Figs. 3(a) and 4(a)) on any composite-video-capable monitor, without any further processing. Independent, phase-only (field 1) and fluorescence-only (field 2) sequences can be generated by de-interlacing. One can separate the two fields into two different image stacks and interpolate the missing pixels (the even-line pixels in the odd field and the odd-line pixels in the even field – Figs. 3 and 4, panels (c) and (d); Fig. 2 depicts the de-interlacing process for a phase contrast-only acquisition).
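As an illustration of the de-interlacing and interpolation step, the minimal NumPy sketch below separates a digitized frame into its two fields and rebuilds full-height images by averaging neighboring lines. It is a schematic example, not the actual scripts used in this work; the assignment of field 1 to the even-indexed rows of the array (image lines 1, 3, 5, …) is an assumption that may need to be swapped for a particular camera/digitizer combination.

```python
import numpy as np

def split_fields(frame):
    """Separate an interlaced frame (H x W) into its two fields.
    Field 1 is assumed to occupy rows 0, 2, 4, ... of the digitized frame
    and field 2 rows 1, 3, 5, ... (swap if the camera uses the opposite order)."""
    field1 = frame[0::2, :].astype(float)  # e.g. the phase contrast exposure
    field2 = frame[1::2, :].astype(float)  # e.g. the fluorescence exposure
    return field1, field2

def fill_missing_rows(field, parity, n_rows):
    """Place the field's rows back at their original parity (0 -> rows 0, 2, 4, ...;
    1 -> rows 1, 3, 5, ...) and fill the missing rows by averaging vertical neighbors."""
    full = np.zeros((n_rows, field.shape[1]))
    full[parity::2, :] = field
    for r in range(1 - parity, n_rows, 2):  # indices of the missing rows
        above = full[r - 1] if r - 1 >= 0 else full[r + 1]
        below = full[r + 1] if r + 1 < n_rows else full[r - 1]
        full[r] = 0.5 * (above + below)
    return full

# Example on a stand-in frame (a real workflow would loop over a recorded stack):
frame = np.random.randint(0, 256, (480, 640)).astype(np.uint8)
f1, f2 = split_fields(frame)
phase = fill_missing_rows(f1, parity=0, n_rows=frame.shape[0])
fluo = fill_missing_rows(f2, parity=1, n_rows=frame.shape[0])
```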


Fig. 3. E. coli with fluorescently labeled filaments swimming close to an agar surface, in a thin layer of fluid. Single frame from a combined interlaced phase contrast/fluorescence video acquisition: as acquired (a), and de-interlaced and interpolated to separate fluorescence-only (c) and phase-only (d) sequences, pseudo-colored in green (filaments in fluorescence) and red (cell body in phase contrast) and overlapped for enhanced co-localization (b). Phenomena such as cell body wobbling (as a result of counter-rotation with respect to the filaments), smooth curved swimming (as a result of lateral rolling of the counter-rotating cell body when close to the surface) or conformational changes of the filaments in the proximity of the surface can be observed. See also Visualization 3 and Visualization 4 for the entire original and processed video sequences.



Fig. 4. E. coli cells with cell bodies stuck to a thin layer of agar, rotating their filaments against the substrate (single frame from a combined interlaced phase contrast/fluorescence video acquisition). The frames, as acquired by the camera (a), are de-interlaced and interpolated to generate fluorescence-only (c) and phase contrast-only (d) sequences for separate visualization and analysis. The spherical particles, visible in the phase contrast-only field, can be tracked to probe the fluid flow around the cell bodies (the rod-shaped structures) caused by the rotation of the filaments, visible in the fluorescence-only field, as described elsewhere [15]. The two sequences are pseudo-colored (phase contrast in red, fluorescence in green) and superimposed for better co-localization (b). See also Visualization 5 and Visualization 6 for the entire original and processed video sequences and Visualization 7 and Visualization 8 for another example.


A key element of this approach, strobed (pulsed) illumination, is a powerful yet simple technique generally used in photography to prevent motion blur when capturing objects or events that move or change faster than the camera exposure time. A short pulse of light is used to illuminate (or excite) the sample during a longer camera exposure, resulting in an image with an effective exposure time shorter than the one set by the camera. Strobed excitation has been used to improve the accuracy of position determination of fluorescent beads in optical traps [9], to image fast rotating flagellar filaments [10] and to probe the rotation of molecular structures with a time resolution much higher than the video rate [11], by combining multiple strobed acquisitions with different delays from a synchronizing event. Pulsed fluorophore excitation has been reported to reduce the fluorophore bleaching rate relative to the equivalent continuous exposure [12]. It has also been used to prevent fluorophore bleaching from interfering with inter-molecular homo-FRET efficiency and its interpretation [13].

Phase contrast illumination and fluorescence excitation light sources are strobed in synchrony with the vertical synchronization (Vsync) pulses that precede the exposure of each corresponding field of an interline CCD sensor in an interlaced camera. Each pixel well on the sensor integrates photons during almost the entire duration of each field (∼16.7 ms, 1/60 s), which is half of the duration of a full frame (∼33.3 ms, 1/30 s). At the end of each field exposure, the pixel charge content is transferred to a storage area in a vertical shift register (a fast process that typically requires about 1 µs), to be read during the next field exposure. The charge content of the pixel wells is cleared shortly after the transfer and a new field exposure is commenced. The pixel wells start integrating photons again, while the content of the vertical storage area representing the previous field exposure is read line by line, horizontally (through a slower process). As each field is exposed while the previous one is being read, the video acquisition is almost continuous, with very short blackout periods between consecutive fields. In field integration mode, the two field exposures do not overlap at any time and can last up to ∼1/60 s each. Effectively shorter exposures can be obtained by pulsing (strobing) light at any time during each exposure. The pixel wells will still integrate over the entire 1/60 s duration of the field, but photons will only arrive when available, i.e. during the light pulse.

The phase contrast illumination (turned on during field 1) and the fluorescence excitation (turned on during field 2) are controlled independently, by adjusting the pulse durations and light intensities. As in any acquisition, images of similar exposure levels (i.e. total number of photon counts per exposure) can be captured using shorter effective exposure times at higher intensity (for fast events) or longer effective exposure times at lower intensity (for slower events). As the light pulses are controlled independently in each field, shorter phase contrast exposures can be combined with longer fluorescence exposures or vice versa, as desired (Fig. 5(d)).


Fig. 5. Strobed phase contrast/fluorescence excitation timing diagrams. Oscilloscope traces (top to bottom in each panel): yellow (trace 1): composite video signal, as output by the interlaced camera; green (trace 2): vertical synchronization pulses; red (trace 4): field 2 exposure pulses; blue (trace 3): field 1 exposure pulses. In each field, the light is turned ON when the signal level is high. (a) Dual strobed phase contrast illumination/fluorescence excitation. Each light source is ON for 4 ms at the beginning of each field, at about 30 fps, with a time shift of ∼16.7 ms between sequences. Phase contrast-only (b) or fluorescence-only (c) acquisitions at twice the frame rate (∼60 fps). (d) Dual strobed phase contrast/fluorescence acquisition with different effective exposure times (2 ms phase contrast and 12 ms fluorescence). (e) Dual strobed phase contrast/fluorescence acquisition with minimized time shift between sequences (here, reduced to 676 µs). Field 1 pulses are labeled 1A, 1B and 1C; field 2 pulses are labeled 2A and 2B. Pulse 2A is applied towards the end of its field, pulse 1A at the beginning of its field. (f) Time-enlarged representation of panel (e) in the vicinity of pulses 1A and 2A. See also Fig. S.3 in Supplement 1.


Using a 2:1 interlacing/de-interlacing scheme allows obtaining two independent video sequences, each at the same frame rate as the full-frame capability of the camera. The sacrifice is half of the vertical spatial resolution in each image, which can be partially recovered by interpolation – as shown in Figs. 2, 3 and 4.

Independently controllable light sources can be arbitrarily assigned to one or both of the interlaced fields. Thus, phase-only or fluorescence-only sequences at twice the frame rate limit of the camera can also be obtained if a single light source is strobed in both fields and each field is de-interlaced and stacked over the previous one in a single sequence (see Fig. 5 for timing diagrams and Visualization 1 and Visualization 2). In an operating mode suitable for most workflows, the camera acquires continuously in phase contrast-only mode using strobed phase illumination in both fields (Fig. 5(b)). The operator can adjust the focus, search through the sample using phase contrast microscopy, select an area of interest, and commence fluorescence acquisition at will. At that point, a combined phase contrast/fluorescence acquisition can be triggered by switching to alternating phase contrast illumination/fluorescence excitation (Fig. 5(a)) and starting a recording device (computer or digital video recorder). Fluorescence-only acquisitions at twice the frame rate (by strobing the fluorescence light source in both fields, as in Fig. 5(c)) or different phase/fluorescence ratios (e.g. 3/1, one fluorescence field acquired every 3 phase fields) are also options.
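As a sketch of this doubled-frame-rate bookkeeping (a schematic NumPy example, assuming the same row convention as in the earlier sketch and that field 1 of each frame is exposed before field 2, which should be verified for a given camera):

```python
import numpy as np

def fields_to_double_rate(frames):
    """Convert a stack of interlaced frames (T x H x W), acquired with a SINGLE
    light source strobed in both fields, into a sequence at twice the frame rate.
    Assumes field 1 (rows 0, 2, 4, ...) of each frame is exposed before field 2
    (rows 1, 3, 5, ...)."""
    half_frames = []
    for frame in frames:
        half_frames.append(frame[0::2, :])  # field 1: earlier exposure
        half_frames.append(frame[1::2, :])  # field 2: later exposure
    return np.stack(half_frames)            # 2T half-height images at ~60 fps

# Each half-height image can then be up-scaled vertically (e.g. with the
# fill_missing_rows function from the previous sketch) for display or analysis.
```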

We point out a few considerations of practical importance.

  • 1. Although recorded in synchrony over the same time span, the phase contrast and the fluorescence sequences are not precisely simultaneous. The images are shifted by one field integration period (1/60 s, i.e. ∼16.7 ms), possibly causing co-localization artifacts. This can be alleviated by shifting the light pulses with respect to the preceding (triggering) Vsync pulse, so that the effective exposure is close to the end of its integration period in field 1 and close to its beginning in field 2 (Fig. 5(e) and (f); see also the timing sketch after this list). The shortest usable time shift between sequences ($\tau$) depends on the exposure parameters and the characteristics of the camera and sensor:
    $$\tau = \frac{t_1}{2} + t_{be} + t_{bi} + \frac{t_2}{2}$$
    where $t_1$ and $t_2$ are the effective exposure times for field 1 and field 2 (i.e. the durations of the light pulses) and $t_{be}$ and $t_{bi}$ are the vertical blanking times at the beginning and at the end of each field, which vary across cameras and can be determined experimentally. By doing so, we were able to reduce the time shift between the two sequences from ∼16.7 ms (in a typical non-delayed field exposure) to ∼0.7 ms.
  • 2. The Vsync pulses that precede each field exposure can be accessed in a few ways, depending on the particular camera. Every 2:1 interlaced camera with an analog output embeds these pulses in the composite video signal (CVBS) presented to the displaying/recording device (see Fig. 5). They can be extracted with a video sync separator circuit, which also provides a field 1/field 2 (odd/even) identifier pulse, and used to trigger the two light sources with controlled delay, duration and amplitude. This is the simplest and least expensive method, which should work on any interlaced capable camera. Some cameras present the Vsync pulses on dedicated output pins, which can be used directly, without a video sync separator circuit.
  • 3. The method works with most (if not all) 2:1 CCD analog “true” interlaced video cameras abundantly available on the new and used market. We have successfully used the method with a few modestly priced CCTV surveillance cameras (between ∼$15 and ∼$100, without or with casing, with C-mounts that allow connecting them to standard microscope ports). The method might not work with some CMOS (“pseudo”-interlaced) cameras, which acquire images in progressive mode (exposing the entire frame at a time, as a single field) but output standard NTSC signals, for compatibility reasons, by separating the same exposure into two interlaced fields after acquisition. The method is most effective with monochrome CCD cameras, which are often more sensitive (lacking colored filters that block specific wavelengths) and have better resolution (no Bayer [14] or similar interpolation schemes to generate RGB values at every pixel location). Color is irrelevant in single-fluorophore fluorescence and live phase contrast imaging, the main applications this method targets.
  • 4. Some advanced CCD interlaced cameras can be set to operate in either field integration mode or frame integration mode, which differ in vertical resolution and integration times. In field integration mode (described above) each field output (representing the odd or even lines of pixels in the final image) integrates two adjacent rows of pixels. For example, the odd lines 3 and 5 in the image represent the vertical integration of lines 2 + 3 and 4 + 5 of pixel wells on the sensor, respectively, whereas the even lines 2 and 4 integrate lines 1 + 2 and 3 + 4. Integrating two vertically adjacent wells on the sensor for each pixel in the image increases the sensitivity (by counting twice as many photons for each displayed pixel) and decreases the vertical resolution; no actual light is lost in either field. Thus, the interpolation of the missing lines after de-interlacing can be regarded as an up-scaling of the vertical resolution rather than an actual interpolation of missing lines (the horizontal resolution is preserved). The maximum integration time in field integration mode cannot exceed the duration of one field (1/60 s). Cameras that do not have switchable field vs. frame integration modes generally operate in field integration mode and can be used for phase/fluorescence strobed acquisition directly.

    In frame integration mode, each field output represents only the corresponding lines of wells on the sensor, without any vertical integration. For example, the pixels of the odd lines 3 and 5 in the image represent the wells along lines 3 and 5 on the sensor, and the even lines 2 and 4 in the image represent the pixel lines 2 and 4 on the sensor. Thus, a full frame has a higher vertical resolution in frame integration mode than in field integration mode. The maximum integration time for each field in frame integration mode is twice as long, about one frame (1/30 s), with a 1/60 s offset and a 1/60 s overlap between consecutive fields (see Figs. S3 and S4 in Supplement 1 for a more detailed description and representation). This mode can be used for dual phase/fluorescence strobed acquisition as described here only if the field integration time (a parameter set in the camera, see the next paragraph) is reduced below 1/30 s and the illumination/excitation pulses are applied outside of the overlapping integration periods.

    In both integration modes, the field integration time can be adjusted by delaying the start of the integration period in each field. This is accomplished internally, by continuously draining the content of the wells until the start of the desired integration period (“electronic shuttering”). The integration time can be adjusted through external switches on the camera or through software menus. The end of the integration period coincides with the end of the field, when the content of the wells is transferred to the vertical interline storage buffer; this end point cannot be adjusted.

  • 5. Single-color LEDs whose spectra overlap significantly with the emission spectrum of the fluorophore (or with the transmission band of the long pass dichroic mirror) are preferred to white LEDs for phase illumination, both for their higher efficiency and their shorter rise/decay times. In our tests, we used low or moderate power LEDs (an orange 589 nm OVLGY0C9B9 LED, TT Electronics/Optek Technology, and a red 660 nm LZ4-40R208 LED, LED Engin) for strobed phase illumination and an Argon ion laser with a fast custom shutter (described elsewhere [11,13,15]) for strobed fluorescence excitation. Commercially available or custom-made high-current LED drivers (such as our 200 A RIS-796 LED pulser) can be used to drive high power LEDs as strobed fluorescence light sources as well.
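To make the timing bookkeeping of item 1 concrete, the sketch below computes pulse start times relative to each field's Vsync pulse and the resulting inter-sequence shift $\tau$. It is a schematic calculation only; the pulse durations and blanking times used here are hypothetical placeholders, not measured constants of any particular camera, and must be replaced by experimentally determined values.

```python
# Schematic timing calculation for minimizing the shift between the phase
# contrast and fluorescence sequences (all values are hypothetical placeholders).
FIELD_PERIOD = 1 / 60.0  # ~16.7 ms per field at the NTSC field rate

def pulse_schedule(t1, t2, t_be, t_bi):
    """Return pulse start times (relative to each field's Vsync) that minimize the
    time shift between the two sequences, and the resulting shift tau.
    t1, t2: effective exposure (pulse) durations for field 1 and field 2 [s];
    t_be, t_bi: vertical blanking times at the beginning and end of a field [s]."""
    # Field 1 pulse pushed as late as possible: it ends just before the
    # end-of-field blanking.
    start_field1 = FIELD_PERIOD - t_bi - t1
    # Field 2 pulse applied as early as possible: right after the
    # beginning-of-field blanking.
    start_field2 = t_be
    # Mid-exposure to mid-exposure separation across the field boundary
    # (the tau expression given in the text):
    tau = t1 / 2 + t_be + t_bi + t2 / 2
    return start_field1, start_field2, tau

# With these placeholder values the shift drops to a fraction of a millisecond,
# of the same order as the ~0.7 ms reported above:
s1, s2, tau = pulse_schedule(t1=0.2e-3, t2=0.2e-3, t_be=0.2e-3, t_bi=0.2e-3)
print(f"field 1 pulse starts {s1 * 1e3:.2f} ms after its Vsync")
print(f"field 2 pulse starts {s2 * 1e3:.2f} ms after its Vsync")
print(f"time shift between sequences: {tau * 1e3:.2f} ms")  # ~0.6 ms here
```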

We demonstrated the method using an inexpensive KPC-650BH CCTV surveillance camera (KT&C, Korea) equipped with a 1/3” Super HAD CCD sensor (Sony, Japan) and a Sony XC-75 machine vision camera equipped with a 1/2” IT Hyper HAD CCD sensor (Sony). We also successfully tested a variety of even cheaper (∼$15), uncased cameras intended for embedding in surveillance systems or flying drones, based on Sony ICX CCD sensors. The odd/even and Vsync pulses were extracted from the analog video signal using an LM1881 chip (Texas Instruments, TX) in its typical datasheet-recommended application and used to strobe the two light sources via pulses of controllable duration generated by either custom electronics (simple circuitry involving logic gates and mono-stables) or an Arduino microcontroller (a simpler, more versatile solution), fed to custom made LED or laser drivers. The analog video signal was digitized, recorded and transferred to the computer as avi or multi-page tif files via a Sony GV-D1000 digital recorder or an NI IMAQ PCI-1407 video acquisition board (NI, Austin, TX). The phase contrast/fluorescence combined video sequences were de-interlaced, separated and interpolated using custom Matlab (Mathworks, Natick, MA) scripts for off-line processing or custom LabVIEW (NI, Austin, TX) programs for real-time processing and visualization.

3. Results

Typical combined video acquisitions are depicted in Figs. 3 and 4 and in Visualization 3, Visualization 4, Visualization 5, Visualization 6, Visualization 7 and Visualization 8. Filament conformational changes in swimming E. coli [10] and other swimming bacteria [16] have been studied using fluorescently labeled filaments and strobed excitation with interlaced cameras. Adding synchronized phase contrast acquisition capabilities (as detailed here) allows independently observing and analyzing the motion of the non-fluorescently labeled structures (e.g. cell bodies counter-wobbling in swimming cells, as in Fig. 3), or correlating the filament activity with the fluid flow the filaments create (by using fluid tracking particles visible in phase contrast mode, as in Fig. 4, also described elsewhere [15]), for little to no extra cost.

Stroboscopic illumination for two separate microscopy modes on the same camera can be an effective strategy in non-interlaced video acquisition as well. We used the same simple electronics, microcontroller setup and a slightly modified workflow and Matlab scripts to acquire synchronized phase contrast and fluorescence sequences with an Andor iXon EM CCD camera (DU-860E, Andor Technology, Belfast, UK), a high-performance solution for low-light fast microscopy imaging. This allowed us to visualize low-fluorophore-count fluorescently labeled bacterial motors in tethered E. coli cells. Tethered cells (such as the ones pointed by arrows in Fig. 6(a)) rotate around a motor attached to the substrate. Exposure times long enough to allow collecting enough light to visualize fluorescently labeled motors in immobile cells result in motion blur that prevents observing the same motors in rotating cells. Shorter exposure times that prevent the motion blur from occurring do not allow collecting enough photons to visualize the motors at a comfortable signal to noise ratio (SNR).

 figure: Fig. 6.

Fig. 6. E. coli cells with fluorescently labeled motors (FliN-YFP) imaged by strobed synchronized phase contrast/fluorescence acquisition using an EMCCD camera. The cells pointed by arrows in (a) are tethered and rotating; the rest are stuck to the substrate (darker cell bodies) or drifting slowly (brighter cell body). Top panels: single phase contrast (a) and fluorescence (b) frames with effective exposure times of 80 µs; (c) a pseudo-colored version of the two modes (phase contrast in red, fluorescence in green) super-imposed for co-localization. See also Visualization 9 for the entire corresponding video sequence. Bottom panels: averaging about 30 non-consecutive phase contrast (d) or fluorescence (e) frames from the same sequence, in which the encircled tethered cell pointed by an arrow in (a) had the same orientation (within 30° bins), allows observing its motors with a higher SNR and insignificant motion blur; (f) super-imposed pseudo-colored version of (d) and (e). Note that the rotating cell pointed by the arrow in the upper left side of the field of view in (a) appears blurry in panels (d), (e) and (f), as the two cells rotate unsynchronized, at different rates. See also Visualization 10 and Visualization 11 for video sequences depicting all averaged frames in which the cell body encircled in (a) had the same orientation in 15° and 30° bins, respectively.

Download Full Size | PDF

In the example depicted in Fig. 6 we used strobed, alternating phase contrast illumination/fluorescence excitation to obtain effective exposure times shorter than the ones allowed by the camera at its highest frame rate (here, 80 µs vs. 2 ms). The pulses were triggered by the “Fire” output signal of the camera during each exposure (high level during exposure, low level in-between exposures, fed to the Vsync input of our microcontroller used with the interlaced cameras). This resulted in combined sequences with alternating phase contrast and fluorescence frames (as opposed to alternating interlaced fields), which could be separated in post-processing. In this example, the poor SNR of the fluorescence images exposed for 80 µs (Fig. 6(b)) could be increased by averaging non-consecutive frames in which the rotating cell had the same angular orientation, determined by tracking the cell body in the synchronized phase contrast sequence. Thus, higher SNR images with equivalent exposure times longer than the actual camera exposure (here, ∼2.4 ms, as about thirty 80 µs frames were averaged for each orientation) were generated, which allowed us to visualize the labeled motors in the rotating cell body (the green dots in Fig. 6(f)) without significant motion blur.
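A minimal sketch of this orientation-binned averaging is given below. It assumes the cell-body angle in each phase contrast frame has already been obtained by tracking (the tracking step itself is not shown), and the function and array names, as well as the default bin width, are illustrative rather than the actual analysis code used here.

```python
import numpy as np

def average_by_orientation(fluo_frames, angles_deg, bin_width_deg=30.0):
    """Average non-consecutive fluorescence frames whose synchronized phase contrast
    frames show the tracked cell at the same orientation (within bin_width_deg bins).
    fluo_frames: array of shape (T, H, W); angles_deg: per-frame cell-body angle
    obtained by tracking in the phase contrast sequence."""
    angles = np.mod(np.asarray(angles_deg), 360.0)
    bin_index = (angles // bin_width_deg).astype(int)
    averaged = {}
    for b in np.unique(bin_index):
        members = fluo_frames[bin_index == b]
        # SNR of the average grows roughly as sqrt(len(members)) for uncorrelated noise.
        averaged[b] = members.mean(axis=0)
    return averaged  # one higher-SNR image per orientation bin

# With ~30 frames of 80 microseconds per 30-degree bin, the equivalent exposure of
# each averaged image is ~2.4 ms, as in the text.
```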

As in the interlaced application, the time shift between the phase contrast and the fluorescence sequences can be reduced by delaying one of the light pulses with respect to the start of the camera exposure (as shown in Fig. 5(e) and (f)). Different exposure parameters for phase contrast and fluorescence (independent of the camera acquisition parameters, which typically cannot be changed from frame to frame during a continuous sequence acquisition) can be obtained by adjusting the intensity and/or the duration (as shown in Fig. 5(d)) of the light pulses.

The exposure and readout modes of EM CCD sensors are not interlaced, so the advantage of doubling the effective frame rate by sacrificing half of the vertical resolution is not inherently enabled (or obvious), but it is possible. The interlaced interline CCD and the EM CCD sensors have a similar functional design: in the EM CCD frame-transfer operating mode, the charge content of the pixel wells is also transferred to a storage area (here, a secondary sensor) through a fast process (and amplified by “electron multiplying”, EM) at the end of the whole sensor integration time, to be read slowly, line by line, during the next exposure. Binning two vertically adjacent pixel wells at the time of transferring the charge reduces the readout time (the speed-limiting process) by a factor of two, which allows doubling the frame rate. Note that horizontal pixel binning does not typically speed up the readout in EM CCD sensor based cameras and does not allow increasing the frame rate. A secondary advantage of vertical binning (before readout) is improved SNR (the signal doubles, while the read noise is only added once). As in the de-interlacing/interpolation approach, the binned image can be up-scaled (or interpolated) using the algorithm of choice in post-processing, with minimal loss of detail when the feature to be observed spans multiple pixels.
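As a back-of-the-envelope check of the SNR argument (a simulation with assumed numbers, not a measurement of our cameras): in a read-noise-dominated, low-light regime, summing two vertically adjacent wells on-chip doubles the signal while the read noise is added only once, so the SNR of the binned pixel approaches twice that of a single pixel and exceeds that of two separately read and digitally summed pixels.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000        # number of simulated pixel pairs
signal = 5.0       # assumed mean photo-electrons per well per exposure (low light)
read_noise = 10.0  # assumed read noise in electrons rms

# Photo-electrons collected by two vertically adjacent wells:
e1 = rng.poisson(signal, n)
e2 = rng.poisson(signal, n)

# On-chip vertical binning: charges are summed BEFORE readout, read noise added once.
binned_on_chip = (e1 + e2) + rng.normal(0, read_noise, n)

# Digital summation of two separately read pixels: read noise added twice.
summed_off_chip = (e1 + rng.normal(0, read_noise, n)) + (e2 + rng.normal(0, read_noise, n))

single_pixel = e1 + rng.normal(0, read_noise, n)

for name, x in [("single pixel", single_pixel),
                ("on-chip binning", binned_on_chip),
                ("digital sum", summed_off_chip)]:
    print(f"{name}: SNR = {x.mean() / x.std():.2f}")
# In this regime, on-chip binning gives roughly twice the SNR of a single pixel
# and about sqrt(2) more than the digital sum of two separate reads.
```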

4. Summary and discussion

Synchronizing strobed illumination and/or excitation light sources with distinct subsets of pixels on the same camera sensor is a low-cost and easy to implement solution on any fluorescence microscope. Here, we have demonstrated its utility in observing, recording and separating phase contrast and fluorescence video sequences of live samples that move or change at rates comparable to the video rate, or faster. The method can be extended to other applications that require more than one illumination or excitation mode. For example, HiLo microscopy of live samples also requires successive illumination/excitation with two different patterns [17]. The method can also be extended to fluorescence video microscopy with two fluorophores with non-overlapping excitation spectra.

Alternating excitation for multi-channel fluorescence microscopy has been previously proposed for specific applications. Alternating laser excitation (ALEX) of the donor and the acceptor fluorophores is used in single molecule FRET studies aimed at quantifying in vitro interactions between two [18,19] or more [20] fluorophores diffusing through small volumes of liquid sampled by confocal microscopy. ALEX allows acquiring the information required to calculate correction factors for crosstalk at the time of data acquisition (e.g. direct acceptor excitation by donor excitation light, donor emission leakage into the acceptor channel) and to evaluate donor/acceptor stoichiometry. In its original implementation, the emission path is split by wavelength and dedicated fast point detectors (avalanche photodiodes) for each type of fluorophore are used. An imaging version uses a TIRF microscope instead of a confocal one and a camera instead of single point detectors, with the molecules of interest immobilized on a substrate [21]. This allows simultaneous single molecule FRET measurements from multiple fluorophores. A stroboscopic ALEX TIRF version that improves the time resolution has been proposed [22,23]. Pulsed interleaved excitation (PIE, a faster version of ALEX) alternates the excitation pulses at a much higher rate than the integration time of the detectors, allowing virtually simultaneous excitation in both channels [24]. Although some of these approaches (thoroughly reviewed in [25]) and the method we propose here share common elements (e.g. alternating, strobed fluorescence excitation, synchronization of the excitation with the camera acquisition, no splitting of the emission path, which improves and simplifies co-localization), their primary intended applications are different: in vitro biomolecular structural dynamics on the one hand, in vivo cellular dynamics/motility studies on the other. Transmission light microscopy and perfect co-localization of fluorescently labeled and unlabeled structures are generally of little interest in in vitro single molecule fluorescence-based studies, whereas precise characterization of single molecule dynamics by fluorescence quantification is much more difficult to approach in in vivo single or multi cellular dynamics or motility studies (though sometimes desired). Alternating laser excitation for multi-color fluorescence with single emission path imaging has been proposed in a few light sheet microscopy implementations [26–28], as a simpler and cost-effective alternative to multi-channel acquisition with split emission paths. Each optical slice is acquired in each excitation mode, sequentially, before moving to the next one. Interlaced excitation schemes (as described here) could speed up the acquisition rate.

Potential biomedical applications of our method include procedures that involve real-time scattered light/fluorescence localization and quantification (e.g., fluorescein angiography [29] or intraoperative parathyroid gland localization [30]).

We believe that future designs of scientific cameras that allow separate exposure of multiple subsets of pixels on the same sensor (grouped in lines or other patterns that allow acceptable detail recovery by interpolation), in synchrony with corresponding illumination/excitation modes, would open up new imaging capabilities – faster acquisitions and better co-localization in multi-modal imaging being only the most obvious.

Funding

National Science Foundation (2146519); National Institutes of Health (AI016478); Rowland Institute at Harvard.

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper may be obtained from the authors upon reasonable request.

Supplemental document

See Supplement 1 for supporting content.

References

1. K. W. Dunn, M. M. Kamocka, and J. H. McDonald, “A practical guide to evaluating colocalization in biological microscopy,” Am. J. Physiol. Cell Physiol. 300(4), C723–C742 (2011). [CrossRef]  

2. G. J. Schutz, W. Trabesinger, and T. Schmidt, “Direct observation of ligand colocalization on individual receptor molecules,” Biophys. J. 74(5), 2223–2226 (1998). [CrossRef]  

3. R. R. Ishmukhametov, A. N. Russell, R. J. Wheeler, A. L. Nord, and R. M. Berry, “A Simple low-cost device enables four epi-illumination techniques on standard light microscopes,” Sci. Rep. 6(1), 20729 (2016). [CrossRef]  

4. L. Turner, R. Zhang, N. C. Darnton, and H. C. Berg, “Visualization of flagella during bacterial swarming,” J. Bacteriol. 192(13), 3259–3267 (2010). [CrossRef]  

5. B.L. Taylor, J.B. Miller, H.M. Warrick, and D.E. Koshland Jr., “Electron acceptor taxis and blue light effect on bacterial chemotaxis,” J. Bacteriol. 140(2), 567–573 (1979). [CrossRef]  

6. S. Wright, B. Walia, J. S. Parkinson, and S. Khan, “Differential activation of Escherichia coli chemoreceptors by blue-light stimuli,” J. Bacteriol. 188(11), 3962–3971 (2006). [CrossRef]  

7. T. Perlova, M. Gruebele, and Y. R. Chemla, “Blue Light Is a Universal Signal for Escherichia coli Chemoreceptors,” J. Bacteriol. 201(11), e00762 (2019). [CrossRef]  

8. H. Fukuoka, T. Sagawa, Y. Inoue, H. Takahashi, and A. Ishijima, “Direct imaging of intracellular signaling components that regulate bacterial chemotaxis,” Sci. Signal 7(319), ra32 (2014). [CrossRef]  

9. S. Blumberg, A. Gajraj, M. W. Pennington, and J. C. Meiners, “Three-dimensional characterization of tethered microspheres by total internal reflection fluorescence microscopy,” Biophys. J. 89(2), 1272–1281 (2005). [CrossRef]  

10. L. Turner, W. S. Ryu, and H. C. Berg, “Real-time imaging of fluorescent flagellar filaments,” J. Bacteriol. 182(10), 2793–2801 (2000). [CrossRef]  

11. B. G. Hosu, V. S. Nathan, and H. C. Berg, “Internal and external components of the bacterial flagellar motor rotate as a unit,” Proc. Natl. Acad. Sci. U.S.A. 113(17), 4783–4787 (2016). [CrossRef]  

12. C. Boudreau, T. L. Wee, Y. R. Duh, M. P. Couto, K. H. Ardakani, and C. M. Brown, “Excitation Light Dose Engineering to Reduce Photo-bleaching and Photo-toxicity,” Sci. Rep. 6(1), 30892 (2016). [CrossRef]  

13. B. G. Hosu and H. C. Berg, “CW and CCW Conformations of the E. coli Flagellar Motor C-Ring Evaluated by Fluorescence Anisotropy,” Biophys. J. 114(3), 641–649 (2018). [CrossRef]  

14. J. Adams, K. Parulski, and K. Spaulding, “Color processing in digital cameras,” IEEE Micro 18(6), 20–30 (1998). [CrossRef]  

15. Y. Wu, B. G. Hosu, and H. C. Berg, “Microbubbles reveal chiral fluid flows in bacterial swarms,” Proc. Natl. Acad. Sci. U.S.A. 108(10), 4147–4151 (2011). [CrossRef]  

16. B. Scharf, “Real-time imaging of fluorescent flagellar filaments of Rhizobium lupini H13-3: flagellar rotation and pH-induced polymorphic transitions,” J. Bacteriol. 184(21), 5979–5986 (2002). [CrossRef]  

17. J. Mertz and J. Kim, “Scanning light-sheet microscopy in the whole mouse brain with HiLo background rejection,” J. Biomed. Opt. 15(01), 1 (2010). [CrossRef]  

18. N. K. Lee, A. N. Kapanidis, Y. Wang, X. Michalet, J. Mukhopadhyay, R. H. Ebright, and S. Weiss, “Accurate FRET measurements within single diffusing biomolecules using alternating-laser excitation,” Biophys. J. 88(4), 2939–2953 (2005). [CrossRef]  

19. A. N. Kapanidis, N. K. Lee, T. A. Laurence, S. Doose, E. Margeat, and S. Weiss, “Fluorescence-aided molecule sorting: analysis of structure and interactions by alternating-laser excitation of single molecules,” Proc. Natl. Acad. Sci. U.S.A. 101(24), 8936–8941 (2004). [CrossRef]  

20. N. K. Lee, A. N. Kapanidis, H. R. Koh, Y. Korlann, S. O. Ho, Y. Kim, N. Gassman, S. K. Kim, and S. Weiss, “Three-color alternating-laser excitation of single molecules: monitoring multiple interactions and distances,” Biophys. J. 92(1), 303–312 (2007). [CrossRef]  

21. S. J. Holden, S. Uphoff, J. Hohlbein, D. Yadin, L. Le Reste, O. J. Britton, and A. N. Kapanidis, “Defining the limits of single-molecule FRET resolution in TIRF microscopy,” Biophys. J. 99(9), 3102–3111 (2010). [CrossRef]  

22. J. Hohlbein, T. D. Craggs, and T. Cordes, “Alternating-laser excitation: single-molecule FRET and beyond,” Chem. Soc. Rev. 43(4), 1156–1171 (2014). [CrossRef]  

23. S. Farooq and J. Hohlbein, “Camera-based single-molecule FRET detection with improved time resolution,” Phys. Chem. Chem. Phys. 17(41), 27862–27872 (2015). [CrossRef]  

24. B. K. Muller, E. Zaychikov, C. Brauchle, and D. C. Lamb, “Pulsed interleaved excitation,” Biophys. J. 89(5), 3508–3522 (2005). [CrossRef]  

25. E. Lerner, A. Barth, J. Hendrix, B. Ambrose, V. Birkedal, S. C. Blanchard, R. Borner, H. Sung Chung, T. Cordes, T. D. Craggs, A. A. Deniz, J. Diao, J. Fei, R. L. Gonzalez, I. V. Gopich, T. Ha, C. A. Hanke, G. Haran, N. S. Hatzakis, S. Hohng, S. C. Hong, T. Hugel, A. Ingargiola, C. Joo, A. N. Kapanidis, H. D. Kim, T. Laurence, N. K. Lee, T. H. Lee, E. A. Lemke, E. Margeat, J. Michaelis, X. Michalet, S. Myong, D. Nettels, T. O. Peulen, E. Ploetz, Y. Razvag, N. C. Robb, B. Schuler, H. Soleimaninejad, C. Tang, R. Vafabakhsh, D. C. Lamb, C. A. Seidel, and S. Weiss, “FRET-based dynamic structural biology: Challenges, perspectives and an appeal for open-science practices,” eLife 10, e60416 (2021). [CrossRef]  

26. T. Zhao, S. C. Lau, Y. Wang, Y. Su, H. Wang, A. Cheng, K. Herrup, N. Y. Ip, S. Du, and M. M. Loy, “Multicolor 4D Fluorescence Microscopy using Ultrathin Bessel Light Sheets,” Sci. Rep. 6(1), 26159 (2016). [CrossRef]  

27. J. Girstmair, A. Zakrzewski, F. Lapraz, M. Handberg-Thorsager, P. Tomancak, P. G. Pitrone, F. Simpson, and M. J. Telford, “Light-sheet microscopy for everyone? Experience of building an OpenSPIM to study flatworm development,” BMC Dev. Biol. 16(1), 22 (2016). [CrossRef]  

28. J. Licea-Rodriguez, A. Figueroa-Melendez, K. Falaggis, M. Plata-Sanchez, M. Riquelme, and I. Rocha-Mendoza, “Multicolor fluorescence microscopy using static light sheets and a single-channel detection,” J. Biomed. Opt. 24(1), 1 (2019). [CrossRef]  

29. F. Hui, C. T. Nguyen, P. A. Bedggood, Z. He, R. L. Fish, R. Gurrell, A. J. Vingrys, and B. V. Bui, “Quantitative spatial and temporal analysis of fluorescein angiography dynamics in the eye,” PLoS One 9(11), e111330 (2014). [CrossRef]  

30. S.W. Kim, H.S. Lee, and K.D. Lee, “Intraoperative real-time localization of parathyroid gland with near infrared fluorescence imaging,” Gland Surg. 6(5), 516–524 (2017). [CrossRef]  

Supplementary Material (12)

Supplement 1: supplemental text and figures.
Visualization 1: Interlacing artifact. Fluid-tracking particle (micro-bubble) moving around an E. coli cell body attached to an agar substrate. Phase contrast sequence (30 fps), as acquired. The still frame depicted in Fig. 2 in the main text is fra
Visualization 2: Interlacing artifact. Fluid-tracking particle (micro-bubble) moving around an E. coli cell body attached to a thin agar substrate. Upper panels: left – contrast-enhanced version of Visualization 1 (30 fps); middle – pseudo-colored ve
Visualization 3: E. coli with fluorescently labeled filaments swimming close to an agar substrate. Combined phase contrast/fluorescence sequence, as acquired.
Visualization 4: E. coli with fluorescently labeled filaments swimming close to an agar substrate. See also Fig. 2. Top left: pseudo-colored (fluorescence in green, phase contrast in red) version of Visualization 3. Top right: combined sequence, de i
Visualization 5: E. coli cell bodies attached to an agar substrate, with fluorescently labeled filaments moving fluid tracking particles. Combined phase contrast/fluorescence sequence, as acquired.
Visualization 6: E. coli cell bodies attached to an agar substrate, with fluorescently labeled filaments moving fluid tracking particles. Top left: contrast-enhanced version of Visualization 5. Top right: combined sequence, de interlaced and interpol
Visualization 7: E. coli cell bodies attached to a thin layer of agar substrate, with fluorescently labeled filaments moving fluid tracking particles. Combined phase contrast/fluorescence sequence, as acquired.
Visualization 8: E. coli cell bodies attached to a thin layer of agar substrate, with fluorescently labeled filaments moving fluid tracking particles. Pseudo-colored synchronized representations of the combined, phase contrast and fluorescence sequen
Visualization 9: E. coli cells with fluorescently labeled motors imaged by strobed synchronized phase contrast/fluorescence acquisition using an EMCCD camera. Synchronized sequences 360 consecutive phase contrast (left) and fluorescence (middle) fram
Visualization 10: Each frame represents averages of about 15 non-consecutive phase contrast (left), fluorescence (middle) and pseudo-colored overlaid phase contrast/fluorescence (right) frames from Visualization 9 in which the rotating cell in the lo
Visualization 11: Each frame represents averages of about 30 non-consecutive phase contrast (left), fluorescence (middle) and pseudo-colored overlaid phase contrast/fluorescence (right) frames from Visualization 9 in which the rotating cell in the lo
