Abstract

A novel optical coherence tomography system that can perform scanless two-dimensional imaging without a Fourier transform is proposed and demonstrated. In the system, a convex cylindrical mirror generates an extended spatial distribution of optical delay in the reference arm, and a cylindrical lens forms a focused line beam in the sample arm. A charge-coupled device camera captures the two-dimensional tomographic image of a sample in a snapshot manner. Due to its simple configuration and operation, the system is suitable for developing a compact device for tomographic imaging and measurement.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Optical coherence tomography (OCT) has become one of the most rapidly developing optical imaging techniques for clinical applications over the past decades [1,2]. Meanwhile, the development of compact devices imposes requirements such as simple design and operation without mechanical motion [3-5]. An OCT system with a simple configuration and operation is therefore potentially suitable for the development of new types of compact devices and would benefit personal healthcare as well as industrial applications [6].

OCT is a noninvasive tomographic imaging technique with micron-scale resolution [7]. In the early development of OCT, most systems were constructed in the time domain, in which an axial scanning mechanism in the reference arm was required; this scanning mechanism may limit the imaging speed and range. In contrast to time-domain OCT (TD-OCT), Fourier-domain OCT (FD-OCT) can perform cross-sectional imaging without an axial scan in the reference arm: the optical delay component is replaced by a fixed reflection mirror and the photodetector by a spectrometer [8,9]. This type of OCT is also called spectral-domain OCT (SD-OCT), in which the cross-sectional image is obtained after a Fourier transform of the spectral interferogram measured by the spectrometer. Later, another type of FD-OCT was developed in which a time-encoded swept source is used as the light source [10,11]. This type of OCT is also called swept-source OCT (SS-OCT), in which a photodetector measures the spectral components during the sweep to construct the spectral interferogram.

In comparison to TD-OCT, FD-OCT possesses the advantages of high speed and high sensitivity. However, the axial imaging range of FD-OCT is limited by the resolution of the spectral interferogram. Furthermore, mirror images and autocorrelation noise result from the Fourier transform of the spectral interferogram and need to be resolved by post-processing methods [12-14]. Recently, various full-field and single-shot OCT techniques have been proposed to improve the imaging speed by eliminating scanning. However, most of these techniques can only provide en face images or require a Fourier transform in the image processing [15-22]. In addition, grating-based scanless optical delay in the Littrow configuration, combined with either spot- or line-based sample scanning in a time-domain configuration, has been demonstrated by several groups [23,24].

In this paper, a spatial-domain OCT system that can perform scanless two-dimensional imaging without a Fourier transform is proposed and demonstrated. Because of its simple configuration and operation, and because there is no scanning or moving part in the system, it has the potential for the development of a compact OCT device when combined with a compact light source.

2. Experimental setup

Figure 1 shows the layout and experimental setup of the scanless, transform-free spatial-domain OCT system. A mode-locked Ti:sapphire laser (FEMTOLASERS, INTEGRAL PRO) with an output power of 122 mW was used as the light source. The central wavelength and full width at half maximum of the output spectrum were 800 nm and 138 nm, respectively. An adjustable neutral-density filter (NDF) was introduced to adjust the laser power, and a beam expander was used to expand the laser beam. The power of the laser beam incident on the beamsplitter was measured to be 87 mW. The laser beam was then split by a beamsplitter into sample and reference beams. In the reference arm, an NDF was introduced to adjust the power of the reference beam, and a dispersion compensator was used to compensate for the dispersion mismatch between the sample and reference beams. A convex cylindrical mirror reflected and expanded the reference beam, resulting in an extended spatial distribution of optical delay along the Z-direction in Fig. 1. The portion of the reflected reference beam that would not fall within the sensing range of the charge-coupled device (CCD) camera was blocked. In the sample arm, a cylindrical lens with a focal length of 50 mm focused the sample beam onto the sample in the Y-direction to form a focused line beam along the X-direction in Fig. 1. Another NDF was introduced between the beamsplitter and the cylindrical lens in the sample arm when a sample with high reflectivity was imaged. At the detection end, the two-dimensional tomographic image of the sample was captured with a CCD camera (Basler, piA1900-32gm) with a bit depth of 12 bits and a full-well capacity of 40,000 e−.

 

Fig. 1. The (a) layout and (b) experimental setup of the spatial-domain OCT system. L, light source; NDF1, NDF2 and NDF3, neutral-density filters; BE, beam expander; BS, beamsplitter; BB, beam block; DC, dispersion compensator; CM, cylindrical mirror; CL, cylindrical lens; SS, sample stage; CCD, CCD camera.


In such an arrangement, the distribution of optical delay of the reflected reference beam along the Z-direction in Fig. 1 is effectively equivalent to the axial scan of TD-OCT, performed in a snapshot manner. Therefore, the interferogram of each A-scan is directly presented on a horizontal line of pixels of the CCD camera. On the other hand, different pixels in a vertical line (along the X-direction in Fig. 1) of the CCD camera correspond to different lateral positions on the sample with the same optical delay. Therefore, the lateral structure of the sample at a given depth is presented on each vertical line of pixels of the CCD camera.

3. Analysis of optical delay

In the reference arm of our system, a cylindrical mirror was used to generate a distribution of optical delay of the reference beam. In the arrangement illustrated in Fig. 2(a), a ray incident along the central axis of the cylindrical mirror is reflected at B and impinges on the CCD camera at A. A ray parallel to the central axis and incident on the cylindrical mirror at D is reflected along DE and impinges on the CCD camera at E. The optical delay, defined as the path difference between the two rays, depends on the incident position on the cylindrical mirror and can be represented as

$$\begin{aligned} \textrm{Optical delay} &= CD + DE - a\\ &= \frac{{2\textrm{[(}{R^\textrm{2}} - {d^2}\textrm{)(}R - \sqrt {{R^2} - {d^2}} \textrm{)} + a{d^2}\textrm{]}}}{{{R^2} - 2{d^2}}}, \end{aligned}$$
where a is the distance between the CCD camera and the surface of the cylindrical mirror along its central axis, i.e., AB; R is the radius of curvature of the cylindrical mirror; and d is the deviation of the incident ray relative to the central axis, i.e., BC. The influence of optical components between the cylindrical mirror and the CCD camera is neglected. The resulting optical delay increases as the deviation d increases and as the radius of curvature R of the cylindrical mirror decreases. However, the width of the reflected reference beam incident on the CCD camera must remain within the sensing range of the camera. For a ray incident on the cylindrical mirror at D in Fig. 2(a), the reflected ray impinges on the CCD camera at E, and the distance between A and E can be represented as
$$\begin{aligned} AE &= d + \textrm{(}AB + CD\textrm{)}\tan 2\theta \\ &= \frac{{2d\textrm{(}a + R\textrm{)}\sqrt {{R^2} - {d^2}} - {R^2}d}}{{{R^2} - 2{d^2}}}. \end{aligned}$$

 

Fig. 2. (a) Configuration for calculation of the relation between the optical delay of the reference beam and its incident position on the cylindrical mirror. CM, cylindrical mirror; CCD, CCD camera. (b) The relation between the optical delay of the reference beam and its incident position on the CCD camera, as well as the horizontal pixel number of the CCD camera, for cylindrical mirrors with different radii of curvature.


The above equation provides the upper limit of the deviation d such that AE does not exceed the horizontal sensing range of the CCD camera.

In our system, the distance between the CCD camera and the cylindrical mirror is about 135 mm. The pixel size of the CCD camera is 7.4 µm × 7.4 µm and the pixel count is 1920 × 1080, giving a sensing range of about 14 mm × 8 mm. Using Eqs. (1) and (2), the relation between the resulting optical delay of the reference beam and its incident position on the CCD camera, i.e., AE, can be estimated for different values of the radius of curvature of the cylindrical mirror, as shown in Fig. 2(b). The resulting optical delay ranges between 490 and 607 µm for radii of curvature between 11.9 mm and 91.9 mm. In our system, the radius of curvature of the cylindrical mirror was chosen to be 51.9 mm, such that a maximum optical delay of 557 µm was expected. The reason for using a cylindrical mirror instead of a tilted plane mirror is twofold. First, since the resulting optical delay decreases as the radius of curvature increases, it is difficult for a tilted plane mirror to produce an optical delay over a practical range. Second, with a tilted plane mirror, the reflected reference beam would not overlap effectively with the reflected sample beam. Although the relation between the optical delay and the horizontal pixel number of the CCD camera is not linear, the nonlinearity is minor and can be calibrated with interpolation and resampling.
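As a sanity check, Eqs. (1) and (2) can be evaluated numerically. The sketch below, with the parameter values quoted above (a = 135 mm, R = 51.9 mm), scans the deviation d and reports the largest deviation whose reflected ray still lands within the ~14 mm horizontal sensing range, and the delay generated there; the exact figures are indicative only, since Eqs. (1) and (2) neglect the optics between the mirror and the CCD.

```python
import numpy as np

def optical_delay(d, R, a):
    """Eq. (1): optical delay vs. deviation d (all lengths in mm)."""
    s = np.sqrt(R**2 - d**2)
    return 2.0 * ((R**2 - d**2) * (R - s) + a * d**2) / (R**2 - 2.0 * d**2)

def incident_position(d, R, a):
    """Eq. (2): distance AE on the CCD for a ray at deviation d."""
    s = np.sqrt(R**2 - d**2)
    return (2.0 * d * (a + R) * s - R**2 * d) / (R**2 - 2.0 * d**2)

# Parameter values quoted in the text (mm).
a, R = 135.0, 51.9
d = np.linspace(0.0, 3.0, 301)

ae = incident_position(d, R, a)
delay_um = 1000.0 * optical_delay(d, R, a)

# Largest deviation whose reflected ray still lands within the ~14 mm
# horizontal sensing range of the CCD, and the delay generated there.
mask = ae <= 14.0
print(f"d_max ≈ {d[mask][-1]:.2f} mm, delay there ≈ {delay_um[mask][-1]:.0f} µm")
```

The same functions confirm the two trends used in the text: the delay grows with the deviation d and shrinks as the radius of curvature R increases, which is why a tilted plane mirror (R → ∞) cannot provide a practical delay range.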

4. Image processing

The interference signal captured by the CCD camera in the system can be represented as

$${I_\textrm{O}} = {I_\textrm{S}} + {I_\textrm{R}} + 2{\mathop{\rm Re}\nolimits} [{E_\textrm{S}}E_\textrm{R}^\ast ] + {I_\textrm{B}},$$
where IS and IR are DC terms resulting from the autocorrelation of the sample and reference beams, and ES and ER are the electric fields from the sample and reference arms, respectively. The third term in Eq. (3) represents the cross-correlation of the sample and reference beams and thus carries the tomographic structure information of the sample. IB denotes the background signal resulting from reflections at the interfaces of the optical components in the system and the interference among those reflected beams. Because IB is not uniform across the image plane, the image quality can be enhanced by removing it. When the sample beam is blocked before impinging on the sample, the signal captured by the CCD camera can be represented as
$${I_1} = {I_\textrm{R}} + {I_\textrm{B}}.$$
Similarly, when the reference beam is blocked before impinging on the cylindrical mirror, the signal captured by the CCD camera can be represented as
$${I_2} = {I_\textrm{S}} + {I_\textrm{B}}.$$
The background signal IB can be obtained by blocking both the sample and reference beams before they impinge on the sample and the cylindrical mirror, respectively. The quality-enhanced image can then be obtained using
$$I = |{I_\textrm{O}} - {I_\textrm{1}} - {I_\textrm{2}} + {I_\textrm{B}}|.$$
Note that the absolute value of the signal is taken to obtain the final image, instead of applying a Hilbert transform to demodulate the interferogram, so that the processing speed is significantly improved without loss of information. Figures 3(a)-3(d) show the signals IO, I1, I2 and IB measured with the system, where a reflection mirror was used as the sample. The exposure time of the CCD camera was set to 30 ms and the frame rate was 30 fps. Figures 3(e)-3(h) show the corresponding one-dimensional intensity distributions of the portions indicated with the dashed lines in Figs. 3(a)-3(d). The OCT image of the reflection mirror after image processing with Eq. (6) is shown in Fig. 4(a). Figure 4(b) shows the processed one-dimensional intensity distribution indicated with the dashed line in Fig. 4(a), from which the signal-to-noise ratio (SNR) can be estimated as [25]
$$SNR = 20\log \left( {\frac{{{I_{samp}}}}{{{\sigma_{bg}}}}} \right) = 20\log \left( {\frac{{1904}}{{11.47}}} \right) = 44.40\textrm{ dB}.$$
The SNR of the OCT image can be further improved by summing multiple measurements, in which the contribution of random noise is effectively suppressed. Figure 5(a) shows the relation between the resulting SNR and the number of measurements, where the mean values and standard deviations over 20 repeated experiments are presented. The SNR improves significantly with an increasing number of measurements at first and approaches a constant once the number of measurements exceeds six. The average of ten OCT images of the reflection mirror after image processing with Eq. (6) is shown in Fig. 5(b). Figure 5(c) shows the processed one-dimensional intensity distribution indicated with the dashed line in Fig. 5(b), from which the full width at half maximum is estimated to be about 2.2 µm, comparable to the axial-resolution limit set by the light source, and the SNR can be estimated as
$$SNR = 20\log \left( {\frac{{{I_{samp}}}}{{{\sigma_{bg}}}}} \right) = 20\log \left( {\frac{{1828}}{{6.75}}} \right) = 48.65\textrm{ dB}.$$
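The processing chain of Eqs. (3)-(8) can be sketched with synthetic frames. The arrays below are illustrative stand-ins for the measured camera images (not real data), while the final line checks the SNR arithmetic against the values quoted in Eqs. (7) and (8).

```python
import numpy as np

def remove_background(I_O, I_1, I_2, I_B):
    """Eq. (6): isolate the cross-correlation term of the interferogram."""
    return np.abs(I_O - I_1 - I_2 + I_B)

def snr_db(peak, sigma_bg):
    """Eqs. (7)-(8): SNR from the signal peak and the background std. dev."""
    return 20.0 * np.log10(peak / sigma_bg)

# Synthetic frames following Eq. (3): I_O = I_S + I_R + cross term + I_B.
rng = np.random.default_rng(0)
shape = (8, 64)
I_S, I_R, I_B = (rng.uniform(0.0, 100.0, shape) for _ in range(3))
cross = rng.uniform(-50.0, 50.0, shape)   # stands in for 2 Re[E_S E_R*]
I_O = I_S + I_R + cross + I_B

I_1 = I_R + I_B                           # Eq. (4): sample beam blocked
I_2 = I_S + I_B                           # Eq. (5): reference beam blocked
I = remove_background(I_O, I_1, I_2, I_B)
print(np.allclose(I, np.abs(cross)))      # True: only the cross term survives

# The peak and background values quoted in the text reproduce the reported SNRs.
print(round(snr_db(1904, 11.47), 2), round(snr_db(1828, 6.75), 2))  # 44.4 48.65
```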

 

Fig. 3. (a) IO, (b) I1, (c) I2 and (d) IB measured by the CCD camera, where a reflection mirror was used as the sample. (e)-(h) The corresponding one-dimensional intensity distributions of the portions indicated with the dashed lines in (a)-(d).


 

Fig. 4. (a) OCT image of a reflection mirror after image processing. (b) The one-dimensional intensity distribution indicated with the dashed line in (a).


 

Fig. 5. (a) The relation between the resulting SNR and the number of measurements. (b) The average of ten OCT images of a reflection mirror after image processing. (c) The one-dimensional intensity distribution indicated with the dashed line in (b).


The side lobes of the interference signal in Fig. 5(c) are attributed to the output spectrum of the light source and the dispersion mismatch between the sample and reference beams. In this experiment, an NDF with an optical density of 3.6 dB was introduced in the sample arm to keep the signal power within the dynamic range of the CCD camera. The incident power on the reflection mirror was estimated to be about 0.01 mW. When the effect of the NDF is taken into account, the maximum available SNR of the system is estimated to reach 120.65 dB [25]. Therefore, the sensitivity of the system can be further improved with a camera of higher dynamic range. Note that, owing to speckle cross-talk when a spatially coherent Ti:sapphire laser is used as the light source, the sensitivity and dynamic range of the system will be reduced when imaging biological specimens.

5. Image calibration and determination of specifications

Aberration caused by the curvature of the cylindrical mirror in the reference arm results in artifacts in the reconstructed OCT images. To verify the relation between the optical delay of the reference beam and the horizontal pixel number of the CCD camera, and thus to obtain a basis for calibrating the nonlinearity so that these artifacts can be compensated, a series of OCT images of a reflection mirror placed on a translation stage for longitudinal movement in the sample arm was measured. The longitudinal displacement of the reflection mirror between two successive measurements was 10 µm, and the total displacement was 500 µm. All OCT images of the reflection mirror at different positions were merged, as shown in Fig. 6(a). The position of each intensity peak was identified as the surface of the reflection mirror, from which the relation between the axial depth in a sample (corresponding to optical delay) and the horizontal pixel number of the CCD camera was obtained, as shown in Fig. 6(b). This relation is consistent with that shown in Fig. 2(b). The nonlinearity can be calibrated with interpolation and resampling to obtain the linearized relation shown in Figs. 6(c) and 6(d). The same interpolation and resampling were applied to the other measured OCT images to obtain linearized images. The axial imaging range is limited by the optical delay component and the sensing range of the CCD camera and can be extended by using a cylindrical mirror with larger curvature in the reference arm or a CCD camera with a larger sensing range. However, because the sampling density is not uniform along the axial depth, undersampling may occur at large axial depths.
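The interpolation-and-resampling step can be sketched with np.interp, which maps each A-scan (one CCD row) from the nonuniform depth grid onto a uniform one. The pixel-to-depth curve used here is an illustrative monotonic stand-in for the measured calibration of Fig. 6(b), not the actual data.

```python
import numpy as np

# Illustrative (not measured) pixel-to-depth calibration: depth grows
# slightly faster than linearly with horizontal pixel number.
n_pix = 1920
pixels = np.arange(n_pix)
depth_measured = 500.0 * (pixels / (n_pix - 1)) ** 1.2   # µm, monotonic

# Uniform depth axis to resample onto.
depth_uniform = np.linspace(0.0, depth_measured[-1], n_pix)

def linearize(a_scan, depth_measured, depth_uniform):
    """Resample one A-scan onto a uniform depth grid."""
    return np.interp(depth_uniform, depth_measured, a_scan)

# A reflector at 250 µm depth, recorded on the nonuniform grid, lands at
# the correct position once the A-scan is linearized.
a_scan = np.exp(-((depth_measured - 250.0) / 5.0) ** 2)
corrected = linearize(a_scan, depth_measured, depth_uniform)
peak_um = depth_uniform[np.argmax(corrected)]
print(f"peak after linearization at ≈ {peak_um:.0f} µm")   # ≈ 250 µm
```

The same mapping applied row by row linearizes a full B-scan; because the sampling density of the measured grid drops at large depths, the resampled tail is the part prone to undersampling, as noted above.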

 

Fig. 6. Merged image of a series of OCT images of a reflection mirror at different axial positions before (a) and after (c) calibration of nonlinearity. The relation between axial depth in a sample and the horizontal pixel number of CCD camera before (b) and after (d) calibration of nonlinearity.


Although the achievable SNR and axial resolution of the system have been verified as discussed above, the SNR and axial resolution for a sample at different axial positions are further limited by the point spread function of the focused sample beam and by the dispersion mismatch between the sample and reference beams, respectively. Figure 7(a) shows the one-dimensional intensity distributions of the images of a reflection mirror at different axial positions. The SNR and axial resolution at different axial depths are shown in Figs. 7(b) and 7(c), respectively. The maximum SNR occurs near the focal plane of the cylindrical lens. The variation of the axial resolution may result from the dispersion mismatch between the sample and reference beams.

 

Fig. 7. (a) One-dimensional intensity distributions of the OCT images of a reflection mirror at different axial positions. (b) SNR and (c) axial resolution for a sample at different axial depths.


The lateral imaging range and resolution of the system were measured using a 1951 USAF resolution test target (Edmund Optics, Stock Number 38-257) as the sample, as shown in Fig. 8(a). Figure 8(b) shows the image of a portion of Group 0 of the resolution test target when the reference beam was blocked. The line width of Element 5 is 314 µm, which corresponds to 48 pixels on the CCD camera. Therefore, the lateral imaging range of the system is estimated to be about 7 mm, as there are 1080 pixels in the vertical direction of the CCD camera. With a plano-concave cylindrical lens placed in front of the CCD camera to magnify the image in the lateral direction, higher lateral resolution can be achieved at the expense of a reduced lateral imaging range. The effect of lateral magnification is presented with the images of Elements 3 and 4 of Group 5 of the resolution test target, shown in Figs. 8(c) and 8(d), respectively. The lines with diffraction fringes can be resolved, such that the lateral resolution of the system is estimated to be better than 11 µm.
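The numbers in this paragraph follow from the standard USAF-1951 target geometry, in which the line width in µm is 500 / 2^(group + (element − 1)/6), and from simple proportionality between pixel count and the known line width. A quick check:

```python
# Standard USAF-1951 resolution target: line width of a given element in µm.
def usaf_linewidth_um(group, element):
    lp_per_mm = 2.0 ** (group + (element - 1) / 6.0)   # line pairs per mm
    return 1000.0 / (2.0 * lp_per_mm)                  # one line of the pair

# Group 0, Element 5: the ~314 µm line used to scale the image.
print(round(usaf_linewidth_um(0, 5)))          # 315 (quoted as 314 µm)

# 48 pixels per 314 µm line → lateral range covered by 1080 pixels.
print(round(1080 * 314 / 48 / 1000, 1))        # 7.1, i.e. about 7 mm

# Group 5, Element 4 sets the ~11 µm lateral-resolution estimate.
print(round(usaf_linewidth_um(5, 4), 1))       # 11.0 µm
```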

 

Fig. 8. (a) Image of the 1951 USAF resolution test target. (b) Image of the marked region of Group 0 when the reference beam was blocked. (c-d) Images of Elements 3 and 4 of Group 5, respectively, where a plano-concave cylindrical lens was placed in front of the CCD camera to enlarge the image in the lateral direction.


As mentioned above, the lateral resolution can be improved at the expense of a reduced lateral imaging range by placing a plano-concave cylindrical lens in front of the CCD camera to magnify the image in the lateral direction. By varying the radius of curvature or the position of the plano-concave cylindrical lens, different lateral magnifications are achieved, resulting in different lateral resolutions and imaging ranges. Figure 9 shows the relation between the magnification, the resolution, and the imaging range in the lateral direction.

 

Fig. 9. The relation between the magnification, the resolution and the imaging range in the lateral direction when a plano-concave cylindrical lens is placed in front of the CCD camera.


6. Results

A leaf of Ficus subpisocarpa and human fingertips were imaged with the system, where a plano-concave cylindrical lens was placed in front of the CCD camera to magnify the image in the lateral direction (the X-direction in Fig. 1, or the vertical direction on the CCD camera) so that the details of the images could be clearly identified. The effect of introducing the plano-concave cylindrical lens is a laterally magnified projection of a portion of the image onto the CCD camera without affecting the image in the axial direction (the Z-direction in Fig. 1, or the horizontal direction on the CCD camera). Therefore, the field of view in the lateral direction is reduced without significantly affecting the SNR of the image. For the imaging of the leaf of Ficus subpisocarpa, an NDF with an optical density of 0.3 dB was introduced in the sample arm and the incident power on the sample was about 21 mW. For the imaging of the fingertips, the NDF was removed and the incident power on the sample was about 42 mW. The exposure time of the CCD camera was set to 50 ms and the frame rate was 20 fps. After compensating for the aberration due to the curvature of the reference-arm mirror and for dispersion [26,27], the OCT and microscopic images of a leaf of Ficus subpisocarpa are shown in Fig. 10. In the OCT image of the leaf, the structures of the cuticle proper, epidermal cells, vascular bundle, and spongy parenchyma are clearly observed and consistent with the microscopic image [28]. Figure 11 shows the images of human fingertips, in which the epidermis, revealing the profile of the fingerprint, and sweat ducts can be identified, as shown in Fig. 11(a). When the focal plane of the cylindrical lens in the sample arm was moved deeper into the sample, the structure of the dermis was also observed, as shown in Fig. 11(b).
Since the movement of the cylindrical lens only changes the position of the focal plane, the lateral field of view determined by the length of the focused line beam is not affected, and the distribution of SNR is similar to that shown in Fig. 7 with a lateral shift, the highest SNR occurring near the focal plane. However, there are discontinuities and speckles in the image of the finger surface, which are attributed to cross-talk and multiple scattering as well as movement of the sample during measurement. These effects commonly occur in parallel or full-field OCT with either a time-domain or a Fourier-domain configuration, especially when a spatially coherent light source such as a Ti:sapphire laser is used to image biological specimens [29-32]. These artifacts directly affect the sensitivity and dynamic range of full-field OCT in practical applications and are more severe in the out-of-focus illumination region.

 

Fig. 10. (a) OCT image and (b) microscopic image of a leaf of Ficus subpisocarpa. CP, cuticle proper; EC, epidermal cells; VB, vascular bundle; SP, spongy parenchyma. The scale bars correspond to 100 µm.


 

Fig. 11. (a) OCT image of a human fingertip. ED, epidermis; SD, sweat duct. (b) OCT image of a human fingertip when the focal plane of the cylindrical lens was moved deeper inside the sample. DE, dermis. The scale bars correspond to 100 µm.


7. Conclusions

A transform-free spatial-domain OCT system was proposed and demonstrated that performs two-dimensional tomographic imaging without any scanning mechanism. Since it is a spatial-domain system, no Fourier transform or other post-processing for mirror-image removal is needed. The imaging range of the system can be as large as 7 mm × 500 µm. The axial resolution was estimated to be 2.2 µm. With a plano-concave cylindrical lens placed in front of the CCD camera to magnify the image in the lateral direction, the lateral resolution of the system was estimated to be better than 11 µm. The measured SNR was 48.65 dB when a reflection mirror was used as the sample and an NDF with an optical density of 3.6 dB was introduced in the sample arm. The sensitivity is expected to be further improved by using a camera with a higher dynamic range. The system can perform real-time imaging, with the imaging speed determined entirely by the frame rate of the camera. Because of its simple configuration and the absence of moving parts, the system lends itself to miniaturization into a compact device suitable for further biomedical applications.

Funding

Ministry of Science and Technology, Taiwan (102-2632-M-033-001-MY3, 108-2112-M-033-007, 99-2627-B-033-003).

Disclosures

The authors declare that there are no conflicts of interest related to this article.

References

1. W. Drexler, M. Liu, A. Kumar, T. Kamali, A. Unterhuber, and R. A. Leitgeb, “Optical coherence tomography today: speed, contrast, and multimodality,” J. Biomed. Opt. 19(7), 071412 (2014). [CrossRef]  

2. R. Dsouza, J. Won, G. L. Monroy, D. R. Spillman, and S. A. Boppart, “Economical and compact briefcase spectral-domain optical coherence tomography system for primary care and point-of-care applications,” J. Biomed. Opt. 23(09), 1 (2018). [CrossRef]  

3. A. Pantelopoulos and N. G. Bourbakis, “A survey on wearable sensor-based systems for health monitoring and prognosis,” IEEE Trans. Syst. Man Cybern. Part C 40(1), 1–12 (2010). [CrossRef]  

4. D. Raskovic, T. Martin, and E. Jovanov, “Medical monitoring applications for wearable computing,” Comput. J. 47(4), 495–504 (2004). [CrossRef]  

5. U. Anliker, J. Beutel, M. Dyer, R. Enzler, P. Lukowicz, L. Thiele, and G. Troster, “A systematic approach to the design of distributed wearable systems,” IEEE Trans. Comput. 53(8), 1017–1033 (2004). [CrossRef]  

6. N. H. Cho, K. Park, J. Y. Kim, Y. Jung, and J. Kim, “Quantitative assessment of touch-screen panel by nondestructive inspection with three-dimensional real-time display optical coherence tomography,” Opt. Lasers Eng. 68, 50–57 (2015). [CrossRef]  

7. D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, and C. A. Puliafito, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991). [CrossRef]  

8. A. F. Fercher, C. K. Hitzenberger, G. Kamp, and S. Y. El-Zaiat, “Measurement of intraocular distances by backscattering spectral interferometry,” Opt. Commun. 117(1-2), 43–48 (1995). [CrossRef]  

9. B. Potsaid, I. Gorczynska, V. J. Srinivasan, Y. Chen, J. Jiang, A. Cable, and J. G. Fujimoto, “Ultrahigh speed spectral/Fourier domain OCT ophthalmic imaging at 70,000 to 312,500 axial scans per second,” Opt. Express 16(19), 15149–15169 (2008). [CrossRef]  

10. S. R. Chinn, E. A. Swanson, and J. G. Fujimoto, “Optical coherence tomography using a frequency-tunable optical source,” Opt. Lett. 22(5), 340–342 (1997). [CrossRef]  

11. F. Lexer, C. K. Hitzenberger, A. F. Fercher, and M. Kulhavy, “Wavelength-tuning interferometry of intraocular distances,” Appl. Opt. 36(25), 6548–6553 (1997). [CrossRef]  

12. M. A. Choma, M. V. Sarunic, C. Yang, and J. A. Izatt, “Sensitivity advantage of swept source and Fourier domain optical coherence tomography,” Opt. Express 11(18), 2183–2189 (2003). [CrossRef]  

13. R. A. Leitgeb, C. K. Hitzenberger, A. F. Fercher, and T. Bajraszewski, “Phase-shifting algorithm to achieve high-speed long-depth-range probing by frequency-domain optical coherence tomography,” Opt. Lett. 28(22), 2201–2203 (2003). [CrossRef]  

14. M. Zhang, L. Ma, and P. Yu, “Spatial convolution for mirror image suppression in Fourier domain optical coherence tomography,” Opt. Lett. 42(3), 506–509 (2017). [CrossRef]  

15. Y. Yasuno, T. Endo, S. Makita, G. Aoki, M. Itoh, and T. Yatagai, “Three-dimensional line-field Fourier domain optical coherence tomography for in vivo dermatological investigation,” J. Biomed. Opt. 11(1), 014014 (2006). [CrossRef]  

16. E. Beaurepaire, A. C. Boccara, M. Lebec, L. Blanchot, and H. Saint-Jalmes, “Full-field optical coherence microscopy,” Opt. Lett. 23(4), 244–246 (1998). [CrossRef]  

17. C. Dunsby, Y. Gu, and P. M. W. French, “Single-shot phase-stepped wide-field coherence-gated imaging,” Opt. Express 11(2), 105–115 (2003). [CrossRef]  

18. S. Witte, M. Baclayon, E. J. Peterman, R. F. Toonen, H. D. Mansvelder, and M. L. Groot, “Single-shot two-dimensional full-range optical coherence tomography achieved by dispersion control,” Opt. Express 17(14), 11335–11349 (2009). [CrossRef]  

19. T. Butler, S. Slepneva, B. O’Shaughnessy, B. Kelleher, D. Goulding, S. P. Hegarty, H.-C. Lyu, K. Karnowski, M. Wojtkowski, and G. Huyet, “Single shot, time-resolved measurement of the coherence properties of OCT swept source lasers,” Opt. Lett. 40(10), 2277–2280 (2015). [CrossRef]  

20. P. Koch, G. Hüttmann, H. Schleiermacher, J. Eichholz, and E. Koch, “Linear optical coherence tomography system with a downconverted fringe pattern,” Opt. Lett. 29(14), 1644–1646 (2004). [CrossRef]  

21. T. U. Nguyen, M. C. Pierce, L. Higgins, and T. S. Tkaczyk, “Snapshot 3D optical coherence tomography system using image mapping spectrometry,” Opt. Express 21(11), 13758–13772 (2013). [CrossRef]  

22. H. M. Subhash, “Full-field and single-shot full-field optical coherence tomography: a novel technique for biomedical imaging applications,” Adv. Opt. Tech. 2012, 1–26 (2012). [CrossRef]  

23. I. Zeylikovich, A. Gilerson, and R. R. Alfano, “Nonmechanical grating-generated scanning coherence microscopy,” Opt. Lett. 23(23), 1797–1799 (1998). [CrossRef]  

24. Y. Watanabe, K. Yamada, and M. Sato, “Three-dimensional imaging by ultrahigh-speed axial-lateral parallel time domain optical coherence tomography,” Opt. Express 14(12), 5201–5209 (2006). [CrossRef]  

25. A. Agrawal, T. J. Pfefer, P. D. Woolliams, P. H. Tomlins, and G. Nehmetallah, “Methods to assess sensitivity of optical coherence tomography systems,” Biomed. Opt. Express 8(2), 902–917 (2017). [CrossRef]  

26. I. J. Hsu, C. W. Sun, C. W. Lu, C. C. Yang, C. P. Chiang, and C. W. Lin, “Resolution improvement with dispersion manipulation and a retrieval algorithm in optical coherence tomography,” Appl. Opt. 42(2), 227–234 (2003). [CrossRef]  

27. I. J. Hsu, C. W. Sun, C. W. Lu, C. C. Yang, C. P. Chiang, and C. W. Lin, “Process algorithms for resolution improvement and contrast enhancement in optical coherence tomography,” Opt. Rev. 10(6), 567–571 (2003). [CrossRef]  

28. B. Chantarasuwan, P. Baas, B. J. van Heuven, C. Baider, and P. C. van Welzen, “Leaf anatomy of Ficus subsection Urostigma (Moraceae),” Bot. J. Linn. Soc. 175(2), 259–281 (2014). [CrossRef]  

29. P. Stremplewski, E. Auksorius, P. Wnuk, Ł Kozoń, P. Garstecki, and M. Wojtkowski, “In vivo volumetric imaging by crosstalk-free full-field OCT,” Optica 6(5), 608–617 (2019). [CrossRef]  

30. B. Karamata, P. Lambelet, M. Laubscher, R. P. Salathé, and T. Lasser, “Spatially incoherent illumination as a mechanism for cross-talk suppression in wide-field optical coherence tomography,” Opt. Lett. 29(7), 736–738 (2004). [CrossRef]  

31. A. H. Dhalla, J. V. Migacz, and J. A. Izatt, “Crosstalk rejection in parallel optical coherence tomography using spatially incoherent illumination with partially coherent sources,” Opt. Lett. 35(13), 2305–2307 (2010). [CrossRef]  

32. J. Ogien and A. Dubois, “A compact high-speed full-field optical coherence microscope for high-resolution in vivo skin imaging,” J. Biophotonics 12(2), e201800208 (2019). [CrossRef]  


2019 (2)

P. Stremplewski, E. Auksorius, P. Wnuk, Ł Kozoń, P. Garstecki, and M. Wojtkowski, “In vivo volumetric imaging by crosstalk-free full-field OCT,” Optica 6(5), 608–617 (2019).
[Crossref]

J. Ogien and A. Dubois, “A compact high-speed full-field optical coherence microscope for high-resolution in vivo skin imaging,” J. Biophotonics 12(2), e201800208 (2019).
[Crossref]

2018 (1)

R. Dsouza, J. Won, G. L. Monroy, D. R. Spillman, and S. A. Boppart, “Economical and compact briefcase spectral-domain optical coherence tomography system for primary care and point-of-care applications,” J. Biomed. Opt. 23(09), 1 (2018).
[Crossref]

2017 (2)

2015 (2)

T. Butler, S. Slepneva, B. O’Shaughnessy, B. Kelleher, D. Goulding, S. P. Hegarty, H.-C. Lyu, K. Karnowski, M. Wojtkowski, and G. Huyet, “Single shot, time-resolved measurement of the coherence properties of OCT swept source lasers,” Opt. Lett. 40(10), 2277–2280 (2015).
[Crossref]

N. H. Cho, K. Park, J. Y. Kim, Y. Jung, and J. Kim, “Quantitative assessment of touch-screen panel by nondestructive inspection with three-dimensional real-time display optical coherence tomography,” Opt. Lasers Eng. 68, 50–57 (2015).
[Crossref]

2014 (2)

W. Drexler, M. Liu, A. Kumar, T. Kamali, A. Unterhuber, and R. A. Leitgeb, “Optical coherence tomography today: speed, contrast, and multimodality,” J. Biomed. Opt. 19(7), 071412 (2014).
[Crossref]

B. Chantarasuwan, P. Baas, B. J. van Heuven, C. Baider, and P. C. van Welzen, “Leaf anatomy of Ficus subsection Urostigma (Moraceae),” Bot. J. Linn. Soc. 175(2), 259–281 (2014).
[Crossref]

2013 (1)

2012 (1)

H. M. Subhash, “Full-field and single-shot full-field optical coherence tomography: a novel technique for biomedical imaging applications,” Adv. Opt. Tech. 2012, 1–26 (2012).
[Crossref]

2010 (2)

A. H. Dhalla, J. V. Migacz, and J. A. Izatt, “Crosstalk rejection in parallel optical coherence tomography using spatially incoherent illumination with partially coherent sources,” Opt. Lett. 35(13), 2305–2307 (2010).
[Crossref]

A. Pantelopoulos and N. G. Bourbakis, “A survey on wearable sensor-based systems for health monitoring and prognosis,” IEEE Trans. Syst. Man Cybern. Part C 40(1), 1–12 (2010).
[Crossref]

2009 (1)

2008 (1)

2006 (2)

Y. Yasuno, T. Endo, S. Makita, G. Aoki, M. Itoh, and T. Yatagai, “Three-dimensional line-field Fourier domain optical coherence tomography for in vivo dermatological investigation,” J. Biomed. Opt. 11(1), 014014 (2006).
[Crossref]

Y. Watanabe, K. Yamada, and M. Sato, “Three-dimensional imaging by ultrahigh-speed axial-lateral parallel time domain optical coherence tomography,” Opt. Express 14(12), 5201–5209 (2006).
[Crossref]

2004 (4)

P. Koch, G. Hüttmann, H. Schleiermacher, J. Eichholz, and E. Koch, “Linear optical coherence tomography system with a downconverted fringe pattern,” Opt. Lett. 29(14), 1644–1646 (2004).
[Crossref]

B. Karamata, P. Lambelet, M. Laubscher, R. P. Salathé, and T. Lasser, “Spatially incoherent illumination as a mechanism for cross-talk suppression in wide-field optical coherence tomography,” Opt. Lett. 29(7), 736–738 (2004).
[Crossref]

D. Raskovic, T. Martin, and E. Jovanov, “Medical monitoring applications for wearable computing,” Comput. J. 47(4), 495–504 (2004).
[Crossref]

U. Anliker, J. Beutel, M. Dyer, R. Enzler, P. Lukowicz, L. Thiele, and G. Troster, “A systematic approach to the design of distributed wearable systems,” IEEE Trans. Comput. 53(8), 1017–1033 (2004).
[Crossref]

2003 (5)

1998 (2)

1997 (2)

1995 (1)

A. F. Fercher, C. K. Hitzenberger, G. Kamp, and S. Y. El-Zaiat, “Measurement of intraocular distances by backscattering spectral interferometry,” Opt. Commun. 117(1-2), 43–48 (1995).
[Crossref]

1991 (1)

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, and C. A. Puliafito, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Agrawal, A.

Alfano, R. R.

Anliker, U.

U. Anliker, J. Beutel, M. Dyer, R. Enzler, P. Lukowicz, L. Thiele, and G. Troster, “A systematic approach to the design of distributed wearable systems,” IEEE Trans. Comput. 53(8), 1017–1033 (2004).
[Crossref]

Aoki, G.

Y. Yasuno, T. Endo, S. Makita, G. Aoki, M. Itoh, and T. Yatagai, “Three-dimensional line-field Fourier domain optical coherence tomography for in vivo dermatological investigation,” J. Biomed. Opt. 11(1), 014014 (2006).
[Crossref]

Auksorius, E.

Baas, P.

B. Chantarasuwan, P. Baas, B. J. van Heuven, C. Baider, and P. C. van Welzen, “Leaf anatomy of Ficus subsection Urostigma (Moraceae),” Bot. J. Linn. Soc. 175(2), 259–281 (2014).
[Crossref]

Baclayon, M.

Baider, C.

B. Chantarasuwan, P. Baas, B. J. van Heuven, C. Baider, and P. C. van Welzen, “Leaf anatomy of Ficus subsection Urostigma (Moraceae),” Bot. J. Linn. Soc. 175(2), 259–281 (2014).
[Crossref]

Bajraszewski, T.

Beaurepaire, E.

Beutel, J.

U. Anliker, J. Beutel, M. Dyer, R. Enzler, P. Lukowicz, L. Thiele, and G. Troster, “A systematic approach to the design of distributed wearable systems,” IEEE Trans. Comput. 53(8), 1017–1033 (2004).
[Crossref]

Blanchot, L.

Boccara, A. C.

Boppart, S. A.

R. Dsouza, J. Won, G. L. Monroy, D. R. Spillman, and S. A. Boppart, “Economical and compact briefcase spectral-domain optical coherence tomography system for primary care and point-of-care applications,” J. Biomed. Opt. 23(09), 1 (2018).
[Crossref]

Bourbakis, N. G.

A. Pantelopoulos and N. G. Bourbakis, “A survey on wearable sensor-based systems for health monitoring and prognosis,” IEEE Trans. Syst. Man Cybern. Part C 40(1), 1–12 (2010).
[Crossref]

Butler, T.

Cable, A.

Chang, W.

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, and C. A. Puliafito, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Chantarasuwan, B.

B. Chantarasuwan, P. Baas, B. J. van Heuven, C. Baider, and P. C. van Welzen, “Leaf anatomy of Ficus subsection Urostigma (Moraceae),” Bot. J. Linn. Soc. 175(2), 259–281 (2014).
[Crossref]

Chen, Y.

Chiang, C. P.

I. J. Hsu, C. W. Sun, C. W. Lu, C. C. Yang, C. P. Chiang, and C. W. Lin, “Process algorithms for resolution improvement and contrast enhancement in optical coherence tomography,” Opt. Rev. 10(6), 567–571 (2003).
[Crossref]

I. J. Hsu, C. W. Sun, C. W. Lu, C. C. Yang, C. P. Chiang, and C. W. Lin, “Resolution improvement with dispersion manipulation and a retrieval algorithm in optical coherence tomography,” Appl. Opt. 42(2), 227–234 (2003).
[Crossref]

Chinn, S. R.

Cho, N. H.

N. H. Cho, K. Park, J. Y. Kim, Y. Jung, and J. Kim, “Quantitative assessment of touch-screen panel by nondestructive inspection with three-dimensional real-time display optical coherence tomography,” Opt. Lasers Eng. 68, 50–57 (2015).
[Crossref]

Choma, M. A.

Dhalla, A. H.

Drexler, W.

W. Drexler, M. Liu, A. Kumar, T. Kamali, A. Unterhuber, and R. A. Leitgeb, “Optical coherence tomography today: speed, contrast, and multimodality,” J. Biomed. Opt. 19(7), 071412 (2014).
[Crossref]

Dsouza, R.

R. Dsouza, J. Won, G. L. Monroy, D. R. Spillman, and S. A. Boppart, “Economical and compact briefcase spectral-domain optical coherence tomography system for primary care and point-of-care applications,” J. Biomed. Opt. 23(09), 1 (2018).
[Crossref]

Dubois, A.

J. Ogien and A. Dubois, “A compact high-speed full-field optical coherence microscope for high-resolution in vivo skin imaging,” J. Biophotonics 12(2), e201800208 (2019).
[Crossref]

Dunsby, C.

Dyer, M.

U. Anliker, J. Beutel, M. Dyer, R. Enzler, P. Lukowicz, L. Thiele, and G. Troster, “A systematic approach to the design of distributed wearable systems,” IEEE Trans. Comput. 53(8), 1017–1033 (2004).
[Crossref]

Eichholz, J.

El-Zaiat, S. Y.

A. F. Fercher, C. K. Hitzenberger, G. Kamp, and S. Y. El-Zaiat, “Measurement of intraocular distances by backscattering spectral interferometry,” Opt. Commun. 117(1-2), 43–48 (1995).
[Crossref]

Endo, T.

Y. Yasuno, T. Endo, S. Makita, G. Aoki, M. Itoh, and T. Yatagai, “Three-dimensional line-field Fourier domain optical coherence tomography for in vivo dermatological investigation,” J. Biomed. Opt. 11(1), 014014 (2006).
[Crossref]

Enzler, R.

U. Anliker, J. Beutel, M. Dyer, R. Enzler, P. Lukowicz, L. Thiele, and G. Troster, “A systematic approach to the design of distributed wearable systems,” IEEE Trans. Comput. 53(8), 1017–1033 (2004).
[Crossref]

Fercher, A. F.

Flotte, T.

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, and C. A. Puliafito, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

French, P. M. W.

Fujimoto, J. G.

Garstecki, P.

Gilerson, A.

Gorczynska, I.

Goulding, D.

Gregory, K.

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, and C. A. Puliafito, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Groot, M. L.

Gu, Y.

Hee, M. R.

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, and C. A. Puliafito, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Hegarty, S. P.

Higgins, L.

Hitzenberger, C. K.

Hsu, I. J.

I. J. Hsu, C. W. Sun, C. W. Lu, C. C. Yang, C. P. Chiang, and C. W. Lin, “Process algorithms for resolution improvement and contrast enhancement in optical coherence tomography,” Opt. Rev. 10(6), 567–571 (2003).
[Crossref]

I. J. Hsu, C. W. Sun, C. W. Lu, C. C. Yang, C. P. Chiang, and C. W. Lin, “Resolution improvement with dispersion manipulation and a retrieval algorithm in optical coherence tomography,” Appl. Opt. 42(2), 227–234 (2003).
[Crossref]

Huang, D.

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, and C. A. Puliafito, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Hüttmann, G.

Huyet, G.

Itoh, M.

Y. Yasuno, T. Endo, S. Makita, G. Aoki, M. Itoh, and T. Yatagai, “Three-dimensional line-field Fourier domain optical coherence tomography for in vivo dermatological investigation,” J. Biomed. Opt. 11(1), 014014 (2006).
[Crossref]

Izatt, J. A.

Jiang, J.

Jovanov, E.

D. Raskovic, T. Martin, and E. Jovanov, “Medical monitoring applications for wearable computing,” Comput. J. 47(4), 495–504 (2004).
[Crossref]

Jung, Y.

N. H. Cho, K. Park, J. Y. Kim, Y. Jung, and J. Kim, “Quantitative assessment of touch-screen panel by nondestructive inspection with three-dimensional real-time display optical coherence tomography,” Opt. Lasers Eng. 68, 50–57 (2015).
[Crossref]

Kamali, T.

W. Drexler, M. Liu, A. Kumar, T. Kamali, A. Unterhuber, and R. A. Leitgeb, “Optical coherence tomography today: speed, contrast, and multimodality,” J. Biomed. Opt. 19(7), 071412 (2014).
[Crossref]

Kamp, G.

A. F. Fercher, C. K. Hitzenberger, G. Kamp, and S. Y. El-Zaiat, “Measurement of intraocular distances by backscattering spectral interferometry,” Opt. Commun. 117(1-2), 43–48 (1995).
[Crossref]

Karamata, B.

Karnowski, K.

Kelleher, B.

Kim, J.

N. H. Cho, K. Park, J. Y. Kim, Y. Jung, and J. Kim, “Quantitative assessment of touch-screen panel by nondestructive inspection with three-dimensional real-time display optical coherence tomography,” Opt. Lasers Eng. 68, 50–57 (2015).
[Crossref]

Kim, J. Y.

N. H. Cho, K. Park, J. Y. Kim, Y. Jung, and J. Kim, “Quantitative assessment of touch-screen panel by nondestructive inspection with three-dimensional real-time display optical coherence tomography,” Opt. Lasers Eng. 68, 50–57 (2015).
[Crossref]

Koch, E.

Koch, P.

Kozon, L

Kulhavy, M.

Kumar, A.

W. Drexler, M. Liu, A. Kumar, T. Kamali, A. Unterhuber, and R. A. Leitgeb, “Optical coherence tomography today: speed, contrast, and multimodality,” J. Biomed. Opt. 19(7), 071412 (2014).
[Crossref]

Lambelet, P.

Lasser, T.

Laubscher, M.

Lebec, M.

Leitgeb, R. A.

W. Drexler, M. Liu, A. Kumar, T. Kamali, A. Unterhuber, and R. A. Leitgeb, “Optical coherence tomography today: speed, contrast, and multimodality,” J. Biomed. Opt. 19(7), 071412 (2014).
[Crossref]

R. A. Leitgeb, C. K. Hitzenberger, A. F. Fercher, and T. Bajraszewski, “Phase-shifting algorithm to achieve high-speed long-depth-range probing by frequency-domain optical coherence tomography,” Opt. Lett. 28(22), 2201–2203 (2003).
[Crossref]

Lexer, F.

Lin, C. P.

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, and C. A. Puliafito, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Lin, C. W.

I. J. Hsu, C. W. Sun, C. W. Lu, C. C. Yang, C. P. Chiang, and C. W. Lin, “Process algorithms for resolution improvement and contrast enhancement in optical coherence tomography,” Opt. Rev. 10(6), 567–571 (2003).
[Crossref]

I. J. Hsu, C. W. Sun, C. W. Lu, C. C. Yang, C. P. Chiang, and C. W. Lin, “Resolution improvement with dispersion manipulation and a retrieval algorithm in optical coherence tomography,” Appl. Opt. 42(2), 227–234 (2003).
[Crossref]

Liu, M.

W. Drexler, M. Liu, A. Kumar, T. Kamali, A. Unterhuber, and R. A. Leitgeb, “Optical coherence tomography today: speed, contrast, and multimodality,” J. Biomed. Opt. 19(7), 071412 (2014).
[Crossref]

Lu, C. W.

I. J. Hsu, C. W. Sun, C. W. Lu, C. C. Yang, C. P. Chiang, and C. W. Lin, “Resolution improvement with dispersion manipulation and a retrieval algorithm in optical coherence tomography,” Appl. Opt. 42(2), 227–234 (2003).
[Crossref]

I. J. Hsu, C. W. Sun, C. W. Lu, C. C. Yang, C. P. Chiang, and C. W. Lin, “Process algorithms for resolution improvement and contrast enhancement in optical coherence tomography,” Opt. Rev. 10(6), 567–571 (2003).
[Crossref]

Lukowicz, P.

U. Anliker, J. Beutel, M. Dyer, R. Enzler, P. Lukowicz, L. Thiele, and G. Troster, “A systematic approach to the design of distributed wearable systems,” IEEE Trans. Comput. 53(8), 1017–1033 (2004).
[Crossref]

Lyu, H.-C.

Ma, L.

Makita, S.

Y. Yasuno, T. Endo, S. Makita, G. Aoki, M. Itoh, and T. Yatagai, “Three-dimensional line-field Fourier domain optical coherence tomography for in vivo dermatological investigation,” J. Biomed. Opt. 11(1), 014014 (2006).
[Crossref]

Mansvelder, H. D.

Martin, T.

D. Raskovic, T. Martin, and E. Jovanov, “Medical monitoring applications for wearable computing,” Comput. J. 47(4), 495–504 (2004).
[Crossref]

Migacz, J. V.

Monroy, G. L.

R. Dsouza, J. Won, G. L. Monroy, D. R. Spillman, and S. A. Boppart, “Economical and compact briefcase spectral-domain optical coherence tomography system for primary care and point-of-care applications,” J. Biomed. Opt. 23(09), 1 (2018).
[Crossref]

Nehmetallah, G.

Nguyen, T. U.

O’Shaughnessy, B.

Ogien, J.

J. Ogien and A. Dubois, “A compact high-speed full-field optical coherence microscope for high-resolution in vivo skin imaging,” J. Biophotonics 12(2), e201800208 (2019).
[Crossref]

Pantelopoulos, A.

A. Pantelopoulos and N. G. Bourbakis, “A survey on wearable sensor-based systems for health monitoring and prognosis,” IEEE Trans. Syst. Man Cybern. Part C 40(1), 1–12 (2010).
[Crossref]

Park, K.

N. H. Cho, K. Park, J. Y. Kim, Y. Jung, and J. Kim, “Quantitative assessment of touch-screen panel by nondestructive inspection with three-dimensional real-time display optical coherence tomography,” Opt. Lasers Eng. 68, 50–57 (2015).
[Crossref]

Peterman, E. J.

Pfefer, T. J.

Pierce, M. C.

Potsaid, B.

Puliafito, C. A.

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, and C. A. Puliafito, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Raskovic, D.

D. Raskovic, T. Martin, and E. Jovanov, “Medical monitoring applications for wearable computing,” Comput. J. 47(4), 495–504 (2004).
[Crossref]

Saint-Jalmes, H.

Salathé, R. P.

Sarunic, M. V.

Sato, M.

Schleiermacher, H.

Schuman, J. S.

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, and C. A. Puliafito, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Slepneva, S.

Spillman, D. R.

R. Dsouza, J. Won, G. L. Monroy, D. R. Spillman, and S. A. Boppart, “Economical and compact briefcase spectral-domain optical coherence tomography system for primary care and point-of-care applications,” J. Biomed. Opt. 23(09), 1 (2018).
[Crossref]

Srinivasan, V. J.

Stinson, W. G.

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, and C. A. Puliafito, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Stremplewski, P.

Subhash, H. M.

H. M. Subhash, “Full-field and single-shot full-field optical coherence tomography: a novel technique for biomedical imaging applications,” Adv. Opt. Tech. 2012, 1–26 (2012).
[Crossref]

Sun, C. W.

I. J. Hsu, C. W. Sun, C. W. Lu, C. C. Yang, C. P. Chiang, and C. W. Lin, “Resolution improvement with dispersion manipulation and a retrieval algorithm in optical coherence tomography,” Appl. Opt. 42(2), 227–234 (2003).
[Crossref]

I. J. Hsu, C. W. Sun, C. W. Lu, C. C. Yang, C. P. Chiang, and C. W. Lin, “Process algorithms for resolution improvement and contrast enhancement in optical coherence tomography,” Opt. Rev. 10(6), 567–571 (2003).
[Crossref]

Swanson, E. A.

S. R. Chinn, E. A. Swanson, and J. G. Fujimoto, “Optical coherence tomography using a frequency-tunable optical source,” Opt. Lett. 22(5), 340–342 (1997).
[Crossref]

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, and C. A. Puliafito, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).
[Crossref]

Thiele, L.

U. Anliker, J. Beutel, M. Dyer, R. Enzler, P. Lukowicz, L. Thiele, and G. Troster, “A systematic approach to the design of distributed wearable systems,” IEEE Trans. Comput. 53(8), 1017–1033 (2004).
[Crossref]

Tkaczyk, T. S.

Tomlins, P. H.

Toonen, R. F.

Troster, G.

U. Anliker, J. Beutel, M. Dyer, R. Enzler, P. Lukowicz, L. Thiele, and G. Troster, “A systematic approach to the design of distributed wearable systems,” IEEE Trans. Comput. 53(8), 1017–1033 (2004).
[Crossref]

Unterhuber, A.

W. Drexler, M. Liu, A. Kumar, T. Kamali, A. Unterhuber, and R. A. Leitgeb, “Optical coherence tomography today: speed, contrast, and multimodality,” J. Biomed. Opt. 19(7), 071412 (2014).
[Crossref]

van Heuven, B. J.

B. Chantarasuwan, P. Baas, B. J. van Heuven, C. Baider, and P. C. van Welzen, “Leaf anatomy of Ficus subsection Urostigma (Moraceae),” Bot. J. Linn. Soc. 175(2), 259–281 (2014).
[Crossref]

van Welzen, P. C.

B. Chantarasuwan, P. Baas, B. J. van Heuven, C. Baider, and P. C. van Welzen, “Leaf anatomy of Ficus subsection Urostigma (Moraceae),” Bot. J. Linn. Soc. 175(2), 259–281 (2014).
[Crossref]

Watanabe, Y.

Witte, S.

Wnuk, P.

Wojtkowski, M.

Won, J.

R. Dsouza, J. Won, G. L. Monroy, D. R. Spillman, and S. A. Boppart, “Economical and compact briefcase spectral-domain optical coherence tomography system for primary care and point-of-care applications,” J. Biomed. Opt. 23(09), 1 (2018).
[Crossref]

Woolliams, P. D.

Yamada, K.

Yang, C.

Yang, C. C.

I. J. Hsu, C. W. Sun, C. W. Lu, C. C. Yang, C. P. Chiang, and C. W. Lin, “Resolution improvement with dispersion manipulation and a retrieval algorithm in optical coherence tomography,” Appl. Opt. 42(2), 227–234 (2003).
[Crossref]

I. J. Hsu, C. W. Sun, C. W. Lu, C. C. Yang, C. P. Chiang, and C. W. Lin, “Process algorithms for resolution improvement and contrast enhancement in optical coherence tomography,” Opt. Rev. 10(6), 567–571 (2003).
[Crossref]

Yasuno, Y.

Y. Yasuno, T. Endo, S. Makita, G. Aoki, M. Itoh, and T. Yatagai, “Three-dimensional line-field Fourier domain optical coherence tomography for in vivo dermatological investigation,” J. Biomed. Opt. 11(1), 014014 (2006).
[Crossref]

Yatagai, T.

Y. Yasuno, T. Endo, S. Makita, G. Aoki, M. Itoh, and T. Yatagai, “Three-dimensional line-field Fourier domain optical coherence tomography for in vivo dermatological investigation,” J. Biomed. Opt. 11(1), 014014 (2006).
[Crossref]

Yu, P.

Zeylikovich, I.

Zhang, M.

Adv. Opt. Tech. (1)

H. M. Subhash, “Full-field and single-shot full-field optical coherence tomography: a novel technique for biomedical imaging applications,” Adv. Opt. Tech. 2012, 1–26 (2012).
[Crossref]

Appl. Opt. (2)

Biomed. Opt. Express (1)

Bot. J. Linn. Soc. (1)

B. Chantarasuwan, P. Baas, B. J. van Heuven, C. Baider, and P. C. van Welzen, “Leaf anatomy of Ficus subsection Urostigma (Moraceae),” Bot. J. Linn. Soc. 175(2), 259–281 (2014).
[Crossref]

Comput. J. (1)

D. Raskovic, T. Martin, and E. Jovanov, “Medical monitoring applications for wearable computing,” Comput. J. 47(4), 495–504 (2004).
[Crossref]

IEEE Trans. Comput. (1)

U. Anliker, J. Beutel, M. Dyer, R. Enzler, P. Lukowicz, L. Thiele, and G. Troster, “A systematic approach to the design of distributed wearable systems,” IEEE Trans. Comput. 53(8), 1017–1033 (2004).
[Crossref]

IEEE Trans. Syst. Man Cybern. Part C (1)

A. Pantelopoulos and N. G. Bourbakis, “A survey on wearable sensor-based systems for health monitoring and prognosis,” IEEE Trans. Syst. Man Cybern. Part C 40(1), 1–12 (2010).
[Crossref]

J. Biomed. Opt. (3)

Y. Yasuno, T. Endo, S. Makita, G. Aoki, M. Itoh, and T. Yatagai, “Three-dimensional line-field Fourier domain optical coherence tomography for in vivo dermatological investigation,” J. Biomed. Opt. 11(1), 014014 (2006).
[Crossref]

W. Drexler, M. Liu, A. Kumar, T. Kamali, A. Unterhuber, and R. A. Leitgeb, “Optical coherence tomography today: speed, contrast, and multimodality,” J. Biomed. Opt. 19(7), 071412 (2014).
[Crossref]

R. Dsouza, J. Won, G. L. Monroy, D. R. Spillman, and S. A. Boppart, “Economical and compact briefcase spectral-domain optical coherence tomography system for primary care and point-of-care applications,” J. Biomed. Opt. 23(09), 1 (2018).
[Crossref]

J. Biophotonics (1)

J. Ogien and A. Dubois, “A compact high-speed full-field optical coherence microscope for high-resolution in vivo skin imaging,” J. Biophotonics 12(2), e201800208 (2019).
[Crossref]

Opt. Commun. (1)

A. F. Fercher, C. K. Hitzenberger, G. Kamp, and S. Y. El-Zaiat, “Measurement of intraocular distances by backscattering spectral interferometry,” Opt. Commun. 117(1-2), 43–48 (1995).
[Crossref]

Opt. Express (6)

Opt. Lasers Eng. (1)

N. H. Cho, K. Park, J. Y. Kim, Y. Jung, and J. Kim, “Quantitative assessment of touch-screen panel by nondestructive inspection with three-dimensional real-time display optical coherence tomography,” Opt. Lasers Eng. 68, 50–57 (2015).
[Crossref]

Opt. Lett. (9)

S. R. Chinn, E. A. Swanson, and J. G. Fujimoto, “Optical coherence tomography using a frequency-tunable optical source,” Opt. Lett. 22(5), 340–342 (1997).
[Crossref]

T. Butler, S. Slepneva, B. O’Shaughnessy, B. Kelleher, D. Goulding, S. P. Hegarty, H.-C. Lyu, K. Karnowski, M. Wojtkowski, and G. Huyet, “Single shot, time-resolved measurement of the coherence properties of OCT swept source lasers,” Opt. Lett. 40(10), 2277–2280 (2015).
[Crossref]

P. Koch, G. Hüttmann, H. Schleiermacher, J. Eichholz, and E. Koch, “Linear optical coherence tomography system with a downconverted fringe pattern,” Opt. Lett. 29(14), 1644–1646 (2004).
[Crossref]

R. A. Leitgeb, C. K. Hitzenberger, A. F. Fercher, and T. Bajraszewski, “Phase-shifting algorithm to achieve high-speed long-depth-range probing by frequency-domain optical coherence tomography,” Opt. Lett. 28(22), 2201–2203 (2003).
[Crossref]

M. Zhang, L. Ma, and P. Yu, “Spatial convolution for mirror image suppression in Fourier domain optical coherence tomography,” Opt. Lett. 42(3), 506–509 (2017).
[Crossref]

I. Zeylikovich, A. Gilerson, and R. R. Alfano, “Nonmechanical grating-generated scanning coherence microscopy,” Opt. Lett. 23(23), 1797–1799 (1998).
[Crossref]

E. Beaurepaire, A. C. Boccara, M. Lebec, L. Blanchot, and H. Saint-Jalmes, “Full-field optical coherence microscopy,” Opt. Lett. 23(4), 244–246 (1998).
[Crossref]

B. Karamata, P. Lambelet, M. Laubscher, R. P. Salathé, and T. Lasser, “Spatially incoherent illumination as a mechanism for cross-talk suppression in wide-field optical coherence tomography,” Opt. Lett. 29(7), 736–738 (2004).
[Crossref]

A. H. Dhalla, J. V. Migacz, and J. A. Izatt, “Crosstalk rejection in parallel optical coherence tomography using spatially incoherent illumination with partially coherent sources,” Opt. Lett. 35(13), 2305–2307 (2010).
[Crossref]

I. J. Hsu, C. W. Sun, C. W. Lu, C. C. Yang, C. P. Chiang, and C. W. Lin, “Process algorithms for resolution improvement and contrast enhancement in optical coherence tomography,” Opt. Rev. 10(6), 567–571 (2003).

D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, and C. A. Puliafito, “Optical coherence tomography,” Science 254(5035), 1178–1181 (1991).

Figures (11)

Fig. 1. The (a) layout and (b) experimental setup of the spatial-domain OCT system. L, light source; NDF1, NDF2 and NDF3, neutral-density filters; BE, beam expander; BS, beamsplitter; BB, beam block; DC, dispersion compensator; CM, cylindrical mirror; CL, cylindrical lens; SS, sample stage; CCD, CCD camera.
Fig. 2. (a) Configuration for calculation of the relation between the optical delay of the reference beam and its incident position on the cylindrical mirror. CM, cylindrical mirror; CCD, CCD camera. (b) The relation between the optical delay of the reference beam and its incident position on the CCD camera, as well as the horizontal pixel number of the CCD camera, for cylindrical mirrors with different radii of curvature.
Fig. 3. (a) I_O, (b) I_1, (c) I_2 and (d) I_B measured by the CCD camera when a reflection mirror was used as the sample. (e-h) The corresponding one-dimensional intensity distributions of the portions indicated with the dashed lines in (a)-(d).
Fig. 4. (a) OCT image of a reflection mirror after image processing. (b) The one-dimensional intensity distribution indicated with the dashed line in (a).
Fig. 5. (a) The relation between the resulting SNR and the number of averaged measurements. (b) The average of ten OCT images of a reflection mirror after image processing. (c) The one-dimensional intensity distribution indicated with the dashed line in (b).
Fig. 6. Merged image of a series of OCT images of a reflection mirror at different axial positions before (a) and after (c) calibration of nonlinearity. The relation between the axial depth in a sample and the horizontal pixel number of the CCD camera before (b) and after (d) calibration of nonlinearity.
Fig. 7. (a) One-dimensional intensity distributions of the OCT images of a reflection mirror at different axial positions. (b) SNR and (c) axial resolution for a sample at different axial depths.
Fig. 8. (a) Image of the 1951 USAF resolution test target. (b) Image of the marked region of Group 0 when the reference beam was blocked. (c-d) Images of Elements 3 and 4 of Group 5, respectively, where a plano-concave cylindrical lens was placed in front of the CCD camera to enlarge the image in the lateral direction.
Fig. 9. The relation between the magnification, the resolution and the imaging range in the lateral direction when a plano-concave cylindrical lens is placed in front of the CCD camera.
Fig. 10. (a) OCT image and (b) microscopic image of a leaf of Ficus subpisocarpa. CP, cuticle proper; EC, epidermal cells; VB, vascular bundle; SP, spongy parenchyma. The scale bars correspond to 100 µm.
Fig. 11. (a) OCT image of a human fingertip. ED, epidermis; SD, sweat duct. (b) OCT image of a human fingertip when the focal plane of the cylindrical lens was moved deeper inside the sample. DE, dermis. The scale bars correspond to 100 µm.

Equations (8)

$$\mathrm{Optical\ delay} = CD + DE - a = \frac{2\left[\sqrt{R^{2}-d^{2}}\left(R-\sqrt{R^{2}-d^{2}}\right)+ad^{2}\right]}{R^{2}-2d^{2}},$$
$$AE = d + \left(AB + CD\right)\tan 2\theta = \frac{2d\left(a+R\right)\sqrt{R^{2}-d^{2}}-R^{2}d}{R^{2}-2d^{2}}.$$
$$I_O = I_S + I_R + 2\,\mathrm{Re}\!\left[E_S E_R^{*}\right] + I_B,$$
$$I_1 = I_R + I_B.$$
$$I_2 = I_S + I_B.$$
$$I = \left|I_O - I_1 - I_2 + I_B\right|.$$
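The four-frame subtraction above isolates the interference term, since the incoherent sums cancel: I_O − I_1 − I_2 + I_B = 2 Re[E_S E_R*]. A minimal NumPy sketch of this background-removal step (the function name and frame variables are illustrative, not from the paper):

```python
import numpy as np

def extract_interference(I_O, I_1, I_2, I_B):
    """Isolate the interference term from four CCD frames.

    I_O: both beams on        (I_S + I_R + 2*Re[E_S E_R*] + I_B)
    I_1: reference beam only  (I_R + I_B)
    I_2: sample beam only     (I_S + I_B)
    I_B: both beams blocked   (background)

    The subtraction cancels I_S, I_R and I_B, leaving
    I = |I_O - I_1 - I_2 + I_B| = |2*Re[E_S E_R*]|.
    """
    return np.abs(I_O.astype(float) - I_1 - I_2 + I_B)
```

Casting to float before subtracting avoids wrap-around when the frames arrive as unsigned camera integers.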
$$\mathrm{SNR} = 20\log\!\left(\frac{I_{\mathrm{samp}}}{\sigma_{bg}}\right) = 20\log\!\left(\frac{1904}{11.47}\right) = 44.40\ \mathrm{dB}.$$
$$\mathrm{SNR} = 20\log\!\left(\frac{I_{\mathrm{samp}}}{\sigma_{bg}}\right) = 20\log\!\left(\frac{1828}{6.75}\right) = 48.65\ \mathrm{dB}.$$
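The two SNR values can be checked directly from the definition, 20 log10 of the peak sample intensity over the background standard deviation (a small sketch; the helper name is ours):

```python
import math

def snr_db(i_samp, sigma_bg):
    # SNR in decibels: 20*log10(peak sample intensity / background std. dev.)
    return 20.0 * math.log10(i_samp / sigma_bg)
```

With the quoted values, snr_db(1904, 11.47) gives about 44.40 dB (single frame) and snr_db(1828, 6.75) about 48.65 dB (ten-frame average), matching the equations above.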
