Abstract

In this work, the Fourier integral microscope (FIMic), an ultimate design of 3D-integral microscopy, is presented. By placing a multiplexing microlens array at the aperture stop of the microscope objective of the host microscope, FIMic shows extended depth of field and enhanced lateral resolution in comparison with regular integral microscopy. As FIMic directly produces a set of orthographic views of the 3D micrometer-sized sample, it is suitable for real-time imaging. Following regular integral-imaging reconstruction algorithms, a 2.75-fold enhancement in depth of field and a 2-fold improvement in spatial resolution over conventional integral microscopy are reported. Our claims are supported by theoretical analysis and experimental images of a resolution test target, cotton fibers, and in-vivo 3D imaging of biological specimens.

© 2017 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

In 1908 G. Lippmann proposed integral photography (IP), based on the use of a microlens array (MLA) [1]. This architecture permitted the recording of a collection of elemental images (EIs) of a 3D scene, each with different perspective information. This pioneering design was conceived with the aim of displaying auto-stereoscopic 3D images through an IP monitor. Today, this aim is still a hot topic of research, and many innovative proposals have been presented in recent years [2–5]. The plenoptic camera was proposed for the capture of multi-perspective 2D images of distant 3D objects. This camera is basically the result of inserting an MLA at the image plane of a conventional camera and displacing the sensor to the MLA focal plane [6–10]. The camera was designed to capture, after a single shot, the spatio-angular information of distant 3D scenes. This information is used to compute orthographic views and 3D depth maps.

The fast development of opto-electronic technology, the increasing capacity and speed of software tools, and the improvement in the quality of manufactured microlenses have made plenoptic technology highly attractive for many innovative applications. Among them, some biomedical applications are remarkable, like the use of plenoptic technology in otoscopy [11], ophthalmology [12], endoscopy [13,14], or deep-tissue inspection [15,16]. Plenoptic technology has also been proposed for wavefront sensing [17], 3D imaging using long-wave infrared light [18], and head-mounted display technology [19].

A natural application of the IP concept is therefore 3D microscopy. Jang and Javidi were the first to recognize the utility of IP for displaying 3D microscopic images [20]. But it was only after the Stanford group's contributions that integral microscopy (IMic), also known as plenoptic or light-field microscopy, was considered an interesting alternative technique for real 3D imaging of micrometer-sized objects [21,22]. Indeed, the most attractive feature that IMic offers microscopists is the possibility of observing samples, almost in real time, from many different perspectives.

The above-mentioned possibilities of IMic are unfortunately hindered by its poor spatial resolution and limited depth of field (DOF). Over the past few years many techniques have been proposed to overcome these flaws, some based on time or space multiplexing [23–25], others based on computational interpolation or deconvolution [26,27]. An additional approach makes use of a two-stage relay to enlarge the aperture plane of the microscope to fit the size of an imaging lens array [28]. The latter is composed of 25 CCTV lenses placed close to the enlarged aperture plane to record view images, which are processed afterwards. However, all these mainly computationally driven solutions carry a trade-off of enormous data processing, precluding, for instance, real-time applications.

To overcome these limitations of 3D integral microscopy, our group recently proposed Fourier integral microscopy (FIMic) [25]. In that proof-of-concept proposal, a regular photographic camera manually scanned a conjugate plane of the aperture stop. The time-multiplexed recorded view images provided the 3D information of the sample. This concept was used very recently to propose a new instrument for the rapid inspection of neural activity [29]. However, that instrument does not take full advantage of the FIMic concept, as the attained resolution and DOF are far from optimized.

In this work the FIMic concept is boosted to produce a fully optical solution to the flaws of IMic and attain the best combination of resolution and DOF. The main advances over previous reports of FIMic are the following. First, theoretical formulae relating the lateral resolution and the DOF to the experimental parameters are reported. Second, based on those formulae, ad hoc configurations for optimized resolution and DOF can easily be designed. Third, the MLA is now placed at the aperture stop (AS) of the microscope objective (MO) to capture the plenoptic far field directly in single-shot mode. Fourth, the new realization turns FIMic into a very compact instrument. In summary, the approach reported here minimizes time and data consumption, significantly enlarges the DOF, and optimizes the resolving power, allowing the real-time capture, with a compact instrument, of 3D images of live microscopic specimens.

2. Parameter design

The core of the innovative solution to the main drawbacks of IMic is to place the MLA at the AS of the MO. Since in this configuration a Fourier, or far-field, elemental image (EI) is recorded by each microlens, we have named this architecture FIMic [25]. In this arrangement, the plenoptic field is directly recorded by means of the sampling that the MLA performs of the spatio-angular information at the far field located at the AS. A schematic layout of FIMic is shown in Fig. 1. As the AS of regular commercial MOs may not be mechanically accessible, a relay system is necessary to conjugate the MLA with the AS. In our case, the relay is composed of two converging lenses, RL1 and RL2, with focal lengths f1 and f2, arranged as a telecentric relay that conjugates the AS plane with the MLA plane. The digital sensor, a CCD camera for instance, is set at the rear focal plane of the MLA, where magnified images of the field stop (FS) are recorded. The FS, placed at the common focal plane of RL1 and RL2, is chosen such that the EIs are tangent at the recording plane, to optimize the use of the digital sensor. According to the proposed setup, the size of the FS (ϕFS) has to be ϕFS = ϕEI f2/fL, where ϕEI is the size of each elemental image. Equivalently, as the FS is seen in the object space, it determines the field of view (FOV) of the FIMic, such that FOV = ϕFS fMO/f1. The optimum use of the available area of the digital sensor is achieved when ϕEI equals the pitch, p, of the MLA; hence the FOV of FIMic is given by

FOV = 2 (f2/f1) fMO NAMLA,   (1)
where NAMLA = p/(2 fL) is the numerical aperture of the MLA.
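As a quick numerical check of Eq. (1), the sketch below evaluates the FOV for the parameter values of the setup described in Section 3.1 (function name is ours):

```python
# FOV of FIMic from Eq. (1): FOV = 2 (f2/f1) fMO NAMLA
def fimic_fov(f_mo, f1, f2, na_mla):
    """Field of view, in the same length units as f_mo."""
    return 2.0 * (f2 / f1) * f_mo * na_mla

# Setup of Section 3.1: fMO = 9 mm, f1 = f2 = 50 mm, NAMLA = 0.077
fov_mm = fimic_fov(9.0, 50.0, 50.0, 0.077)
print(fov_mm)  # about 1.39 mm
```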

 

Fig. 1 Schematic layout of Fourier integral microscopy (FIMic). A collection of orthographic views (or EIs) is obtained directly. ROP is the reference object plane; RL1 and RL2 are the relay lenses; CCD is the digital sensor.


The underlying concept of FIMic is the spatio-angular sampling at the AS of the MO of the host microscope, i.e., the optical microscope in which the MLA is inserted. As the performance of the host microscope is mainly determined by the diameter of the AS (ϕAS), any intervention at the AS modifies the overall performance of the microscope. This sampling of the AS, defined here along one spatial direction but extendable to both orthogonal directions, can be quantified in terms of the number of microlenses, N = ϕAS/p̄ (where p̄ = p f1/f2), that fit within it. In other words, N is the number of samples taken along the diameter of the AS, i.e., FIMic provides N elemental images. Hence FIMic can be understood as a new microscope with an effective aperture stop of diameter ϕAS/N. In the trivial case N = 1, FIMic provides a single EI with exactly the same features as the host microscope. As N grows, the number of EIs increases at the cost of reducing the size of the effective AS. This shrinkage brings along a reduction of the effective numerical aperture to NAH/N, where NAH is the numerical aperture of the host microscope. This implies a reduction of the spatial resolution and an increase of the depth of field (DOF), and invites consideration of the trade-off between DOF and spatial resolution in terms of the number of EIs.

With the above ideas in mind, and having set up the FIMic for optimized use of the digital sensor, one can consider that the complete width of the sensor is covered by the N elemental images. This means that each EI is recorded by T/N pixels, with T the total number of pixels of the sensor along one direction. The physical width of the sensor, Tδ, with δ the pixel size, must match the size of the imaged AS, namely Tδ = ϕAS f2/f1; hence the pixel size of the digital sensor must fulfil the condition

δ = (N p/T)(f2/f1)   (2)
for the optimized use of the digital sensor.
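For illustration, the matching condition of Eq. (2) can be evaluated for the setup of Section 3.1 (N ≈ 7.1, p = 1 mm, T = 2560 pixels, f1 = f2 = 50 mm); it suggests a pixel size of about 2.8 µm, close to the 2.2 µm of the sensor actually used there. A sketch, with names ours:

```python
def matched_pixel_size(n_lenses, pitch, t_pixels, f1, f2):
    """Optimum pixel size from Eq. (2): delta = (N p / T)(f2 / f1)."""
    return (n_lenses * pitch / t_pixels) * (f2 / f1)

# pitch given in micrometers, so the result is in micrometers
delta_um = matched_pixel_size(7.1, 1000.0, 2560, 50.0, 50.0)
print(round(delta_um, 2))  # 2.77
```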

The spatial resolution of FIMic is determined by two factors. First, according to wave-optics theory, two point objects separated by a distance ρ have to be resolved by each of the EIs that sample the AS, namely ρ ≥ N λ/(2 NAH), with NAH = ϕAS/(2 fMO) the numerical aperture of the host MO. Second, according to Nyquist, the distance between the two points should be large enough for them to be recorded by different pixels, leaving at least one empty pixel in between; therefore ρ ≥ 2 δ f2 fMO/(fL f1). The combination of these two factors leads to an overall resolution limit for FIMic given by

ρEI ≥ max{N λ/(2 NAH), 2 δ f2 fMO/(fL f1)}.   (3)
If the pixel size is chosen so that the two terms take the same value, we obtain

ρEI = N λ/(2 NAH).   (4)

As for the DOF, we simply need to adapt the classical formula [30] to the effective NA of the FIMic, that is,

DOFEI = λ N²/NAH² + (δ N/NAH)(f2 fMO)/(fL f1).   (5)

Assuming now the same value for the pixel size as above,

DOFEI = (5/4) λ N²/NAH².   (6)
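Eqs. (3) and (5) can be wrapped into a small design calculator. The helpers below are a sketch (function and variable names are ours); with a pixel size matching Eq. (4), the DOF reduces to the (5/4) λ N²/NAH² of Eq. (6):

```python
def fimic_resolution(n, wavelength, na_h, delta, f1, f2, f_l, f_mo):
    """Resolution limit of each EI, Eq. (3): the worse (larger) of the
    diffraction term and the Nyquist (pixel) term. Same units throughout."""
    diffraction = n * wavelength / (2.0 * na_h)
    nyquist = 2.0 * delta * f2 * f_mo / (f_l * f1)
    return max(diffraction, nyquist)

def fimic_dof(n, wavelength, na_h, delta, f1, f2, f_l, f_mo):
    """Depth of field of each EI, Eq. (5): wave term plus pixel term."""
    wave_term = wavelength * n**2 / na_h**2
    pixel_term = (delta * n / na_h) * (f2 * f_mo) / (f_l * f1)
    return wave_term + pixel_term

# With the matched pixel size (both terms of Eq. (3) equal), the DOF
# equals (5/4) * wavelength * N**2 / NA**2, i.e. Eq. (6).
delta_matched = 5 * 0.6 * 6.5e3 * 200e3 / (4 * 0.5 * 100e3 * 10e3)  # 1.95 um
dof = fimic_dof(5, 0.6, 0.5, delta_matched, 200e3, 100e3, 6.5e3, 10e3)
print(dof)  # ~75, equal to 1.25 * 0.6 * 25 / 0.25
```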

Before proceeding to the next section, and in order to provide the analytical tools for the comparison between IMic and FIMic, we present the formulae for the resolution and the DOF of IMic. To understand these equations, note that in IMic the resolution and the DOF are determined by the MLA pitch, which plays the same role as the pixel size in FIMic. The lateral resolution of views computed from IMic microimages is then given by [21]

ρView = max{λ/(2 NAH), 2 p/MH} = μ λ/NAH.   (7)

In this equation MH is the magnification of the host microscope and μ is a parameter defined here as the quotient between p and the wave-optics resolution limit as evaluated at the image plane. The DOF of the computed views is given by

DOFView = λ/NAH² + μ p/(MH NAH) = (1 + μ²/2) λ/NAH².   (8)

From Eqs. (4), (6), (7), and (8) we obtain

ρEI = (N/(2μ)) ρView   and   DOFEI = [5 N²/(4 + 2 μ²)] DOFView.   (9)
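Eq. (9) makes the trade-off explicit. A tiny helper (ours) returns the two ratios for given N and μ; for example, for N = 5 and μ = 2.5, the resolution is unchanged (ratio 1) while the DOF is about 7.6 times larger:

```python
def improvement_factors(n, mu):
    """Eq. (9): returns (rho_EI / rho_View, DOF_EI / DOF_View)."""
    res_ratio = n / (2.0 * mu)
    dof_ratio = 5.0 * n**2 / (4.0 + 2.0 * mu**2)
    return res_ratio, dof_ratio

res, dof = improvement_factors(5, 2.5)
print(res, round(dof, 1))  # same resolution, much larger DOF
```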

From these equations it is apparent that, given an IMic, it is always possible to design a FIMic with the same resolution but much larger DOF, with the same DOF but much better resolution, or even with a simultaneous improvement of resolution and DOF. This superiority of FIMic comes at the price of a lower number of orthographic views. Note, however, that this loss is not critical, provided that the number of EIs captured with FIMic is sufficient for the calculation of refocused images or depth maps.

3. Comparative performance analysis

This section is devoted to contrasting the most relevant features of IMic and FIMic. To this end, a conventional IMic with in-focus lateral resolution of 6.2 µm and 80 µm DOF was first built. Then, two experimental FIMic setups were realized. The superior performance of FIMic over IMic has been validated under two conditions to ensure a fair comparison: (i) DOF assessment, with the FIMic built to have the same spatial resolution as the IMic; (ii) spatial-resolution assessment, with the FIMic designed to have the same DOF as the IMic. Further comments on the number of views, refocusing capabilities, and data processing are also presented.

3.1 Both microscopes operating at similar spatial resolution to test their DOF

Using an MLA composed of microlenses with fL=6.5 mm, p = 1.0 mm, and NAMLA=0.077 (APH-Q-P1000-R2.95, manufactured by AMUS), and an infinity-corrected MO (fMO=9.0 mm and NA = 0.4), a FIMic microscope was built. Since the AS was not mechanically accessible, we determined by optical means the position of the exit pupil and also its diameter, ϕAS=7.1 mm. Two achromatic doublet lenses with f1=50 mm and f2=50 mm were chosen as the relay system to match the MLA to the AS such that the resolution equals that of the IMic built in our lab. This configuration fits N=7.1 microlenses into the AS. As digital sensor we used a CMOS camera (EO-5012c ½”) with 2560x1920 square pixels of side δ=2.2 μm. According to Eqs. (3) and (5), this setup was expected to give a lateral resolution limit of 6.1 µm and a DOF of 240 µm.
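The quoted predictions can be reproduced from Eqs. (3) and (5). The working wavelength is not stated in the text; assuming λ ≈ 0.6 µm (our assumption), the numbers come out close to the quoted 6.1 µm and 240 µm:

```python
# Section 3.1 parameters, all lengths in micrometers
wl, na_h, n = 0.6, 0.4, 7.1              # wl is an ASSUMED wavelength
delta, f1, f2 = 2.2, 50e3, 50e3
f_l, f_mo = 6.5e3, 9e3

rho = max(n * wl / (2 * na_h), 2 * delta * f2 * f_mo / (f_l * f1))  # Eq. (3)
dof = wl * n**2 / na_h**2 + (delta * n / na_h) * (f2 * f_mo) / (f_l * f1)  # Eq. (5)
print(round(rho, 1), round(dof))  # prints: 6.1 243
```

Here the pixel (Nyquist) term dominates the resolution, which is why the result barely depends on the assumed wavelength.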

To evaluate the DOF we used a USAF resolution chart as the object, placed first at the reference object plane (ROP) (see Fig. 1) and then displaced axially in regular steps up to Z=110 μm. For all these axial positions of the chart we captured the views directly with the FIMic and the microimages with the IMic; in the latter case, the views were computed from the microimages. Fig. 2 shows the central views. Note that we show only the views for displacements in the Z>0 direction, but the same behavior is observed in the negative direction. For both microscopes the smallest element correctly resolved at the ROP is element 3 of group 7, which corresponds to a resolution limit of 6.20 µm. Defining the DOF as twice the distance from the ROP to the plane where the resolution has decreased by a factor of √2, the DOF extends up to where element 6 of group 6, i.e., 8.76 µm, is resolved. From the views shown in Fig. 2 we find a DOF of 80 µm for IMic, whereas it is 220 µm for FIMic. A 2.75-fold improvement in DOF is thus achieved, showing the superior performance of FIMic in terms of DOF when both microscopes are set up to have the same spatial resolution.

 

Fig. 2 Central views provided by the two microscopes when both operate at a resolution limit of about 6.20 μm. The label Z indicates the distance from the ROP. With both microscopes providing the same resolution at the ROP, the DOF extends up to Z=+110 μm for the FIMic, but only up to Z=+40 μm for the IMic.


3.2 Both microscopes operating at similar DOF to compare their spatial resolution

To implement a FIMic with the same DOF as our regular IMic, we used the same MLA as in Section 3.1, but a new MO (fMO=10.0 mm, NA=0.5, and ϕAS=10.0 mm) and relay lenses (f1=200 mm and f2=100 mm). This setup yields a FIMic with N = 5.0, which provides an expected resolution for the central view of 3.4 µm and the sought DOF of 77 µm. The central views of both microscopes are shown in Fig. 3.
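As in Section 3.1, these expected figures follow from Eqs. (3) and (5); with the same assumed λ ≈ 0.6 µm (our assumption, the text does not state the wavelength):

```python
# Section 3.2 parameters, all lengths in micrometers
wl, na_h, n = 0.6, 0.5, 5.0              # wl is an ASSUMED wavelength
delta, f1, f2 = 2.2, 200e3, 100e3
f_l, f_mo = 6.5e3, 10e3

rho = max(n * wl / (2 * na_h), 2 * delta * f2 * f_mo / (f_l * f1))  # Eq. (3)
dof = wl * n**2 / na_h**2 + (delta * n / na_h) * (f2 * f_mo) / (f_l * f1)  # Eq. (5)
print(round(rho, 1), round(dof))  # prints: 3.4 77
```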

 

Fig. 3 Central view provided by both microscopes when operating with the same DOF. The FIMic provides better resolution at the edge of its DOF than the IMic at its ROP.


From Fig. 3 it is apparent that along the complete DOF the FIMic provides a spatial resolution of the order of 3.9 µm, superior to the 6.2 µm provided by the IMic. This superior performance of FIMic can be foreseen from the fact that, to achieve the DOF provided by our IMic, the FIMic has to be set up with just N = 5.0, which reduces the resolution of the host microscope by only that same factor. Note also that FIMic has considerably better lateral resolution at the edges of its DOF than IMic at its ROP. This feature can be added to the list of advantages that the boosted FIMic offers for 3D integral microscopy.

3.3 Elemental images

The hallmark of integral microscopy is without doubt its ability to record views of 3D samples for a-posteriori refocusing and/or 3D rendering. In this sense, the direct recording of elemental images by FIMic is a notable difference between IMic and FIMic.

In the case of IMic, the elemental images are computed from the recorded microimages. The nth elemental image is obtained by composing, in an orderly manner, the nth pixels of each microimage. Consequently, from IMic one can compute as many elemental images as there are pixels in each microimage; this number is determined by the quotient between the microlens pitch and the pixel size, p/δ. Fig. 4(a) shows a typical IMic recording of a USAF chart (see the tangent microimages in the zoomed-in area). In panel (b), the set of computed EIs is shown.
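The pixel re-ordering just described (the (u, v)-th EI collects pixel (u, v) of every microimage) can be sketched as follows; the function name and the list-based layout are ours, assuming square microimages of s × s pixels:

```python
def views_from_microimages(raw, s):
    """Rearrange a sensor image (list of rows) tiled with s x s-pixel
    microimages into elemental images: views[u][v] is the EI built from
    pixel (u, v) of every microimage."""
    my = len(raw) // s            # number of microimages vertically
    mx = len(raw[0]) // s         # number of microimages horizontally
    return [[[[raw[i * s + u][j * s + v] for j in range(mx)]
              for i in range(my)]
             for v in range(s)]
            for u in range(s)]

# tiny demo: a 6x6 sensor holding 2x2 microimages of 3x3 pixels each
raw = [[r * 6 + c for c in range(6)] for r in range(6)]
views = views_from_microimages(raw, 3)
print(views[0][0])  # top-left pixel of each microimage: [[0, 3], [18, 21]]
```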

 

Fig. 4 Microimages and EIs for IMic. Panel (a) shows the microimages recorded directly by the digital sensor. Panel (b) is the complete set of EIs computed from the microimages. In this experiment each microimage was formed by 17x17 pixels; hence the same number of EIs is computed. Zoomed-in areas are magnified by a factor of four.


A notable feature of FIMic is its capacity to record the EIs directly. This simplifies the use of the microscope, for instance in real-time applications, as shown below. In addition, the direct recording of the elemental images allows the user to make on-the-spot decisions about the achieved resolution, the focusing of the recorded images, and the imaged scene. Fig. 5 shows a direct FIMic recording corresponding to N = 3.

 

Fig. 5 Elemental images directly recorded in FIMic. The available area of the digital sensor is used for recording the N = δ T f1/(p f2) elemental images.


3.4 Refocusing capabilities and data processing

One of the most important and desired capabilities of integral microscopy is a-posteriori refocusing through the 3D scene. Once the EIs are captured or calculated, the major difference between IMic and FIMic in terms of refocusing lies in the number of planes (and the distance between them) at which the scene can be refocused. When applying, for instance, the shift-and-sum refocusing algorithm [31] in FIMic, the refocusing distances from the ROP are given by

ZR = n (fMO²/fL)(f2/f1)² (δ/p),   (10)
with n the (positive or negative) number of pixels by which the EIs are shifted before being summed. Considering that one pixel is the smallest possible shift, the step between refocused images is ΔZR = (fMO² f2² δ)/(fL f1² p). Because in FIMic the EIs typically have a larger number of pixels than in IMic, the step between refocusing planes is much smaller, and therefore the number of refocused planes within a given DOF is much larger. This provides FIMic with the capacity of refocusing in finer detail than IMic.
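The shift-and-sum reconstruction of [31] can be sketched in a few lines (pure Python, horizontal parallax only; the function name and the toy data are ours). View k is sampled with a shift of n·k pixels relative to the central view before averaging, so features lying at the refocused depth add up coherently:

```python
def shift_and_sum(views, n):
    """Refocus a 1D-parallax stack of elemental images.
    views: dict {view index k (e.g. -3..3): 2D image as list of rows};
    n: signed integer pixel shift per unit of view index."""
    keys = sorted(views)
    h, w = len(views[keys[0]]), len(views[keys[0]][0])
    out = [[0.0] * w for _ in range(h)]
    for k in keys:
        for y in range(h):
            for x in range(w):
                # wrap-around shift keeps the sketch simple
                out[y][x] += views[k][y][(x + n * k) % w]
    return [[v / len(keys) for v in row] for row in out]

# a point at the refocused depth appears shifted by k pixels in view k
views = {-1: [[0, 1, 0, 0, 0]], 0: [[0, 0, 1, 0, 0]], 1: [[0, 0, 0, 1, 0]]}
print(shift_and_sum(views, 1)[0][2])  # 1.0: the shifts align the point
```

With n = 0 the same point would blur over three pixels, which is the out-of-focus behavior the algorithm exploits.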

The refocusing step, ΔZR, could be made finer in IMic by computationally upsizing the elemental images. This strategy allows non-integer numbers of shifted pixels, but has two essential drawbacks. First, since upsizing adds no information to the EIs, there is a limit beyond which increasing the number of refocused planes does not produce distinguishable refocused images. Second, extreme upsizing can exceed the capabilities of the computing system.

Another point to stress is that in IMic the resolution of the refocused images is not homogeneous [32]. In short, there are planes with worse resolution (those corresponding to an integer number of shifted pixels) and planes with better resolution (non-integer numbers of shifted pixels). In contrast, as in FIMic every EI is composed of a large number of pixels, FIMic has the desirable property of maintaining the same spatial resolution for all the refocused images along the DOF.

Fig. 6 illustrates some of the above-mentioned refocusing capabilities of FIMic vs. IMic. The figure shows cotton fibers imaged with both microscopes operating at the same DOF. In the top row, refocused images for IMic are presented. The image for ZR=0 μm is calculated directly from the EIs, with no need for upsizing. This plane corresponds to an integer value of n, and therefore has worse resolution (see the pixelation in the zoomed-in area). The computation of the images for ZR=11 μm and ZR=21 μm was possible only after upsizing the EIs by a factor of six. Despite the substantial increase in the data to be processed, little additional information can be retrieved from the extra planes refocused after upsizing.

 

Fig. 6 Refocusing capabilities of FIMic vs. IMic. FIMic provides better resolution along the entire DOF.


The bottom row shows the refocused images for FIMic. Here the directly recorded EIs have 454 pixels; hence it is possible to refocus 454 planes directly. Of these, we show only the planes corresponding to the reconstruction distances used for IMic. In clear contrast with the refocused images for IMic, those for FIMic show greater detail at the different planes, which derives from the much better spatial resolution that FIMic has when operating at the same DOF as IMic. Furthermore, the invariant spatial resolution of FIMic across the refocused planes can be observed, in contrast with the strong changes in the spatial resolution of IMic between refocused planes.

4. In-vivo imaging of a biological sample

Beyond the experiments comparing the performance of FIMic with a regular IMic, we present further examples of the use of FIMic that show some possible applications of this ultimate design of integral microscopy for imaging in-vivo biological specimens.

The simplicity and robustness of the FIMic design means that no sample preparation is required for in-vivo imaging. Algae were collected from the seashore and preserved in a regular container. We placed a few algae with sea water on a glass slide and covered them with a regular cover slip. The sample included not only the algae, but also a marine nematode swimming through them. The sample was placed in the sample holder of the FIMic used in Section 3.2. With this setup we recorded a 45 s video (4 fps) comprising seven different perspectives of the live sample. The complete video, sped up to 12 fps, is shown in Visualization 1; a single frame is shown in Fig. 7. Remarkably, the FIMic holds good enough resolution to image internal details of the nematode.

 

Fig. 7 Frame extracted from Visualization 1, a video of a marine nematode swimming through algae. Panel (a) shows the directly recorded EIs. Panel (b) is a three-fold zoom of the central EI. The red arrow points to the nematode.


Using the static EIs shown in Fig. 7(a) as frames, we composed a video, Visualization 2, to display the perspectives rendered by the FIMic. Beyond the natural movement of the specimen, the 3D image provided by the stack of perspectives would add valuable information for those trained in the study of this particular organism.

The final feature of FIMic we want to show is its capability of a-posteriori refocusing from the directly recorded EIs. From the EIs of frame 315 of Visualization 1, we computed 40 refocused planes. Visualization 3 shows the different planes, where different sections of the scene come clearly into focus. In this video the refocusing distance ZR is given in μm.

5. Summary

We have reported on Fourier integral microscopy (FIMic), a single-shot, single-camera 3D integral microscope that allows real-time imaging of the studied sample. FIMic provides a 2.75-fold larger depth of field and 2-fold better spatial resolution than regular integral microscopes (IMic). The enhanced performance of FIMic relies on multiplexing the spatio-angular information located at the aperture stop of the microscope objective. By placing the microlens array at said stop, FIMic records a set of orthographic views, whose reconstructed images of the 3D micrometer-sized sample exhibit enhanced depth of field and better spatial resolution than IMic. The performance of these two integral microscopes has been contrasted to support the reported claims.

The enhanced performance of FIMic, the simplicity of its 3D image reconstruction, and its compact and robust setup make FIMic a very competitive tool for real-time imaging of micrometer-sized 3D samples. Cotton fibers and an in-vivo marine nematode have been imaged with FIMic to show real-world applications of its enhanced performance in 3D microscopy.

Funding

Spanish Ministry of the Economy and Competitiveness (DPI2015-66458-C2-1R); GVA, Spain (PROMETEOII/2014/072).

Acknowledgments

Some authors acknowledge their personal funding: E. Sanchez-Ortiga (APOSTD/ 2015/094); J. Sola-Pikabea (ACIF/2016/296); G. Scrofani (MSCA grant 676401); and A. Llavador (UV-INV-PREDOC13-110484). J. Garcia-Sucerquia acknowledges the Universidad Nacional de Colombia for the Hermes grant 35765, and also to the University of Valencia for a Visiting Professor fellowship.

Disclosures

The authors declare that there are no conflicts of interest related to this article.

References and links

1. G. Lippmann, “Epreuves reversibles donnant la sensation du relief,” J. Phys. Theor. Appl. 7(1), 821–825 (1908). [CrossRef]  

2. J. Arai, F. Okano, M. Kawakita, M. Okui, Y. Haino, M. Yoshimura, M. Furuya, and M. Sato, “Integral three-dimensional television using a 33-megapixel imaging system,” J. Disp. Technol. 6(10), 422–430 (2010). [CrossRef]  

3. T. Shimobaba, Y. Sato, J. Miura, M. Takenouchi, and T. Ito, “Real-time digital holographic microscopy using the graphic processing unit,” Opt. Express 16(16), 11776–11781 (2008). [CrossRef]   [PubMed]  

4. J. Wang, X. Xiao, H. Hua, and B. Javidi, “Augmented reality 3D displays with micro integral imaging,” J. Disp. Technol. 11(11), 889–893 (2015). [CrossRef]  

5. A. Dorado, M. Martinez-Corral, G. Saavedra, and S. Hong, “Computation and display of 3D movie from a single integral photography,” J. Disp. Technol. 12(7), 695–700 (2016). [CrossRef]  

6. N. Davies, M. McCormick, and L. Yang, “Three-dimensional imaging systems: a new development,” Appl. Opt. 27(21), 4520–4528 (1988). [CrossRef]   [PubMed]  

7. E. H. Adelson and J. Y. A. Wang, “Single lens stereo with a plenoptic camera,” IEEE Trans. Pattern Anal. Mach. Intell. 14(2), 99–106 (1992). [CrossRef]  

8. R. Ng, “Digital Light Field Photography,” Stanford University (2006).

9. T. Georgiev and A. Lumsdaine, “Focused plenoptic camera and rendering,” J. Electron. Imaging 19(2), 021106 (2010). [CrossRef]  

10. C. Perwass and L. Wietzke, “Single lens 3D-camera with extended depth-of-field,” Proc. SPIE 8921, 892108 (2012).

11. N. Bedard, T. Shope, A. Hoberman, M. A. Haralam, N. Shaikh, J. Kovačević, N. Balram, and I. Tošić, “Light field otoscope design for 3D in vivo imaging of the middle ear,” Biomed. Opt. Express 8(1), 260–272 (2016). [CrossRef]   [PubMed]  

12. H. Chen, V. Sick, M. Woodward, and D. Burke, “Human Iris 3D Imaging using a micro-Plenoptic Camera,” in Bio-Optics: Design and Application (Optical Society of America, 2017), p. BoW3A–6.

13. J. Liu, D. Claus, T. Xu, T. Keßner, A. Herkommer, and W. Osten, “Light field endoscopy and its parametric description,” Opt. Lett. 42(9), 1804–1807 (2017). [CrossRef]   [PubMed]  

14. H. N. D. Le, R. Decker, A. Krieger, and J. U. Kang, “Experimental assessment of a 3-D plenoptic endoscopic imaging system,” Chin. Opt. Lett. 15, 051701 (2017). [CrossRef]  

15. R. S. Decker, A. Shademan, J. D. Opfermann, S. Leonard, P. C. W. Kim, and A. Krieger, “Biocompatible Near-Infrared Three-Dimensional Tracking System,” IEEE Trans. Biomed. Eng. 64(3), 549–556 (2017). [PubMed]  

16. N. C. Pégard, H.-Y. Liu, N. Antipa, M. Gerlock, H. Adesnik, and L. Waller, “Compressive light-field microscopy for 3D neural activity recording,” Optica 3(5), 517–524 (2016). [CrossRef]  

17. Y. Lv, H. Ma, Q. Sun, P. Ma, Y. Ning, and X. Xu, “Wavefront Sensing Based on Partially Occluded and Extended Scene Target,” IEEE Photonics J. 9, 1–8 (2017).

18. S. Komatsu, A. Markman, A. Mahalanobis, K. Chen, and B. Javidi, “Three-dimensional integral imaging and object detection using long-wave infrared imaging,” Appl. Opt. 56(9), D120–D126 (2017). [CrossRef]   [PubMed]  

19. H. Hua and B. Javidi, “A 3D integral imaging optical see-through head-mounted display,” Opt. Express 22(11), 13484–13491 (2014). [CrossRef]   [PubMed]  

20. J.-S. Jang and B. Javidi, “Three-dimensional integral imaging of micro-objects,” Opt. Lett. 29(11), 1230–1232 (2004). [CrossRef]   [PubMed]  

21. M. Levoy, R. Ng, A. Adams, M. Footer, and M. Horowitz, “Light field microscopy,” ACM Trans. Graph. 25(3), 924–934 (2006). [CrossRef]  

22. M. Levoy, Z. Zhang, and I. McDowall, “Recording and controlling the 4D light field in a microscope using microlens arrays,” J. Microsc. 235(2), 144–162 (2009). [CrossRef]   [PubMed]  

23. Y.-T. Lim, J.-H. Park, K.-C. Kwon, and N. Kim, “Resolution-enhanced integral imaging microscopy that uses lens array shifting,” Opt. Express 17(21), 19253–19263 (2009). [CrossRef]   [PubMed]  

24. A. Llavador, E. Sánchez-Ortiga, J. C. Barreiro, G. Saavedra, and M. Martínez-Corral, “Resolution enhancement in integral microscopy by physical interpolation,” Biomed. Opt. Express 6(8), 2854–2863 (2015). [CrossRef]   [PubMed]  

25. A. Llavador, J. Sola-Pikabea, G. Saavedra, B. Javidi, and M. Martínez-Corral, “Resolution improvements in integral microscopy with Fourier plane recording,” Opt. Express 24(18), 20792–20798 (2016). [CrossRef]   [PubMed]  

26. K.-C. Kwon, J.-S. Jeong, M.-U. Erdenebat, Y.-L. Piao, K.-H. Yoo, and N. Kim, “Resolution-enhancement for an orthographic-view image display in an integral imaging microscope system,” Biomed. Opt. Express 6(3), 736–746 (2015). [CrossRef]   [PubMed]  

27. M. Broxton, L. Grosenick, S. Yang, N. Cohen, A. Andalman, K. Deisseroth, and M. Levoy, “Wave optics theory and 3-D deconvolution for the light field microscope,” Opt. Express 21(21), 25418–25439 (2013). [CrossRef]   [PubMed]  

28. X. Lin, J. Wu, G. Zheng, and Q. Dai, “Camera array based light field microscopy,” Biomed. Opt. Express 6(9), 3179–3189 (2015). [CrossRef]   [PubMed]  

29. L. Cong, Z. Wang, Y. Chai, W. Hang, C. Shang, W. Yang, L. Bai, J. Du, K. Wang, and Q. Wen, “Rapid whole brain imaging of neural activity in freely behaving larval zebrafish (Danio rerio),” eLife 6, e28158 (2017). [CrossRef]   [PubMed]  

30. M. Pluta, Advanced Light Microscopy. Principles and Basic Properties (Elsevier, 1988).

31. S.-H. Hong, J.-S. Jang, and B. Javidi, “Three-dimensional volumetric object reconstruction using computational integral imaging,” Opt. Express 12(3), 483–491 (2004). [CrossRef]   [PubMed]  

32. H. Navarro, E. Sánchez-Ortiga, G. Saavedra, A. Llavador, A. Dorado, M. Martínez-Corral, and B. Javidi, “Non-homogeneity of lateral resolution in integral imaging,” J. Disp. Technol. 9(1), 37–43 (2013). [CrossRef]  

Supplementary Material (3)

Visualization 1: Multi-view images of a bio sample
Visualization 2: Multi-view images of a bio sample
Visualization 3: Refocusing of depth images from multi-view images



Figures (7)

Fig. 1. Schematic layout of Fourier integral microscopy (FIMic). A collection of orthographic views (or EIs) is directly obtained. ROP is the reference object plane; RL1 and RL2 are the relay lenses, and CCD is the digital sensor.

Fig. 2. Central view provided by the two microscopes, both operating at a resolution limit of about 6.20 μm. The label Z indicates the distance from the ROP. With both microscopes providing the same resolution at the ROP, the DOF extends up to Z = +110 μm for the FIMic, but only up to Z = +40 μm for the IMic.

Fig. 3. Central view provided by both microscopes when operating with the same DOF. The FIMic provides better resolution at the edge of its DOF than the IMic at its ROP.

Fig. 4. Microimages and EIs for IMic. Panel (a) shows the microimages recorded directly by the digital sensor. Panel (b) is the complete set of EIs computed from the microimages. In this experiment each microimage was formed by 17 × 17 pixels; hence the same number of EIs is computed. Zoomed-in areas are magnified by a factor of four.

Fig. 5. Elemental images directly recorded in FIMic. The available area of the digital sensor is used to record the $N = \delta_T f_1 / (p\, f_2)$ elemental images.

Fig. 6. Refocusing capabilities of FIMic vs. IMic. FIMic provides better resolution along the entire DOF.

Fig. 7. Frame extracted from Visualization 1, a video of a marine nematode swimming through algae. Panel (a) shows the directly recorded EIs. Panel (b) is a three-fold zoom of the central EI. The red arrow points to the nematode.
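The Fig. 2 caption quotes a DOF reaching Z = +110 μm for FIMic versus Z = +40 μm for IMic; the 2.75-fold DOF extension claimed in the abstract is simply the ratio of these two figures. A quick check, using only the values quoted in the caption:

```python
# DOF limits quoted in the Fig. 2 caption, in micrometers from the ROP
dof_fimic_um = 110
dof_imic_um = 40

# Their ratio reproduces the 2.75-fold DOF extension reported in the abstract
gain = dof_fimic_um / dof_imic_um
print(gain)  # 2.75
```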

Equations (10)


\[ \mathrm{FOV} = 2\,\frac{f_{MO}\,f_2}{f_1}\,NA_{MLA}, \]
\[ \delta = N_T\,p\,\frac{f_2}{f_1} \]
\[ \rho_{EI} = \max\!\left\{ \frac{N\,\lambda}{2\,NA_H},\; 2\,\delta\,\frac{f_2\,f_{MO}}{f_L\,f_1} \right\}. \]
\[ \rho_{EI} = \frac{\lambda}{2\,NA_H}\,N. \]
\[ \mathrm{DOF}_{EI} = \frac{\lambda\,N^2}{NA_H^2} + \frac{\delta\,N}{NA_H}\,\frac{f_2\,f_{MO}}{f_L\,f_1}. \]
\[ \mathrm{DOF}_{EI} = \frac{5}{4}\,\frac{\lambda\,N^2}{NA_H^2}. \]
\[ \rho_{View} = \max\!\left\{ \frac{\lambda}{2\,NA_H},\; \frac{2\,p}{M_H} \right\} = \mu\,\frac{\lambda}{NA_H}. \]
\[ \mathrm{DOF}_{View} = \frac{\lambda}{NA_H^2} + \frac{\mu\,p}{M_H\,NA_H} = \left(1 + \frac{\mu^2}{2}\right)\frac{\lambda}{NA_H^2}. \]
\[ \rho_{EI} = \frac{N}{2\,\mu}\,\rho_{View} \qquad\text{and}\qquad \mathrm{DOF}_{EI} = \frac{5\,N^2}{4 + 2\,\mu^2}\,\mathrm{DOF}_{View}. \]
\[ Z_R = n\,\frac{f_{MO}^2}{f_L}\left(\frac{f_2}{f_1}\right)^{2}\frac{\delta}{p}, \]
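The resolution and DOF expressions above tie the elemental-image quantities to the view quantities through the factors N/(2μ) and 5N²/(4 + 2μ²). A minimal numerical sketch, with illustrative parameter values assumed here (not the paper's experimental settings), confirms that those closing ratios are consistent with the individual ρ and DOF formulas:

```python
import math

# Illustrative parameters (assumed for this sketch; not the paper's values)
lam = 0.55e-6        # wavelength (m)
NA = 0.4             # host-objective numerical aperture
N = 7                # number of microlenses across the aperture-stop image
mu = 1.0             # pixelation factor in the view-resolution formula

NA_H = NA / N        # numerical aperture associated with each sub-aperture

# Diffraction-limited elemental-image resolution and DOF
rho_EI = lam * N / (2 * NA_H)
dof_EI = (5 / 4) * lam * N**2 / NA_H**2

# View resolution and DOF in conventional integral microscopy
rho_view = mu * lam / NA_H
dof_view = (1 + mu**2 / 2) * lam / NA_H**2

# The closing pair of relations states
#   rho_EI  = (N / (2 mu)) * rho_view
#   dof_EI  = (5 N^2 / (4 + 2 mu^2)) * dof_view
# Verify both numerically for the parameters chosen above:
assert math.isclose(rho_EI / rho_view, N / (2 * mu))
assert math.isclose(dof_EI / dof_view, 5 * N**2 / (4 + 2 * mu**2))
```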
