Abstract

The imaging principles and phenomena of the integral imaging technique have been studied in detail using geometrical optics, wave optics, and light field theory. However, most of the conclusions apply only to integral imaging systems under diffused illumination. In this work, a twin imaging phenomenon and its mechanism are reported in a non-diffused illumination reflective integral imaging system. Interactive twin images, comprising a real and a virtual 3D image of one object, can be activated in the system. The phenomenon is similar to the conjugate imaging effect of holography, but it is based on refraction and reflection instead of diffraction. The imaging characteristics and mechanisms, which differ from those of traditional integral imaging, are deduced analytically. Thin film integral imaging systems with a thickness of 80 μm have also been fabricated to verify the imaging phenomenon. Vivid, lighting-interactive twin 3D images have been realized using a light-emitting diode (LED) light source; when the LED moves, the twin 3D images move synchronously. This interesting phenomenon shows good application prospects in interactive 3D display, augmented reality, and security authentication.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Light field display is a promising three-dimensional (3D) display technology. It stores, indexes, and outputs the light rays emitted by 3D images, giving the observer both the intensity and the directional information of the 3D scene. Light field display is a kind of true 3D display that provides a natural 3D experience. Integral imaging (II), originally proposed by Lippmann in 1908 [1], was the first technology to record and display the light field using a micro-lens array (MLA) and elemental images (EIs). Since then, the image quality of II has been improved remarkably [2,3]. High resolution, wide view angle, and large depth of field (DOF) 3D images have been realized [4–7]. Recently, see-through II systems using multiple MLAs or holographic optical elements were proposed for augmented reality applications [8–12]. A lighting-responsive II system that responds to the illumination environment automatically has also been demonstrated [13]. The applications of II continue to spread.

The imaging principles and phenomena of the II technique seem clear. The imaging rules have been studied in detail using geometrical optics, wave optics, and light field theory [14–22]. Imaging phenomena accompanying the II technique, including the pseudoscopic imaging effect, the crosstalk imaging effect, the aliasing effect, and the multi-view-zone effect, have been either suppressed or exploited to optimize 3D performance. The pseudoscopic and crosstalk imaging effects have been eliminated using optical or digital methods [23–26]. The aliasing effect has been used to optimize depth information [22,27], and the multi-view-zone effect has been used to expand the view angle [25,28]. However, almost all of these principles and phenomena have been observed in diffused illumination II systems.

In this work, different from the imaging phenomena of the II technique mentioned above, a twin imaging phenomenon has been observed in a non-diffused illumination reflective II system. Two copies of a 3D image are activated using one set of EIs and one MLA: one copy is a real 3D image and the other is a virtual 3D image of the same 3D object. The phenomenon is similar to the conjugate imaging effect of holography, but its mechanism is based on refraction and reflection instead of diffraction. In such a system, two rays with different directions pass through each pixel of the EIs and synthesize the twin 3D images, respectively. In addition, the twin 3D images respond to the illumination condition: when the light source moves, the twin 3D images move synchronously. The imaging characteristics and mechanisms are first deduced analytically, and twin imaging II systems are then fabricated to verify the imaging phenomenon.

2. Twin imaging phenomenon of II

The twin imaging phenomenon of II is illustrated in Fig. 1. A reflective MLA with a focal length of f is used in the II system. The EIs are a star array that couples with the MLA to form the 3D images. A point light source illuminates the system. Under this illumination condition, an image array of the point light source is focused near the focal plane of the MLA. The EIs are situated in a plane between the MLA and the image array of the light source. Light rays emitted from the light source pass through the EIs twice: first when they depart from the light source and penetrate the EIs directly, and second when the penetrating rays are reflected from the MLA and go through the EIs again. These rays are focused at the image point of the light source and then diverge to synthesize the 3D images. For a specific light ray, the two penetration points in the EIs are different. For a specific pixel in the EIs, two rays from different directions penetrate it: one synthesizes a real 3D image, and the other synthesizes a virtual 3D image. When the illumination direction changes, the directions of both rays passing through the pixels change, and the positions of the twin 3D images vary. When the light illuminates the system obliquely, the twin images separate in 3D space.

 

Fig. 1 Twin imaging phenomenon of II.


3. Imaging mechanism of the twin imaging phenomenon

The quantitative mechanism of the twin imaging phenomenon is shown in Fig. 2. To simplify the analysis, a cross section of the II system in the XZ plane is considered. The Z axis of the coordinate system coincides with the symmetry axis of the imaging system, and the XY plane coincides with the image plane of the light source. The point light source is located at an axial distance ZL from the X axis and a transverse distance XL from the Z axis. The EIs and the MLA are located at distances g and F from the X axis, respectively. The periods of the EIs and the MLA are Pe and Pl, respectively. A real 3D image appears at an axial distance Z’I from the X axis and a transverse distance X’I from the Z axis, as marked in red. Meanwhile, a virtual 3D image appears at an axial distance ZI from the X axis and a transverse distance XI from the Z axis, as marked in blue. For a specific pixel D at the center of the central elemental image, the red light ray departs from the light source, penetrates the pixel, is reflected by the MLA, passes through the EIs and the imaging point B of the light source in succession, and finally converges at the real 3D image. For the same pixel D, another (blue) light ray from the light source passes through the EIs, is reflected by the MLA, penetrates the pixel, and then intersects point B; the extension line of this ray synthesizes the virtual 3D image. Likewise, for a specific pixel C in the nearby Nth elemental image, the red and blue light rays passing through the pixel synthesize the same real and virtual 3D image points, respectively, as pixel D. For calculational convenience, pixel C is chosen such that the incident angle of the red light ray passing through it equals the reflection angle of the red light ray reflected from the MLA. According to the geometric relations between the light source and the 3D images, the relations between ZL and ZI, and between ZL and Z’I, are given as follows:

\[ Z_I=\frac{g}{1-\dfrac{\left(Z_L+F\right)P_e}{P_l\left(Z_L+2F\right)}} \tag{1} \]

\[ Z_I'=\frac{F}{1-\dfrac{\left[2\left(Z_L+2F\right)P_l-\left(Z_L+F\right)P_e\right]\left(Z_L+2F\right)}{\left(Z_L+g\right)P_l\left(Z_L+F\right)}} \tag{2} \]
Relations between parameters XL and XI, or XL and X’I are given as follows:

 

Fig. 2 Geometric relationships between the light source and the twin 3D images.


\[ X_I=\frac{\left(Z_I-g\right)F}{g\left(Z_L+F\right)}X_L \tag{3} \]
\[ X_I'=\frac{N P_l F Z_I'\left(Z_L+F\right)\left(1+F\right)-N P_e F\left(Z_L+g\right)\left(Z_I'+F\right)}{\left(Z_I'+F+1\right)\left(F+Z_L\right)}X_L \tag{4} \]
\[ N=\frac{Z_I'\left(Z_I'+F\right)\left(Z_L+g\right)F X_L}{\left[\left(2F+Z_I'+Z_L\right)F P_l+P_l\left(Z_L+F\right)Z_I'\right]\left(Z_L+g\right)\left(Z_I'+F\right)-\left(Z_L+F\right)^2 F P_e Z_I'} \tag{5} \]

Equations (1) and (2) show that the Z axis coordinates of the twin 3D images are different; the twin 3D images are therefore separated along the Z axis. These coordinates are functions of the illumination parameter ZL and of the structure parameters g, f, Pe, and Pl, and they are independent of the transverse position of the light source. Equations (3)–(5) show that the X axis coordinates of the twin 3D images are functions of the illumination parameters XL and ZL, the structure parameters g, f, Pe, and Pl, and the image parameters ZI and Z’I. For a given imaging system the structure parameters are fixed, so the positions of the twin 3D images are functions of the light source’s position alone. As an example, II systems with the following structure parameters are studied: Pe = 81 μm, Pl = 80 μm, f = 100 μm, and g = 40 μm, 60 μm, and 80 μm, respectively. Figure 3 shows how the Z axis positions of the twin 3D images change with the Z axis position of the light source. For the real 3D image Z’I is positive, and for the virtual 3D image ZI is negative. When the light source moves away from the imaging system, the twin images move closer to the II system and the distance between the real and virtual 3D images decreases. When the light source’s axial position satisfies ZL < 100 mm, the positions of the twin 3D images vary rapidly; when ZL > 100 mm, they vary slowly and tend toward a steady state. In practice, the light source’s position usually satisfies ZL > 100 mm, so the twin 3D images are separated along the Z axis and the distance between the real and virtual images is relatively steady. Figure 3 also shows that the virtual 3D image’s position is more sensitive to the structural parameter g.
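As a rough numerical sketch of this behavior, one reading of Eqs. (1) and (2) can be evaluated directly. The units (mm) and the value F = 0.1 mm (taken equal to the focal length f) are illustrative assumptions, not values asserted by the analysis above:

```python
# Hedged sketch: axial twin-image positions vs. light-source distance,
# using one reconstructed reading of Eqs. (1) and (2).
# Assumed parameters: Pe = 0.081 mm, Pl = 0.080 mm, F = 0.1 mm.
Pe, Pl, F = 0.081, 0.080, 0.1  # mm

def z_virtual(ZL, g):
    """Eq. (1): Z_I = g / (1 - Pe*(ZL+F) / (Pl*(ZL+2F))); negative -> virtual."""
    return g / (1 - Pe * (ZL + F) / (Pl * (ZL + 2 * F)))

def z_real(ZL, g):
    """Eq. (2): Z'_I = F / (1 - [2(ZL+2F)Pl - (ZL+F)Pe](ZL+2F) / ((ZL+g)Pl(ZL+F)))."""
    ratio = (2 * (ZL + 2 * F) * Pl - (ZL + F) * Pe) * (ZL + 2 * F) \
            / ((ZL + g) * Pl * (ZL + F))
    return F / (1 - ratio)

for ZL in (50, 100, 250, 1000):  # light-source axial distance, mm
    print(ZL, round(z_virtual(ZL, 0.06), 3), round(z_real(ZL, 0.06), 3))
```

Consistent with Fig. 3, the sketch yields a negative ZI and a positive Z’I, and both saturate toward steady values as ZL grows beyond about 100 mm.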

 

Fig. 3 Z axis positions of the twin 3D images vs. Z axis positions of the light source.


Figure 4 shows how the X axis positions of the twin 3D images change with the X axis position of the light source. The X axis coordinates of the twin 3D images increase nearly linearly with the X axis coordinate of the light source, and the real and virtual 3D images move in opposite directions. In addition, the parameters X’I and XI are restricted by the view angle of the II system. The view angle 2θ of the proposed II system is similar to that of a traditional II system, where 2θ = 2atan(Pl/2F) [16]. The maximum values of X’I and XI are given by X’Imax = Z’I tan(θ) and XImax = ZI tan(θ), respectively. By geometrical symmetry, within the view angle of the system, off-axis illumination along the Y axis separates the twin 3D images in the Y direction.
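A minimal sketch of these bounds, assuming Pl = 80 μm and F = 100 μm as in the example system, and taking the image depths 5 mm and −1.5 mm quoted later in Section 4 purely as sample inputs:

```python
import math

# View angle: 2*theta = 2*atan(Pl / (2*F)); lateral limits X'Imax = Z'I*tan(theta),
# XImax = ZI*tan(theta). Parameters are assumed example values, in mm.
Pl, F = 0.080, 0.1
theta = math.atan(Pl / (2 * F))          # half view angle, radians
print(math.degrees(2 * theta))           # full view angle, degrees

Z_real, Z_virtual = 5.0, -1.5            # sample twin-image depths, mm
print(Z_real * math.tan(theta), Z_virtual * math.tan(theta))
```

With these numbers the view angle comes out near 44 degrees, so the real image at 5 mm depth can shift laterally by about 2 mm before leaving the view zone.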

 

Fig. 4 X axis positions of the twin 3D images vs. X axis positions of the light source.


Section 4 describes the fabrication of the II system. An II system with structure parameters Pe = 81 μm, Pl = 80 μm, f = 100 μm, and g = 60 μm was made. An LED was used as a point light source to illuminate the II system, and the coordinates of the twin images were measured. For the real image, a ground glass was used to capture the image and locate its Z axis coordinate; the Z axis coordinate of the virtual image is not given because of the difficulty of locating it. To measure the X axis coordinates of the twin images, the Z axis coordinate of the LED was set to 250 mm and the LED was then moved along the X axis. As shown in Fig. 3 and Fig. 4, the measured coordinates of the images are in good agreement with the theoretical values.

The display structure and the point light source enable the II system to show the twin 3D images, and a moving point light source results in moving twin 3D images. The position of the point light source is restricted by the view angle of the system: the parameters ZL and XL of the point light source should keep the 3D images within the system’s view angle. The DOF of the proposed II system, DDOF, is similar to that of a traditional II system, where DDOF = 2F(Pl / PEIs) and PEIs is the pixel size of the EIs. The focal length of the MLA affects the DOF, the view angle, and the imaging position of the II system. For a fixed elemental lens aperture and fixed EIs, a larger focal length means a deeper DOF and a more distant image position, but a narrower view angle.
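The quoted DOF relation is easy to evaluate. Assuming the example values F = 100 μm, Pl = 80 μm, and the 2 μm EI pixel size reported for the fabricated sample:

```python
# Rough depth-of-field estimate from D_DOF = 2*F*(Pl / P_EIs).
# All parameters are assumed example values, in mm.
F, Pl, P_EIs = 0.1, 0.080, 0.002
D_DOF = 2 * F * (Pl / P_EIs)
print(D_DOF)  # mm
```

Under these assumptions the display DOF is on the order of 8 mm, comfortably covering the few-millimeter image depths discussed above.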

All the calculations in Eqs. (1)–(5) are based on geometric optics, and the conclusions are limited to imaging systems with negligible diffused and diffracted light from the EIs. In practice, a low-scattering ink is preferred for printing the EIs on a highly transparent substrate material. There is a shading effect between the twin images: because each ray penetrates the EIs at two different points consecutively, the pixel at the second point can shade the pixel at the first point. To get a clear twin imaging effect, EIs with a simple pattern and a transparent background are preferred. The light source in the proposed II system is limited to a point light source or a collimated light source; if an extended light source is used, the twin 3D images are blurred. The collimated light source is a special case of the point light source, since light emitted from a point source at infinity is collimated. In this condition, the parameter ZL in Eqs. (1)–(5) tends to infinity, and these equations simplify as follows:

\[ Z_I=\frac{g P_l}{P_l-P_e} \tag{6} \]

\[ Z_I'=\frac{F P_l}{P_e-P_l} \tag{7} \]

\[ X_I=\left(\frac{Z_I}{g}-1\right)F\tan\alpha \tag{8} \]

\[ X_I'=Z_I'\tan\alpha \tag{9} \]

\[ N=\frac{F\tan\alpha}{P_l} \tag{10} \]
where α is the incident angle of the collimated light. According to Eqs. (6) and (7), the Z coordinates of the twin images are fixed under collimated illumination, while the X coordinates of the twin images are functions of the incident angle. Equation (7) shows that, in this condition, the Z coordinate of the real 3D image is the same as that of the real 3D image of a conventional II system.
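As a consistency sketch, Eqs. (6) and (7) should be the ZL → ∞ limit of Eqs. (1) and (2). The snippet below evaluates one reconstructed reading of the finite-distance forms at a very large ZL and compares them with the closed-form limits; the parameter values (in mm) are the example assumptions from Section 3:

```python
# Check that the finite-distance forms approach the collimated-light limits.
# Assumed parameters: Pe = 0.081, Pl = 0.080, F = 0.1, g = 0.06 (mm).
Pe, Pl, F, g = 0.081, 0.080, 0.1, 0.06
ZL = 1e6  # effectively collimated illumination

ZI_finite = g / (1 - Pe * (ZL + F) / (Pl * (ZL + 2 * F)))   # Eq. (1)
ZI_limit = g * Pl / (Pl - Pe)                               # Eq. (6)

num = (2 * (ZL + 2 * F) * Pl - (ZL + F) * Pe) * (ZL + 2 * F)
ZpI_finite = F / (1 - num / ((ZL + g) * Pl * (ZL + F)))     # Eq. (2)
ZpI_limit = F * Pl / (Pe - Pl)                              # Eq. (7)

print(ZI_finite, ZI_limit, ZpI_finite, ZpI_limit)
```

With these assumptions the finite-distance values converge to the limits (about −4.8 mm and +8 mm respectively), with the expected signs: virtual image negative, real image positive.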

4. Fabrication and the experiments of the twin imaging II system

Thin film II systems were fabricated to verify the imaging mechanism, following the steps shown in Figs. 5(a)–5(g). First, micro-patterns were made in photoresist using UV photolithography. The patterned photoresist for the MLA was then thermally reflowed to form the MLA. Next, nickel master molds of the MLA and EIs were made by electroplating. After that, the MLA and EIs molds were aligned and imprinted on both sides of a PET substrate using UV imprinting. Nano ink with sub-micron pigment particles was filled into the grooves of the EIs using a scraping knife system. Finally, an aluminum film was deposited on the surface of the MLA to complete the II system.

 

Fig. 5 Fabrication processes of the twin imaging II system. (a) UV photolithography, (b) Micro-patterns in photoresist after developing, (c) thermal reflow to form the MLA, (d) electroplating to make the MLA and EIs nickel master mold, (e) double side UV imprinting to duplicate and integrate the MLA and EIs, (f) scraping knife system to fill the nano ink in the grooves of the EIs, (g) deposition of a reflective layer.


The fabricated system consists of a refractive MLA with a 50 μm pitch and a 100 μm focal length. The EIs are a star array with a 51 μm pitch and a resolution up to 12,000 dpi. Each groove unit filled with nano ink acts as a pixel of the EIs. The pixel feature size of the EIs is 2.0 μm, and each elemental lens covers about 625 pixels. Figure 6 shows the 3D surface profiles of the fabricated MLA and EIs. Blue nano ink is filled into the grooves of the EIs using a scraping knife system similar to the gravure printing process. The thicknesses of the PET film (Toyobo, A4300) and of the II system are 50 μm and about 80 μm, respectively. A 100 nm thick aluminum layer was coated on the MLA layer.
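The quoted numbers can be cross-checked with simple arithmetic: a 50 μm lens pitch over 2 μm pixels gives (50/2)² pixels per lens, and 12,000 dots per inch corresponds to a dot spacing of 25,400/12,000 μm:

```python
# Cross-check of the fabrication figures quoted above.
lens_pitch_um, pixel_um = 50.0, 2.0
pixels_per_lens = (lens_pitch_um / pixel_um) ** 2   # pixels under one lenslet
dot_um = 25400 / 12000                              # dot spacing at 12,000 dpi, um
print(pixels_per_lens, dot_um)
```

This reproduces the "about 625 pixels" figure and shows that 12,000 dpi (about 2.1 μm per dot) is consistent with the 2.0 μm feature size.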

 

Fig. 6 3D surface profile of the fabricated MLA and EIs. (a) MLA, (b) EIs.


Figure 7 shows the performance of the fabricated II system under diffused illumination. The diffused light was obtained from a diffusive surface light source and the ambient light in a room. An array of stars floats in the 3D space above the II system. The 3D stars are slightly blurred owing to the defocus setting of the EIs, which are set in a plane about 70 μm from the MLA. To activate the twin imaging phenomenon, an LED with an emitter size of 2 mm × 2 mm, built into a mobile phone, was used as a point light source to illuminate the II system. As shown in Visualization 1 (Multimedia view), when the LED is turned on, vivid real and virtual 3D images are separated in 3D space. When the LED gradually moves around the II system, the twin 3D images move synchronously, as predicted by Eqs. (3)–(5). In the experiments, the LED’s Z axis coordinate is around 250 mm, which meets the condition that the Z axis distance between the twin 3D images is steady as the LED’s position varies along the Z axis. Figure 8 shows two frames excerpted from Visualization 1, captured when the LED illuminates the II system from the top left and top right corners, respectively. Figure 8 and Visualization 1 show clearly that the twin images are separated in a plane. To prove that the twin images are also separated along the Z axis, a diffuser was inserted at the real image plane of the II system. If the twin images were in the same plane, both images would be clear; if they are in different planes, the image out of the diffuser plane is blurred. Figure 9 shows that only one image is captured in the diffuser area. When the diffuser is moved along the Z axis within the real image field of the II system, only one clear image is captured at a time. These facts indicate that the twin images are separated in 3D space, and that one of the twin images is real while the other is virtual.
The images and video were recorded with a mobile phone camera with an F-number of 2.4 and an effective focal length of 3.8 mm. The distance between the camera and the II system is about 300 mm, and the Z axis distance between the twin images is about 6.5 mm. The twin images are within the DOF of the camera, so the captured twin images are all in focus at the same time.
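A back-of-envelope depth-of-field check supports this claim. The standard thin-lens DOF formulas are used below; the circle-of-confusion value c = 0.002 mm is an assumed figure for a small phone sensor, not a value given in the text:

```python
# Hedged DOF estimate for the capture condition: f = 3.8 mm, F-number 2.4,
# subject distance ~300 mm. c (circle of confusion) is an assumption.
f, N, c, s = 3.8, 2.4, 0.002, 300.0   # all in mm
H = f * f / (N * c) + f               # hyperfocal distance
near = s * (H - f) / (H + s - 2 * f)  # near limit of sharp focus
far = s * (H - f) / (H - s)           # far limit of sharp focus
print(near, far, far - near)          # total DOF in mm
```

Under this assumption the camera DOF at 300 mm is several tens of millimeters, far larger than the 6.5 mm twin-image separation, so both images can indeed be in focus simultaneously.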

 

Fig. 7 Performance of the fabricated II system illuminated by the diffused light.


 

Fig. 8 Two frames excerpted from Visualization 1 show the twin images of the II system.


 

Fig. 9 One clear image is captured in the diffuser area.


According to Visualization 1, the movement range of the real 3D image seems larger than that of the virtual image. The reasons are as follows: (i) the Z axis coordinates of the twin images are not symmetrical about the plane of the II system, according to Eqs. (1) and (2); (ii) the variation of the twin images’ X axis coordinates is not symmetrical about the axis of the II system, according to Eqs. (3) and (4); (iii) according to Eqs. (1) and (2), the Z axis positions of the twin 3D images are ZI = −1.5 mm and Z’I = 5 mm, respectively. The virtual image is closer to the II system than the real image, so its movement range is smaller.

5. Discussion

The fabricated II systems showing vivid twin 3D images demonstrate the twin imaging phenomenon and mechanism of II. The fabrication process shown in Fig. 5 is one method of making the twin imaging II system. Its main advantage is that an ultra-compact II system, dozens of micrometers thick, can be made using ultra-high resolution EIs. The imprinting process is the key technique for making EIs with a resolution up to 12,000 dpi; such high resolution images cannot be made using conventional display devices or printing methods. The intriguing interactive imaging phenomenon, compact structure, and fabrication difficulty of the proposed II system make it especially suited to security authentication and 3D decoration applications. If the thickness of the II system can be reduced to 50 μm, it can potentially be integrated into banknotes, currency, or other valuable documents and products as a security device. The security feature can be verified using a point light source such as the LED built into a smart phone, which is popular worldwide.

The twin imaging II system can also be made using a conventional non-emissive display panel such as a liquid crystal display (LCD). The system structure is the same as that in Fig. 1; different from the system in Fig. 7, an LCD is used to show the EIs. The resolution of the LCD is relatively low, but the EIs can be updated dynamically. Such dynamic twin imaging II systems may find application in interactive 3D display and augmented reality. On the other hand, the mechanism of the twin imaging effect can also help to avoid the twin imaging effect when a normal display mode is required in a reflective II system.

6. Conclusion

In summary, a twin imaging phenomenon has been observed in a reflective II system. Illuminated by a point light source, vivid twin images, including a real and a virtual 3D image of one object, can be activated. The positions of the twin 3D images respond to the illumination condition synchronously. The imaging characteristics and mechanisms are given analytically, which expands the imaging principles of the II technique. In addition, thin film twin imaging II systems with a thickness of 80 μm have been made to verify the imaging phenomenon, and interactive twin 3D images have been realized. This interesting phenomenon has potential applications in interactive 3D display, augmented reality, security authentication, and so on.

Funding

Science Startup Fund of Zhejiang Sci-Tech University (17062063-Y, 17062061-Y); Natural Science Foundation of Zhejiang Province (LQ18B040002).

References and links

1. G. Lippmann, “La photographie intégrale,” C. R. Acad. Sci. 146, 446–451 (1908).

2. S. Park, J. Yeom, Y. Jeong, N. Chen, J.-Y. Hong, and B. Lee, “Recent issues on integral imaging and its applications,” J. Inf. Disp. 15(1), 37–46 (2014).

3. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications,” Appl. Opt. 52(4), 546–560 (2013).

4. H. Liao, M. Iwahara, N. Hata, and T. Dohi, “High-quality integral videography using a multi-projector,” Opt. Express 12(6), 1067–1076 (2004).

5. B. Lee, S. Jung, and J.-H. Park, “Viewing-angle-enhanced integral imaging by lens switching,” Opt. Lett. 27(10), 818–820 (2002).

6. Y. Takaki and N. Nago, “Multi-projection of lenticular displays to construct a 256-view super multi-view display,” Opt. Express 18(9), 8824–8835 (2010).

7. H. Liao, M. Iwahara, Y. Katayama, N. Hata, and T. Dohi, “Three-dimensional display with a long viewing distance by use of integral photography,” Opt. Lett. 30(6), 613–615 (2005).

8. Y. Takaki and Y. Yamaguchi, “Flat-panel see-through three-dimensional display based on integral imaging,” Opt. Lett. 40(8), 1873–1876 (2015).

9. Y. Yamaguchi and Y. Takaki, “See-through integral imaging display with background occlusion capability,” Appl. Opt. 55(3), A144–A149 (2016).

10. G. Li, D. Lee, Y. Jeong, J. Cho, and B. Lee, “Holographic display for see-through augmented reality using mirror-lens holographic optical element,” Opt. Lett. 41(11), 2486–2489 (2016).

11. C. Jang, C.-K. Lee, J. Jeong, G. Li, S. Lee, J. Yeom, K. Hong, and B. Lee, “Recent progress in see-through three-dimensional displays using holographic optical elements,” Appl. Opt. 55(3), A71–A85 (2016).

12. X. Shen and B. Javidi, “Large depth of focus dynamic micro integral imaging for optical see-through augmented reality display using a focus-tunable lens,” Appl. Opt. 57(7), B184–B189 (2018).

13. Y. Lou and J. Hu, “Passive lighting responsive three-dimensional integral imaging,” Opt. Commun. 402, 498–501 (2017).

14. C. B. Burckhardt, “Optimum parameters and resolution limitation of integral photography,” J. Opt. Soc. Am. 58(1), 71–76 (1968).

15. T. Okoshi, “Optimum design and depth resolution of lens-sheet and projection-type three-dimensional displays,” Appl. Opt. 10(10), 2284–2291 (1971).

16. S.-W. Min, J. Kim, and B. Lee, “New characteristic equation of three-dimensional integral imaging system and its applications,” Jpn. J. Appl. Phys. 44(2), 71–74 (2005).

17. J.-Y. Son, V. V. Saveljev, Y.-J. Choi, J.-E. Bahn, S.-K. Kim, and H.-H. Choi, “Parameters for designing autostereoscopic imaging systems based on lenticular, parallax barrier, and integral photography plates,” Opt. Eng. 42(11), 3326–3333 (2003).

18. B. Lee, J. Park, and H. Choi, “Scaling of three-dimensional integral imaging,” Jpn. J. Appl. Phys. 44(1), 216–224 (2005).

19. F. Okano, J. Arai, and M. Kawakita, “Wave optical analysis of integral method for three-dimensional images,” Opt. Lett. 32(4), 364–366 (2007).

20. M. Levoy and P. Hanrahan, “Light field rendering,” in Proceedings of ACM SIGGRAPH 96, 31–42 (1996).

21. A. Isaksen, L. McMillan, and S. Gortler, “Dynamically reparameterized light fields,” in Proceedings of ACM SIGGRAPH 2000, 297–306 (2000).

22. M. Zwicker, W. Matusik, F. Durand, and H. Pfister, “Antialiasing for automultiscopic 3D displays,” in Eurographics Workshop on Rendering, 73–82 (2006).

23. H. Navarro, R. Martínez-Cuenca, G. Saavedra, M. Martínez-Corral, and B. Javidi, “3D integral imaging display by smart pseudoscopic-to-orthoscopic conversion (SPOC),” Opt. Express 18(25), 25573–25583 (2010).

24. X. Xiao, X. Shen, M. Martinez-Corral, and B. Javidi, “Multiple-planes pseudoscopic-to-orthoscopic conversion for 3D integral imaging display,” J. Disp. Technol. 11(11), 921–926 (2015).

25. C. Luo, C. Ji, F. Wang, Y. Wang, and Q. Wang, “Crosstalk free integral imaging display with wide viewing angle using periodic black mask,” J. Disp. Technol. 8(11), 634–638 (2012).

26. Y. Wang, Q. Wang, D. Li, H. Deng, and C. Luo, “Crosstalk-free integral imaging display based on double plano-convex micro-lens array,” Chin. Opt. Lett. 11(6), 061101 (2013).

27. X. Wang, Q. Bu, and D. Zhang, “Method for quantifying the effects of aliasing on the viewing resolution of integral images,” Opt. Lett. 34(21), 3382–3384 (2009).

28. H. Choi, S.-W. Min, S. Jung, J.-H. Park, and B. Lee, “Multiple-viewing-zone integral imaging using a dynamic barrier array for three-dimensional displays,” Opt. Express 11(8), 927–932 (2003).

References

  • View by:
  • |
  • |
  • |

  1. G. Lippmann, “La photograhie integrale,” C. R. Acad. Sci. 146, 446–451 (1908).
  2. S. Park, J. Yeom, Y. Jeong, N. Chen, J.-Y. Hong, and B. Lee, “Recent issues on integral imaging and its applications,” J. Inf. Disp. 15(1), 37–46 (2014).
    [Crossref]
  3. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications,” Appl. Opt. 52(4), 546–560 (2013).
    [Crossref] [PubMed]
  4. H. Liao, M. Iwahara, N. Hata, and T. Dohi, “High-quality integral videography using a multi-projector,” Opt. Express 12(6), 1067–1076 (2004).
    [Crossref] [PubMed]
  5. B. Lee, S. Jung, and J.-H. Park, “Viewing-angle-enhanced integral imaging by lens switching,” Opt. Lett. 27(10), 818–820 (2002).
    [Crossref] [PubMed]
  6. Y. Takaki and N. Nago, “Multi-projection of lenticular displays to construct a 256-view super multi-view display,” Opt. Express 18(9), 8824–8835 (2010).
    [Crossref] [PubMed]
  7. H. Liao, M. Iwahara, Y. Katayama, N. Hata, and T. Dohi, “Three-dimensional display with a long viewing distance by use of integral photography,” Opt. Lett. 30(6), 613–615 (2005).
    [Crossref] [PubMed]
  8. Y. Takaki and Y. Yamaguchi, “Flat-panel see-through three-dimensional display based on integral imaging,” Opt. Lett. 40(8), 1873–1876 (2015).
    [Crossref] [PubMed]
  9. Y. Yamaguchi and Y. Takaki, “See-through integral imaging display with background occlusion capability,” Appl. Opt. 55(3), A144–A149 (2016).
    [Crossref] [PubMed]
  10. G. Li, D. Lee, Y. Jeong, J. Cho, and B. Lee, “Holographic display for see-through augmented reality using mirror-lens holographic optical element,” Opt. Lett. 41(11), 2486–2489 (2016).
    [Crossref] [PubMed]
  11. C. Jang, C.-K. Lee, J. Jeong, G. Li, S. Lee, J. Yeom, K. Hong, and B. Lee, “Recent progress in see-through three-dimensional displays using holographic optical elements,” Appl. Opt. 55(3), A71–A85 (2016).
    [Crossref] [PubMed]
  12. X. Shen and B. Javidi, “Large depth of focus dynamic micro integral imaging for optical see-through augmented reality display using a focus-tunable lens,” Appl. Opt. 57(7), B184–B189 (2018).
    [Crossref] [PubMed]
  13. Y. Lou and J. Hu, “Passive lighting responsive three-dimensional integral imaging,” Opt. Commun. 402, 498–501 (2017).
    [Crossref]
  14. C. B. Burckhardt, “Optimum parameters and resolution limitation of integral photography,” J. Opt. Soc. Am. 58(1), 71–76 (1967).
    [Crossref]
  15. T. Okoshi, “Optimum design and depth resolution of lens-sheet and projection-type three-dimensional displays,” Appl. Opt. 10(10), 2284–2291 (1971).
    [Crossref] [PubMed]
  16. S.-W. Min, J. Kim, and B. Lee, “New characteristic equation of three-dimensional integral imaging system and its applications,” Jpn. J. Appl. Phys. 44(2), 71–74 (2005).
    [Crossref]
  17. J.-Y. Son, V. V. Saveljev, Y.-J. Choi, J.-E. Bahn, S.-K. Kim, and H.-H. Choi, “Parameters for designing autostereoscopic imaging systems based on lenticular, parallax barrier, and integral photography plates,” Opt. Eng. 42(11), 3326–3333 (2003).
  18. B. Lee, J. Park, and H. Choi, “Scaling of three-dimensional integral imaging,” Jpn. J. Appl. Phys. 44(1), 216–224 (2005).
  19. F. Okano, J. Arai, and M. Kawakita, “Wave optical analysis of integral method for three-dimensional images,” Opt. Lett. 32(4), 364–366 (2007).
  20. M. Levoy and P. Hanrahan, “Light field rendering,” SIGGRAPH 96, 31–42 (1996).
  21. A. Isaksen, L. McMillan, and S. Gortler, “Dynamically reparameterized light fields,” in Proceedings of ACM SIGGRAPH 2000, Computer Graphics Proceedings, Annual Conference Series, 23(3), 297–306 (2000).
  22. M. Zwicker, W. Matusik, F. Durand, and H. Pfister, “Antialiasing for automultiscopic 3D displays,” in Rendering Techniques, Eurographics Workshop on Rendering, 13(7), 73–82 (2006).
  23. H. Navarro, R. Martínez-Cuenca, G. Saavedra, M. Martínez-Corral, and B. Javidi, “3D integral imaging display by smart pseudoscopic-to-orthoscopic conversion (SPOC),” Opt. Express 18(25), 25573–25583 (2010).
  24. X. Xiao, X. Shen, M. Martinez-Corral, and B. Javidi, “Multiple-planes pseudoscopic-to-orthoscopic conversion for 3D integral imaging display,” J. Disp. Technol. 11(11), 921–926 (2015).
  25. C. Luo, C. Ji, F. Wang, Y. Wang, and Q. Wang, “Crosstalk-free integral imaging display with wide viewing angle using periodic black mask,” J. Disp. Technol. 8(11), 634–638 (2012).
  26. Y. Wang, Q. Wang, D. Li, H. Deng, and C. Luo, “Crosstalk-free integral imaging display based on double plano-convex micro-lens array,” Chin. Opt. Lett. 11(6), 61–101 (2013).
  27. X. Wang, Q. Bu, and D. Zhang, “Method for quantifying the effects of aliasing on the viewing resolution of integral images,” Opt. Lett. 34(21), 3382–3384 (2009).
  28. H. Choi, S.-W. Min, S. Jung, J.-H. Park, and B. Lee, “Multiple-viewing-zone integral imaging using a dynamic barrier array for three-dimensional displays,” Opt. Express 11(8), 927–932 (2003).

Supplementary Material (1)

Visualization 1: The twin images, including a real 3D image and a virtual 3D image of one object, are activated using an LED point light source.

Figures (9)

Fig. 1. Twin imaging phenomenon of II.
Fig. 2. Geometrical relationships among the light source and the twin 3D images.
Fig. 3. Z-axis positions of the twin 3D images vs. Z-axis positions of the light source.
Fig. 4. X-axis positions of the twin 3D images vs. X-axis positions of the light source.
Fig. 5. Fabrication processes of the twin imaging II system: (a) UV photolithography; (b) micro-patterns in photoresist after developing; (c) thermal reflow to form the MLA; (d) electroplating to make the nickel master mold of the MLA and EIs; (e) double-sided UV imprinting to duplicate and integrate the MLA and EIs; (f) scraping-knife system to fill the nano ink into the grooves of the EIs; (g) deposition of a reflective layer.
Fig. 6. 3D surface profiles of the fabricated MLA and EIs: (a) MLA; (b) EIs.
Fig. 7. Performance of the fabricated II system illuminated by diffused light.
Fig. 8. Two frames excerpted from Visualization 1 showing the twin images of the II system.
Fig. 9. One clear image is captured in the diffuser area.

Equations (10)


$$ Z_I = \frac{g}{1 - \dfrac{(Z_L+F)\,P_e}{P_l\,(Z_L+2F)}} \tag{1} $$
$$ Z_I' = F\left[ 1 - \frac{\left[\,2(Z_L+2F)P_l - (Z_L+F)P_e\,\right](Z_L+F)}{(Z_L+2F)(Z_L+g)\,P_l} \right] \tag{2} $$
$$ X_I = \frac{(Z_I - g)\,F}{g\,(Z_L+F)}\, X_L \tag{3} $$
$$ X_I' = \frac{N P_l F Z_I' (Z_L+F)(1+F) - N P_e F (Z_L+g)(Z_I'+F)}{Z_I'} + \frac{F+1}{F+Z_L}\, X_L \tag{4} $$
$$ N = \frac{(Z_I'-1)(Z_I'+F)(Z_L+g)\,F\, X_L}{\left[(2F+Z_I'+Z_L)F P_l + P_l (Z_L+F) Z_I'\right](Z_L+g)(Z_I'+F) - (Z_L+F)^2 (F+1) P_e Z_I'} \tag{5} $$
$$ Z_I = \frac{g\,P_l}{P_l - P_e} \tag{6} $$
$$ Z_I' = \frac{F\,(P_l - P_e)}{P_l} \tag{7} $$
$$ X_I = \left(\frac{Z_I}{g} - 1\right) F \tan\alpha \tag{8} $$
$$ X_I' = Z_I' \tan\alpha \tag{9} $$
$$ N = \frac{F \tan\alpha}{P_l} \tag{10} $$
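The simplified relations in Eqs. (6)–(10) can be evaluated numerically. The sketch below is a minimal illustration, not the authors' code; the symbol meanings are assumptions inferred from the text (g: gap between the MLA and the EIs, F: micro-lens focal length, P_l: micro-lens pitch, P_e: elemental-image pitch, α: illumination angle), and the example parameter values are hypothetical.

```python
import math

def twin_image_positions(g, F, P_l, P_e, alpha_deg):
    """Evaluate the simplified twin-image relations, Eqs. (6)-(10).

    Symbol meanings are assumptions inferred from the paper:
    g         -- gap between the MLA and the EIs
    F         -- micro-lens focal length
    P_l       -- micro-lens pitch
    P_e       -- elemental-image pitch
    alpha_deg -- illumination angle, in degrees
    All lengths share one unit; outputs are in that same unit.
    """
    t = math.tan(math.radians(alpha_deg))
    Z_I = g * P_l / (P_l - P_e)        # Eq. (6): depth of the real 3D image
    Z_Ip = F * (P_l - P_e) / P_l       # Eq. (7): depth of the virtual 3D image
    X_I = (Z_I / g - 1) * F * t        # Eq. (8): lateral shift of the real image
    X_Ip = Z_Ip * t                    # Eq. (9): lateral shift of the virtual image
    N = F * t / P_l                    # Eq. (10): elemental-image index offset
    return Z_I, Z_Ip, X_I, X_Ip, N

# Hypothetical example (all lengths in micrometers):
Z_I, Z_Ip, X_I, X_Ip, N = twin_image_positions(
    g=100.0, F=80.0, P_l=50.0, P_e=49.0, alpha_deg=10.0)
print(Z_I)   # 5000.0 -- P_l close to P_e pushes the real image far from the MLA
```

Note how both lateral shifts, Eqs. (8) and (9), scale with tan(α): moving the LED changes α, so the twin 3D images move synchronously with the light source, as reported in the abstract.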
