Optica Publishing Group

High-performance reflection-type augmented reality 3D display using a reflective polarizer

Open Access

Abstract

We propose a high-performance reflection-type augmented reality (AR) 3D display based on a reflective polarizer (RP). The RP functions both as a reflective imaging device and as an image combiner that merges the real scene with the 3D images reconstructed by the integral imaging display unit. Benefiting from the flawless imaging of the RP, the proposed reflection-type AR system achieves high-definition 3D display. A prototype based on the proposed reflection-type AR structure was developed, and it presents good 3D display effects and reflection-type AR performance. The prototype is very compact, as thin as 3.4 mm, which makes it a potential candidate for stomatology and vehicle AR displays.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

Corrections

Qiang Li, Wei He, Huan Deng, Fei-Yan Zhong, and Yue Chen, "High-performance reflection-type augmented reality 3D display using reflective polarizer: erratum," Opt. Express 29, 13519-13519 (2021)
https://opg.optica.org/oe/abstract.cfm?uri=oe-29-9-13519

1. Introduction

Augmented reality (AR) display, which enriches the real world by overlaying virtual digital content or information of interest, is developed to help the observer concentrate on tasks more effectively and has received wide attention from researchers and industry [1,2]. At present, various three-dimensional (3D) display technologies have been applied to AR display to effectively combine real 3D objects and virtual 3D images, such as holographic display [3,4], volumetric display [5,6], super multi-view display [7,8] and integral imaging (II) display [9–11]. Among these 3D display techniques, the II display converts spatial information into angular information, which resolves the vergence-accommodation conflict (VAC). Furthermore, it is relatively compact and simple, with the least hardware complexity, which makes it a promising candidate for integration into AR systems [12]. One type of AR display device is the optical see-through (OST) display, which allows observers to directly perceive the real physical world through a transparent image combiner. The OST display is frequently used in transparent imaging applications such as near-eye displays, see-through glasses and front windshields. The OST display can be realized using freeform optics [13,14], waveguide schemes [15–17], microstructures [18,19] and holographic optical elements [20–22]. The other type of AR display is the video see-through (VST) display [23–25], which digitally merges computer-generated virtual information with camera-captured real-world scenes and is well suited to handheld and mobile devices.

Little research has addressed reflection-type AR 3D display, in which the real scene is viewed through a reflective imaging device. The reflection-type AR display is suitable for applications such as stomatology, vehicle-mounted side-view mirrors, fitting-room mirrors, and so on. In oral surgery especially, an oral mirror is commonly used to reflect what is inside the oral cavity, but the limited space inside the cavity often makes it difficult for dentists to obtain clear views. If the oral mirror had an AR function that combined virtual 3D images, such as the tooth roots and blood vessels inside the gingiva, with the real teeth, oral surgeries could be safer. Hence, a very compact reflective AR device is required to replace the common oral mirror.

To the best of our knowledge, our previous work in Ref. [26] is the first demonstration of reflection-type AR 3D display. In that work, the 3D image was reconstructed through the pinholes of a mirror-based pinhole array (MBPA), and the real object was reflected by the mirror area of the MBPA. However, the reflected image was not good enough because the pinholes intercepted part of the light from the real object. In addition, the resolution and brightness of the reconstructed 3D image were greatly limited by the principle of the pinhole-type II display [27–29]. There is an inherent trade-off between pinhole size and brightness in pinhole-type II display, and the low resolution of the 3D image remains an urgent problem to be solved [30,31].

In this paper, a high-performance reflection-type AR 3D display is proposed. It achieves both high-definition 3D display without light loss and flawless reflective imaging of the real scene. Moreover, the proposed system needs only a reflective polarizer (RP) in addition to a conventional lens-type II display unit, which ensures a very compact form factor. The proposed system is therefore a promising aid for oral surgeries.

2. Structure and principle of the proposed reflection-type AR 3D display

The schematic diagram of the proposed reflection-type AR display is shown in Fig. 1(a). It mainly consists of a 2D display screen with an absorbing polarizer (AP), a lens array and an RP. The 2D display screen and the lens array together form a conventional lens-type II display unit. The AP, which transmits one polarization and absorbs the orthogonal one, is sandwiched in the lens-type II display. The RP, which functions as a reflective image combiner, is placed at the front of the system. The transmission axis of the RP is aligned with that of the AP. The RP is transparent when the polarization direction of the incident light is parallel to its transmission axis, as the green arrows in Fig. 1(b) show, and reflective when the polarization direction is orthogonal to the transmission axis, as the yellow arrows in Fig. 1(b) show.

Fig. 1. (a) Schematic diagram of the proposed reflection-type AR 3D display, and (b) optical properties of the RP.

Figure 2 shows the light propagation paths in the proposed reflection-type AR display. The light emitted from the 2D display screen passes through the AP and becomes polarized along the transmission axis of the AP. This polarized light keeps its polarization direction when passing through the lens array. Since the transmission axis of the RP is aligned with that of the AP, the polarized light is transmitted directly through the RP. Owing to the modulation of the lens array, the transmitted light forms a virtual 3D image, as the green rays in Fig. 2(a) show. The light from the real scene is divided into two parts. One part, whose polarization direction is orthogonal to the transmission axis of the RP, is reflected by the RP; in this case, the RP images the real scene on the other side of the RP, as the yellow rays in Fig. 2(b) show. The other part, whose polarization direction is parallel to the transmission axis of the RP, is transmitted into the display system and cannot be seen by the viewer. Because of this polarization selectivity, the virtual 3D image is transmitted through the RP while the real scene is reflected by it. As a result, a viewer in front of the display system sees the virtual 3D image and the real scene simultaneously.
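The polarization bookkeeping above can be sketched numerically. The snippet below is our own minimal illustration (not from the paper) of an ideal RP obeying Malus's law: light polarized along the transmission axis is fully transmitted, the orthogonal component is reflected, and unpolarized ambient light splits roughly half and half.

```python
import numpy as np

def rp_split(intensity, pol_angle_deg, axis_angle_deg=0.0):
    """Ideal reflective polarizer: the component parallel to the
    transmission axis is transmitted (Malus's law); the orthogonal
    component is reflected toward the viewer as the mirror image."""
    theta = np.deg2rad(pol_angle_deg - axis_angle_deg)
    transmitted = intensity * np.cos(theta) ** 2
    reflected = intensity * np.sin(theta) ** 2
    return transmitted, reflected

# Display light is pre-polarized by the AP along the RP axis: all transmitted.
t_disp, r_disp = rp_split(1.0, pol_angle_deg=0.0)         # (1.0, 0.0)

# Unpolarized light from the real scene: average over polarization angles,
# so about half the intensity is reflected into the mirror image.
angles = np.linspace(0.0, 179.0, 180)
r_scene = np.mean([rp_split(1.0, a)[1] for a in angles])  # ~0.5
```

This 50% reflection of unpolarized scene light is exactly the factor 1/2 that appears later in Eq. (4).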

Fig. 2. Light propagation path of the proposed reflection-type AR 3D display.

Considering the light distribution of the virtual 3D image, the light rays emitted by the pixels of each elemental image are modulated by the corresponding lens. The light distribution of the ideal reconstructed light field can be expressed as [32]

$$I_i = \iint\limits_{x,y} P(x,y,u,v)\,\mathrm{d}x\,\mathrm{d}y$$
where (x, y) and (u, v) denote the position and direction of a light ray, respectively.
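As a toy illustration (our own sketch, not from the paper), Eq. (1) can be evaluated numerically by summing the plenoptic function P over a discretized lens aperture for one fixed viewing direction (u, v):

```python
import numpy as np

# Discretize a 1 mm x 1 mm lens aperture with a simple Riemann sum.
dx = dy = 0.01                   # sample spacing in mm
x = np.arange(0.0, 1.0, dx)      # 100 samples along x
y = np.arange(0.0, 1.0, dy)      # 100 samples along y

# Uniform radiance P(x, y, u, v) = 1 at the chosen direction (u, v).
P = np.ones((x.size, y.size))

# Eq. (1): I_i is the double integral of P over the aperture,
# approximated here by sum(P) * dx * dy.
I_i = P.sum() * dx * dy          # -> 1.0 for this uniform toy field
```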

Considering the light distribution of the reflected image, the light rays radiated by a 3D object can be calculated according to Lambert's law [33]

$$I_d = I_L k_d \cos\theta = I_L k_d (\mathbf{n} \cdot \mathbf{L})$$

where Id is the light distribution of the object, IL is the intensity of the light source, kd is the reflection coefficient of the object surface, ranging from 0 to 1, and $\theta$ is the angle between the incident ray (L) and the surface normal (n).
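A minimal numerical check of Eq. (2) follows (our own sketch; the vectors and constants are illustrative, not values from the paper):

```python
import numpy as np

def lambert_diffuse(I_L, k_d, n, L):
    """Eq. (2): I_d = I_L * k_d * cos(theta) = I_L * k_d * (n . L),
    with n and L normalized and back-facing surfaces clamped to zero."""
    n = np.asarray(n, dtype=float)
    L = np.asarray(L, dtype=float)
    n /= np.linalg.norm(n)
    L /= np.linalg.norm(L)
    return I_L * k_d * max(float(np.dot(n, L)), 0.0)

# Source at 60 degrees from the surface normal: cos(60 deg) = 0.5,
# so I_d = 100 * 0.8 * 0.5 = 40.
I_d = lambert_diffuse(I_L=100.0, k_d=0.8,
                      n=[0.0, 0.0, 1.0],
                      L=[np.sin(np.pi / 3), 0.0, np.cos(np.pi / 3)])
```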

The total light distribution of the reflected image can be expressed as

$$I_D = \int I_d \,\mathrm{d}\theta$$

Because of the polarization selectivity, the light reflected by the RP carries half of the incident intensity. The spatial distribution of light reflected from a smooth surface is usually modeled by a power of the cosine function and can be calculated as

$$I_r = \frac{1}{2} I_D k_s \cos^m \theta$$
where Ir is the light distribution of the reflected light received by the observer, ks is the reflection coefficient related to the material properties and the incident wavelength, and m is the convergence index of the reflected light, related to the smoothness of the object surface.

Therefore, the light distribution of the proposed AR 3D light field can be expressed as

$$I_m = I_i + I_r = \iint\limits_{x,y} P(x,y,u,v)\,\mathrm{d}x\,\mathrm{d}y + \frac{1}{2} k_s \cos^m\theta \int I_L k_d \cos\theta \,\mathrm{d}\theta$$
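Equations (4) and (5) can be sketched together as follows (our own illustration; k_s, m, and the intensities are arbitrary example values, not measurements):

```python
import numpy as np

def reflected_intensity(I_D, k_s, m, theta):
    """Eq. (4): the RP reflects half of the incident real-scene light,
    shaped by a cos^m lobe that models smooth-surface reflection."""
    return 0.5 * I_D * k_s * np.cos(theta) ** m

def combined_intensity(I_i, I_D, k_s, m, theta):
    """Eq. (5): the viewer sees the reconstructed light field I_i plus
    the reflected real-scene term I_r."""
    return I_i + reflected_intensity(I_D, k_s, m, theta)

# On-axis viewing (theta = 0), where the cosine lobe peaks:
# I_m = 10 + 0.5 * 40 * 0.9 * 1 = 28.
I_m = combined_intensity(I_i=10.0, I_D=40.0, k_s=0.9, m=8, theta=0.0)
```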

3. Experimental results

In our experiment, we built a proof-of-concept prototype of the proposed reflection-type AR 3D display, shown in Fig. 3. The 2D display screen is a liquid crystal display (Sony XZ2 Premium) that already includes an AP. The screen is 5.8 inches with a resolution of 3840 × 2160 pixels and a pixel pitch of 0.033 mm. The lens array, with a lens pitch of 1 mm and a focal length of 3.3 mm, is placed 2.4 mm away from the 2D display screen, so that a virtual-mode II display unit is realized. According to the Gaussian imaging formula, the central depth plane (CDP) of the virtual-mode II display unit is set 8.8 mm behind the 2D display screen. If the imaging depth of the real object coincides with the CDP, the reflected image of the real object can work with the virtual 3D image to achieve the AR 3D display effect. The RP, with a thickness of 80 μm, is attached to the flat substrate of the lens array. The detailed specifications of the reflection-type AR 3D display prototype are listed in Table 1. Figure 4 shows the transmittance and reflectance curves of the RP over the visible range. The transmittance of the transmitted light and the reflectance of the reflected light are up to 88% and 90%, respectively.
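The quoted CDP position follows from the Gaussian imaging formula. The sketch below (our own, using the prototype's gap g = 2.4 mm and focal length f = 3.3 mm) recovers the 8.8 mm image distance; since g < f, the image distance comes out negative, i.e. the image is virtual and lies on the screen side of the lens array.

```python
def virtual_image_distance(g_mm, f_mm):
    """Gaussian imaging formula 1/v = 1/f - 1/g; for g < f the result
    is negative, indicating a virtual image behind the lens array."""
    return 1.0 / (1.0 / f_mm - 1.0 / g_mm)

v = virtual_image_distance(g_mm=2.4, f_mm=3.3)   # about -8.8 -> virtual image
cdp_mm = abs(v)                                  # 8.8 mm image distance
```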

Fig. 3. Experimental prototype of the reflection-type AR 3D display.

Fig. 4. The transmittance (a) and reflectance curves (b) of the RP within the visible light range. Tt and Tr are the transmittance of transmitted light and reflected light, respectively. Rt and Rr are the reflectance of transmitted light and reflected light.

Table 1. Specifications of the developed reflection-type AR 3D display prototype.

Figure 5 shows different perspectives of the combined AR 3D content, consisting of the reconstructed virtual tooth and tooth roots (green) and the mirror image of the real teeth and gingiva (white and pink). As the observer's viewpoint moves from left to right, the two virtual tooth roots turn into four, and the virtual tooth also presents obvious parallax, which confirms the successful 3D reconstruction, as shown in Figs. 5(a) and 5(b). At the same time, the mirror images of the real teeth and gingiva are flawless.

Fig. 5. Different perspectives of the combined images presented by the prototype. (a) Left view and (b) right view.

We carried out three more experiments to present the applications of the prototype as a vehicle-mounted side-view mirror, a zoo-exhibition mirror and a fitting-room mirror, and compared the results with the previous system of Ref. [26]. As shown in Fig. 6, the real objects are a “white car”, a “leaf” and a “human head”, whereas the virtual contents are a “road sign”, a “giraffe” and a “dress”, respectively. Comparing the experimental results in Figs. 6(d)–6(f) and Figs. 6(g)–6(i), the proposed system is greatly superior to the previous system in terms of brightness and resolution. To assess the resolution improvement, a 1951 USAF resolution test chart is used as the virtual 3D image. Figure 7 shows the reconstructed result on the CDP. The resolution of the proposed reflection-type AR system reaches 7.13 lp/mm, while that of the previous system is 1.78 lp/mm; the resolution is thus improved by a factor of four.

Fig. 6. The experiment results in different application scenes. (a)-(c) Elemental image array (EIA), AR combined display effects of the (d)-(f) proposed system and (g)-(i) previous system.

Fig. 7. 3D images of the resolution test target reconstructed by (a) the proposed system and (b) the previous system.

The proposed reflection-type AR display can be made very thin. For example, the 2D display screen could be as thin as 0.01 mm, like flexible OLED panels [34], and the AP and RP are only dozens of microns thick. Therefore, the total thickness of the system is mainly determined by the thickness of the lens array, which can be less than 1 mm, and the gap g between the lens array and the 2D display screen. The gap g can be set so that the CDP depth for a specific application coincides with the depth of the reflected image. Taking oral surgery as an example, the thickness of an oral mirror is about 3.5 mm [35], while the total thickness of our prototype is less than 3.4 mm; the proposed system can therefore be applied to oral surgery. A vision-based tracking method can be adopted to realize the registration of AR content and to advance this system toward practical application [36,37].
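A rough thickness budget illustrates the compactness argument above (our own sketch; the OLED and AP values are representative assumptions rather than measurements, while the gap and RP values come from the prototype):

```python
# Thickness budget in mm for the proposed stack. The OLED and AP entries
# are assumed representative figures, not values measured in the paper.
components_mm = {
    "2D display screen (flexible OLED)": 0.01,  # from Ref. [34]
    "absorbing polarizer (AP)": 0.05,           # "dozens of microns" (assumed)
    "gap g (sets the CDP depth)": 2.4,          # prototype value
    "lens array": 1.0,                          # "less than 1 mm" upper bound
    "reflective polarizer (RP)": 0.08,          # 80 um, prototype value
}
total_mm = sum(components_mm.values())          # roughly 3.5 mm
```

The dominant terms are the gap g and the lens array, which is why the text attributes the overall thickness to those two components.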

4. Conclusions

This paper presents a high-performance reflection-type AR 3D display. Through the polarization selectivity of the RP, the proposed system combines the 3D image reconstructed by the II display unit with the reflected image of a real scene. The proposed system offers a very compact form factor as well as high definition, high brightness and flawless reflection. The experimental results indicate that the proposed system delivers a very good AR 3D effect. Compared with the previous system, the image quality of the reflected images and the resolution and brightness of the 3D images are significantly improved. The proposed system has extensive application prospects in stomatology and vehicle display.

Funding

National Natural Science Foundation of China (61775151); National Key Research and Development Program of China (2017YFB1002900); Innovative Spark Project of Sichuan University (2018SCUH0003).

Disclosures

The authors declare no conflicts of interest.

References

1. R. Azuma, Y. Baillot, R. Behringer, S. Feiner, S. Julier, and B. MacIntyre, “Recent advances in augmented reality,” IEEE Comput. Graph. Appl. 21(6), 34–47 (2001). [CrossRef]

2. F. Zhou, H. B. L. Duh, and M. Billinghurst, “Trends in augmented reality tracking, interaction and display: a review of ten years of ISMAR,” Proc. of 7th IEEE/ACM International Symposium on Mixed and Augmented Reality (2008), pp. 193–202.

3. G. Li, D. Lee, Y. Jeong, J. Cho, and B. Lee, “Holographic display for see-through augmented reality using mirror-lens holographic optical element,” Opt. Lett. 41(11), 2486–2489 (2016). [CrossRef]  

4. T. S. Wesselius, J. W. Meulstee, G. Luijten, T. Xi, T. J. J. Maal, and D. J. Ulrich, “Holographic augmented reality for DIEP Flap Harvest,” Plast. Reconstr. Surg. 147(1), 25e–29e (2021). [CrossRef]  

5. K. Rathinavel, H. Wang, A. Blate, and H. Fuchs, “An extended depth-at-field volumetric near-eye augmented reality display,” IEEE Trans. Vis. Comput. Graph. 24(11), 2857–2866 (2018). [CrossRef]  

6. K. Suzuki, Y. Fukano, and H. Oku, “1000-volume/s high-speed volumetric display for high-speed HMD,” Opt. Express 28(20), 29455–29468 (2020). [CrossRef]  

7. Y. Takaki and N. Nago, “Multi-projection of lenticular displays to construct a 256-view super multi-view display,” Opt. Express 18(9), 8824–8835 (2010). [CrossRef]  

8. Y. Takaki, K. Tanaka, and J. Nakamura, “Super multi-view display with a lower resolution flat-panel display,” Opt. Express 19(5), 4129–4139 (2011). [CrossRef]  

9. X. Shen and B. Javidi, “Large depth of focus dynamic micro integral imaging for optical see-through augmented reality display using a focus-tunable lens,” Appl. Opt. 57(7), B184–B189 (2018). [CrossRef]  

10. J. Wang, H. Suenaga, H. G. Liao, K. Hoshi, L. J. Yang, E. Kobayashi, and I. Sakuma, “Real-time computer-generated integral imaging and 3D image calibration for augmented reality surgical navigation,” Comput. Med. Imaging Graph. 40, 147–159 (2015). [CrossRef]

11. B. Javidi, A. Carnicer, J. Arai, T. Fujii, H. Hua, H. Liao, M. Martínez-Corral, F. Pla, A. Stern, L. Waller, Q. H. Wang, G. Wetzstein, M. Yamaguchi, and H. Yamamoto, “Roadmap on 3D integral imaging: sensing, processing, and display,” Opt. Express 28(22), 32266–32293 (2020). [CrossRef]  

12. H. Hua, “Enabling focus cues in head-mounted displays,” Proc. IEEE 105(5), 805–824 (2017). [CrossRef]  

13. S. Yamazaki, K. Inoguchi, Y. Saito, H. Morishima, and N. Taniguchi, “Thin widefield-of-view HMD with freeform- surface prism and applications,” Proc. SPIE 3639, 453–462 (1999). [CrossRef]  

14. H. Huang and H. Hua, “High-performance integral-imaging-based light field augmented reality display using freeform optics,” Opt. Express 26(13), 17578–17590 (2018). [CrossRef]  

15. R. Shi, J. Liu, H. Zhao, Z. Wu, Y. Liu, Y. Hu, Y. Chen, J. Xie, and Y. Wang, “Chromatic dispersion correction in planar waveguide using one-layer volume holograms based on three-step exposure,” Appl. Opt. 51(20), 4703–4708 (2012). [CrossRef]  

16. J. Yang, P. Twardowski, P. Gérard, and J. Fontaine, “Design of a large field-of-view see-through near to eye display with two geometrical waveguides,” Opt. Lett. 41(23), 5426–5429 (2016). [CrossRef]  

17. C. M. Bigler, P. A. Blanche, and K. Sarma, “Holographic waveguide heads-up display for longitudinal image magnification and pupil expansion,” Appl. Opt. 57(9), 2007–2013 (2018). [CrossRef]  

18. K. Sarayeddine and K. Mirza, “Key challenges to affordable see-through wearable displays: the missing link for mobile AR mass deployment,” Proc. SPIE 8720, 87200D (2013). [CrossRef]  

19. K. Sarayeddline, K. Mirza, P. Benoit, and X. Hugel, “Monolithic light guide optics enabling new user experience for see-through AR glasses,” Proc. SPIE 9202, 92020E (2014). [CrossRef]  

20. J. Yeom, J. Jeong, C. Jang, K. Hong, S. G. Park, and B. Lee, “Reflection-type integral imaging system using a diffuser holographic optical element,” Opt. Express 22(24), 29617–29626 (2014). [CrossRef]  

21. M. H. Choi, Y. G. Ju, and J. H. Park, “Holographic near-eye display with continuously expanded eyebox using two-dimensional replication and angular spectrum wrapping,” Opt. Express 28(1), 533–547 (2020). [CrossRef]  

22. H. Deng, C. Chen, M. Y. He, J. J. Li, H. L. Zhang, and Q. H. Wang, “High-resolution augmented reality 3D display with use of a lenticular lens array holographic optical element,” J. Opt. Soc. Am. A 36(4), 588–593 (2019). [CrossRef]  

23. J. P. Rolland and H. Fuchs, “Optical versus video see-through head-mounted displays in medical visualization,” Presence-Teleop. Virt. 9(3), 287–309 (2000). [CrossRef]  

24. F. Cutolo, P. D. Parchi, and V. Ferrari, “Video see through AR headmounted display for medical procedures,” In IEEE International Symposium on Mixed and Augmented Reality (2014), pp. 393–396.

25. C. H. Hsieh and J. D. Lee, “Markerless augmented reality via stereo video see-through head-mounted display device,” Math. Probl. Eng. 2015, 1–13 (2015). [CrossRef]  

26. Q. Li, H. Deng, S. L. Pang, W. H. Jiang, and Q. H. Wang, “A Reflective Augmented Reality Integral Imaging 3D Display by Using a Mirror-Based Pinhole Array,” Appl. Sci. 9(15), 3124 (2019). [CrossRef]  

27. C. B. Burckhardt, “Optimum parameters and resolution limitation of integral photography,” J. Opt. Soc. Am. 58(1), 71–76 (1968). [CrossRef]  

28. T. Okoshi, “Optimum design and depth resolution of lens-sheet and projection-type three-dimensional displays,” Appl. Opt. 10(10), 2284–2291 (1971). [CrossRef]  

29. H. Hoshino, F. Okano, H. Isono, and I. Yuyama, “Analysis of resolution limitation of integral photography,” J. Opt. Soc. Am. A 15(8), 2059–2065 (1998). [CrossRef]  

30. J. H. Jung, S. G. Park, Y. Kim, and B. Lee, “Integral imaging using a color filter pinhole array on a display panel,” Opt. Express 20(17), 18744–18756 (2012). [CrossRef]  

31. S. Choi, Y. Takashima, and S. W. Min, “Improvement of fill factor in pinhole-type integral imaging display using a retroreflector,” Opt. Express 25(26), 33078–33087 (2017). [CrossRef]  

32. M. Levoy and P. Hanrahan, “Light field rendering,” in Proc. SIGGRAPH (1996), pp. 31–42.

33. M. Oren and S. K. Nayar, “Seeing beyond Lambert's law,” in Proceedings of the European Conference on Computer Vision (1994), pp. 269–280.

34. https://www.royole.com/flexible-display.

35. https://www.treedental.com/oral-mirror-with-led-light-tr-omm02.html

36. A. State, G. Hirota, D. T. Chen, W. E. Garrett, and M. Livingston, “Superior Augmented Reality Registration by Integrating Landmark Tracking and Magnetic Tracking,” In Proceedings of the 23rd annual conference on Computer graphics and interactive techniques (1996), pp. 429–438.

37. J. Carmigniani, B. Furht, M. Anisetti, P. Ceravolo, E. Damiani, and M. Ivkovic, “Augmented reality technologies, systems and applications,” Multimed. Tools Appl. 51(1), 341–377 (2011). [CrossRef]  
