
360-degree mixed reality volumetric display using an asymmetric diffusive holographic optical element

Open Access

Abstract

The volumetric display technique has the great advantage of displaying realistic three-dimensional content with a 360-degree viewing angle. However, most volumetric displays cannot provide mixed reality because the screens inside them obstruct the external scene. We design a 360-degree mixed-reality volumetric display using an asymmetric diffusive holographic optical element (ADHOE). The ADHOE has wavelength selectivity: it diffuses only light at the specific wavelengths used for the virtual object, so the virtual object and the real scene can be combined optically. In addition, the ADHOE has different vertical and horizontal diffusing angles, which makes it suitable for horizontal-parallax-only applications. In our system, the parallax images are generated by a DMD and projected sequentially onto the ADHOE. Because the ADHOE is shaped as a curved surface slanted with respect to the optical axis, annoying color dispersion is observed due to the mismatch between the diffraction peak points of the two wavelengths. To solve this problem, a carrier frequency is applied to the green elemental images and a proper Fourier filter cuts off the unwanted diffraction peak points. A Fourier transform with 2f optics is built to record the ADHOE, where the angular spectral bandwidth is determined by adjusting the width of the incident object light. A 360-degree see-through display with the ADHOE is implemented, and the feasibility of mixed reality is verified successfully.

© 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Volumetric displays have attracted much interest owing to their great potential to display realistic three-dimensional (3D) content with a 360-degree viewing angle [1–5]. Since volumetric displays generate the voxels composing the 3D content in their view volume, users perceive that virtual objects exist in front of them. Recently, mixed reality has emerged as a prospective technology for users to experience extended reality spanning both the real and virtual worlds. For this purpose, it is important to provide high-quality virtual objects to be mixed with the real world. From this point of view, the volumetric display is expected to be one of the most promising candidates.

In general, there are two methods for generating voxels in volumetric displays. One is to generate scattering points as voxels on a sweeping screen that moves through the image space [6–8], so the view volume is limited by the movement of the screen. The other is to generate voxels apart from an engineered diffusing screen by using multiple projection optics [9–13]. An engineered diffusing screen is one whose diffusing direction and diffusing-angle width are engineered according to the position of the voxel and the viewing zone of the display. When voxels are positioned apart from the engineered diffusing screen, the view volume is not limited to the space swept by the screen and can be larger than it. Therefore, it is possible to place real objects inside the view volume, and there is room to merge the real world and virtual objects together. Some studies have displayed virtual objects beside real objects in tabletop-type volumetric displays. S. Yoshida proposed a tabletop volumetric display composed of 288 projectors and a conical rear-projection screen that forms the view volume on the table [12]. Here, the conical screen is placed under the table and diffuses the light anisotropically. Meanwhile, M. Makiguchi proposed a tabletop volumetric display using a reflective concave screen and 60 projectors [13]. The view volume is also formed on the table. In both systems, it is possible to observe the real objects placed on the table together with the virtual objects. However, the view is obscured by the tabletop, so real objects behind the table cannot be seen. In order to mix the real and virtual worlds naturally, it is necessary to show the virtual object within the real world; that is, the real scene behind the virtual objects is also expected to be visible.

The cylinder is regarded as an appropriate form factor for mixed reality applications. It offers obvious advantages for displaying virtual objects against the real background scene because, in most cylindrical volumetric displays, the viewing zone has the same height as the view volume. T. Yendo proposed a cylindrical volumetric display that provides a 360-degree view by rotating LEDs while an outer slit array rotates in the opposite direction [14]. This method has the benefit of providing high-resolution three-dimensional images. G. Choi proposed a cylindrical volumetric display based on a time-sequential method, composed of a panoramic high-speed projector and a rotating cylindrical screen with three openings [15]. The user watches continuous parallax views through the openings, generated by changing the images on the screen. However, these displays are unsuitable for mixed reality applications because the real background scene is obstructed by the diffusing screen. An optical combining technique is required that diffuses the light for the virtual objects while remaining transparent to the light from the real background scene. A holographic optical element (HOE) is suitable for this optical combining technique because its grating works selectively depending on the wavelength of the incident light [16].

In this paper, we implement a cylindrical volumetric display using an asymmetric diffusive holographic optical element (ADHOE), where the ADHOE acts as an anisotropic rear-projection screen. In particular, this ADHOE screen is not mechanically moved, and the voxels are displayed apart from the screen by a time-sequential projection method. Also, due to the wavelength selectivity, virtual objects and the real background scene can be watched together. The ADHOE is fabricated by analyzing the required diffusing angles in consideration of the viewing zone and the crosstalk between adjacent views. A mixed reality display without color dispersion is realized by applying a proper carrier frequency to the elemental images, which are then projected onto the ADHOE by a high-speed projector and a scanner.

2. View volume in cylindrical volumetric display

Most volumetric displays are implemented based on multi-view projection. The multi-view projection method generates parallax images on an anisotropic screen with different vertical and horizontal scattering angles [9–20]. Figure 1 shows the view volume of a cylindrical volumetric display. The projectors are placed on a circle, heading toward the rotation axis, and display the parallax images appropriate for each rotation angle on the ADHOE, as shown in Fig. 1(a). Figure 1(b) shows the view volume of the proposed system, where the FOVs of the projectors overlap. In a volumetric display, a voxel is represented by a combination of optical rays from the projectors. In this system, the projectors are arranged in a circle, so the view volume also has a cylindrical shape. The radius of the cylinder is determined by

$${\rho _{vol}} = R\sin \left( {\frac{{{\theta_{FOV}}}}{2}} \right), $$
where ${\theta _{FOV}}$ is the FOV of each projector and R is the radius of the circle on which the projectors are arranged. In our system, R and ${\theta _{FOV}}$ are set to 90 mm and 19 degrees, so the radius of the view volume is 14.9 mm.
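As a quick sanity check of Eq. (1), the following minimal Python sketch (variable names are ours) reproduces this value:

import numpy as np

# System parameters quoted in the text
R = 90.0                      # mm, radius of the projector circle
theta_fov = np.deg2rad(19.0)  # rad, FOV of each virtual projector

# Eq. (1): radius of the cylindrical view volume
rho_vol = R * np.sin(theta_fov / 2)
print(f"view-volume radius = {rho_vol:.1f} mm")  # prints ~14.9 mm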

Fig. 1. View volume in cylindrical volumetric display. (a) Time-sequentially rotating projection toward rotation axis and (b) resultant view volume.

In this system, the view volume is determined by the FOV of each projector. The viewer does not see the entire FOV of one projector at once, but only a portion of it. In other words, the image observed from a given viewpoint is composed of parts of the FOVs of several projectors. Figure 2 shows the rays entering the observer's eye when two projectors are placed at an interval of $\Delta \phi $. The bold green and blue lines indicate the incident rays from each projector, while the dotted lines are rays diffused by the screen. If the diffusing angle of the screen is too small, the observer sees dark regions between the two projectors; if it is too large, the images from the two projectors overlap each other and crosstalk occurs. Therefore, it is important to properly adjust the diffusing angle of the screen [21]. The diffusing angle at which the two views generated by the two projectors barely overlap is obtained as

$${\omega _h} = \frac{R}{{R + r}}\Delta \phi . $$

Here, R is the distance between the rotation axis and the time-multiplexed projector, and r is the distance between the rotation axis and the HOE. In this paper, 512 virtual projectors are used for the proposed volumetric display, so $\Delta \phi $ is 0.7 degrees. In addition, R and r are set to 90 mm and 60 mm, respectively. Therefore, the required horizontal diffusing angle ${\omega _h}$ becomes 0.4 degrees.
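These numbers can be checked with a short Python sketch of Eq. (2); deriving the 0.7-degree pitch as 360 degrees divided by the 512 virtual projectors is our inference, consistent with the values quoted above:

N_proj = 512                 # virtual projectors per revolution
delta_phi = 360.0 / N_proj   # deg, angular pitch between projectors (~0.7 deg)
R, r = 90.0, 60.0            # mm, rotation axis to projector and to HOE, respectively

# Eq. (2): horizontal diffusing angle that just removes the dark region
omega_h = R / (R + r) * delta_phi
print(f"delta_phi = {delta_phi:.2f} deg, omega_h = {omega_h:.2f} deg")  # ~0.70 and ~0.42 deg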

Fig. 2. Optimal condition for horizontal diffusing angle where dark region is removed without crosstalk.

It is difficult to mechanically rotate the projector along a circle. In our system, a scanning mirror is placed at the top of the cylinder to fold the optical path, as shown in Fig. 3. Here, the central axis of the cylinder coincides with the optical path. By rotating the scanning mirror, a time-sequentially rotating projector is implemented. The incident angle on the screen is given by

$${\theta _{inc}} = \arctan \left( {\frac{{H - z}}{{R + r}}} \right). $$

Since the virtual projector is located above the cylinder, the light incident on the screen from the projector is directed toward the ground. So, it is necessary to fold the light toward the viewer's position and to diffuse it vertically. This directional diffusing characteristic can be realized with a photopolymer. If the diffusing angle is too small, the upper and lower parts of the image may be vignetted when the observer is close. On the other hand, if the diffusing angle is too large, most of the optical power is wasted as stray light. For a given viewer position, the required vertical diffusing angle is given by

$${\omega _v} = 2{\tan ^{ - 1}}\left[ {\frac{h}{{2({l - r} )}}} \right].$$

Here, l is the viewing distance. In this paper, the vertical diffusing angle ${\omega _v}$ is 18.9 degrees when l and h are set to 300 mm and 80 mm, respectively.
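A minimal sketch of Eq. (4) with the values quoted above; interpreting h as the 80 mm vertical extent of the screen/viewing zone (which matches the 80 mm ADHOE height used later) is our assumption:

import math

l = 300.0   # mm, viewing distance
r = 60.0    # mm, rotation axis to ADHOE
h = 80.0    # mm, vertical extent quoted in the text (assumed screen/viewing-zone height)

# Eq. (4): vertical diffusing angle for the given viewer position
omega_v = 2 * math.atan(h / (2 * (l - r)))
print(f"omega_v = {math.degrees(omega_v):.1f} deg")  # ~18.9 deg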

Fig. 3. Front-view of the system and vertical diffusing angle to determine viewing zone.

3. Recording cylindrical holographic diffusive combiner

The HOE has the advantage of selectively diffracting light in the desired direction for only a specific wavelength, and it is therefore suitable as a transparent screen for mixed reality [16]. In our system, an ADHOE made of photopolymer is attached to a transparent pipe, and this cylindrical screen diffracts the incident beam with anisotropic diffusing angles. Since the screen has 360-degree cylindrical symmetry with a uniform horizontal diffusing angle, it is difficult to record the ADHOE directly in its cylindrical shape. Therefore, an optically equivalent condition is devised so that the ADHOE has the desired characteristics even when it is fabricated flat. Figure 4 shows the two optically equivalent conditions in cylindrical and flat shapes. As shown in Fig. 4(a), in the display system, the light from the projector positioned above the screen passes through the ADHOE and is diffused asymmetrically. This situation can be understood as point sources placed on a circle at the projector position, which function as the reference light for recording the HOE. If the recording setup is changed to a flat shape instead of a cylindrical shape, the point sources are likewise located on a straight line, as shown in Fig. 4(b). In the recording setup, a line source is placed above the planar photopolymer and illuminates it obliquely.

Fig. 4. Two optically equivalent setups for recording HOEs in different shapes. (a) HOE in cylinder shape and (b) HOE in flat shape.

The optical rays in Fig. 4(b) function as the reference beam. It is necessary to control the angular spectrum bandwidth of the object beam so that the ADHOE has the required anisotropic diffusing angles. Figure 5 shows the optical layout controlling the angular spectrum bandwidth of the object beam. In order to record an HOE with anisotropic diffusion characteristics, a Fourier transform with 2f optics is generally used, where the desired angular spectral bandwidth is determined by adjusting the width of the incident object light. Figures 5(a) and (b) show the top and side views of the experimental setup, respectively. In the former, the vertical diffusing angle is controlled by the opening of the slit in front of the scatterer; in the latter, the horizontal diffusing angle is determined. In this setup, the reference light is in the form of a line source. For recording the vertical scattering angle, the reference light appears as a diverging point source, as shown in Fig. 5(a); for recording the horizontal scattering angle, it has the shape of a line segment, as shown in Fig. 5(b).

Fig. 5. Optical layout for recording ADHOE with control of diffusing angles in (a) vertical direction and (b) horizontal direction independently.

In the 2f optics, the two optical fields at the scatterer plane $({\xi ,\eta } )$ and the ADHOE plane $({x,y} )$ are related to each other by a Fourier transform [22]. The optical field of the object beam is represented by

$${u_o}({\xi ,\eta } )= \int\!\!\!\int {{U_o}({x,y} )} \exp \left[ {i\frac{{2\pi }}{{\lambda f}}({\xi x + \eta y} )} \right]dxdy.$$

Here, ${U_o}({x,y} )$ is the optical field of the object beam at the ADHOE plane, f is the focal length of the Fourier transform lens, and $\lambda $ is the wavelength of the light source. Under the paraxial approximation, the vertical and horizontal sizes of the incident object light are determined by

$$\Delta \xi = f{\omega _v},$$
$$\Delta \eta = f{\omega _h}.$$

In 2f optics, the maximum space-bandwidth product (SBP) that can be transmitted by the lens is determined by [23,24]

$$SB{P_{lens}} = \frac{{{D^2}}}{{4\lambda f}}.$$

Here, D is the diameter of the lens, which is twice the required space in the input plane. In this paper, the diffusing angle of the ADHOE is anisotropic, and the diffusing angle in the x-direction is larger than that in the y-direction, so the minimum diameter of the lens is determined by the SBP of the ADHOE in the x-direction. In the experimental setup for recording the ADHOE, the diameter of the lens is 250 mm and the focal length is 254 mm. So, the maximum vertical size of the incident object light $\Delta \xi $ and the maximum vertical diffusing angle of the ADHOE ${\omega _v}$ are 125 mm and 28.2 degrees, respectively. The lens has sufficient SBP to record the ADHOE required for our system.
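As an illustration of Eqs. (6)-(8), the sketch below uses the quoted lens (D = 250 mm, f = 254 mm); evaluating the SBP at 520 nm is our choice, made only for illustration:

import math

D = 250.0          # mm, diameter of the Fourier-transform lens
f = 254.0          # mm, focal length of the Fourier-transform lens
lam = 520e-6       # mm, recording wavelength (green), chosen here for illustration

# Eq. (8): maximum space-bandwidth product transmitted by the lens
sbp_lens = D**2 / (4 * lam * f)

# Eq. (6): object-beam height needed for the 17.8-degree vertical diffusing angle
delta_xi = f * math.radians(17.8)

# Largest usable object-beam height and the corresponding vertical diffusing angle
delta_xi_max = D / 2
omega_v_max = math.degrees(delta_xi_max / f)

print(f"SBP of lens ~ {sbp_lens:.2e}")
print(f"delta_xi = {delta_xi:.0f} mm (max {delta_xi_max:.0f} mm, i.e. omega_v up to {omega_v_max:.1f} deg)")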

Figure 6 shows the layout of the recording setup, where laser light sources with wavelengths of 638 nm and 520 nm are used and a dichroic mirror combines the two beams. The light is then divided into a reference arm and an object arm, with the intensity ratio controlled by an attenuator. In the reference arm, a spatial filter eliminates noise from the light, which is then collimated by the subsequent lens. After that, two folding mirrors aim the reference beam toward the photopolymer at the desired angle, and the cylindrical lens CL1 generates the required line source. In the object arm, a spatial filter is also inserted to eliminate noise and forms a point light source. The light is condensed vertically onto the following scattering plate by the cylindrical lens CL2, since light outside the required width is cut off; this cylindrical lens helps lessen the optical power loss in the recording setup. The diffusing angle of the object beam is determined after passing through the 2f Fourier transform lens. The object beam interferes with the reference beam at the photopolymer to create the ADHOE with the designed size and diffusing angles. A mask defines the area to be recorded on the photopolymer at one time. Since the vertical aperture of the mask is too small to record the full size of the ADHOE for the proposed volumetric display, the photopolymer is shifted with a linear stage and partial regions are recorded sequentially. Bayfol HX200 photopolymer is used for the ADHOE, with a recording dosage of 30 ${{mJ} / {c{m^2}}}$. The total exposure densities of the two beams are 705 ${{\mu W} / {c{m^2}}}$ for 638 nm and 988 ${{\mu W} / {c{m^2}}}$ for 520 nm, and the exposure time is 21 seconds.

Fig. 6. (a) Layout and (b) photo of recording setup.

Figure 7 shows the experimental setup used to measure the diffusing angle of the ADHOE and the measured properties. The probe beam is collimated and has the same incident angle as the reference beam. For this measurement, the ADHOEs are recorded under different conditions by changing the width of the incident object light. The radiant intensity is measured while varying the observation angle of the power meter. Figure 7(b) shows the relative radiant intensity measured over the observation angle, where the relative radiant intensity is normalized by the total power of the scattered beam. Here, the diffusing angle of the ADHOE is defined as the full width at half maximum (FWHM) of the relative radiant intensity. Figure 7(c) shows the diffusing angles of ADHOEs recorded under different conditions. The experimental results are in good agreement with the theoretical values.

Fig. 7. (a) Setup to measure the diffusing angle of the ADHOE. (b) Relative radiant intensity and (c) diffusing angles of seven ADHOEs recorded under different vertical sizes of the incident object light, $\Delta \xi $, in the recording setup.

4. Experimental results

A 360-degree see-through display with the ADHOE is implemented, where the ADHOE needs a large vertical diffusing angle to obtain a reasonable viewing-zone size but a small horizontal diffusing angle to remove the dark region without crosstalk. Figure 8 shows the structure and a photo of the implemented system. The system is physically divided into an upper part and a lower part. The upper part consists of the ADHOE attached to an acrylic tube and a scanning mirror. Here, the diffusing angle of the ADHOE is 0.7 degrees × 17.8 degrees and its area is 80 mm × 380 mm. The lower part consists of the laser sources, the DMD, and the projection lens. The system uses a single DMD, which is operated alternately to display color images with red and green light sources. Lasers with wavelengths of 520 nm and 638 nm are used as the light sources. These lasers are coupled into optical fibers and combined with a wavelength division multiplexer. The combined light is collimated and diffracted by the DMD. The image displayed on the DMD is projected onto the ADHOE by a projection lens and a rotating scanning mirror. Because the projection surface is tilted and curved, the projected image suffers keystone error and severe radial distortion, so the distortion is compensated by pre-warping the displayed image. The DMD and the light sources are synchronized with the rotation of the scanning mirror, and the DMD displays different elemental images according to the rotation angle of the scanning mirror. The DMD is therefore required to operate at a high switching rate. Our system displays 512 images per revolution of the scanning mirror, which rotates at 600 rpm, so a switching rate of 5,120 Hz for 2-bit images is required. The DMD used in our system is the V-7001 manufactured by ViALUX, and its maximum switching speed of 9,174 Hz is large enough to display the 2-bit images.
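The required image rate follows directly from these numbers; a one-line check (variable names are ours):

rpm = 600              # rotation speed of the scanning mirror
views_per_rev = 512    # elemental images per revolution

image_rate = views_per_rev * rpm / 60.0
print(f"required 2-bit image rate = {image_rate:.0f} Hz")  # 5120 Hz, below the 9,174 Hz DMD limit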

Fig. 8. (a) Structure and (b) photo of the 360-degree mixed reality volumetric display.

In most cases, the light source for a DMD is incoherent and the diffraction effect is regarded as having only a minor impact. Each micro-mirror of the DMD is individually tilted by +12 degrees in the on state or -12 degrees in the off state, so the incident beam is tilted by 24 degrees in order that the reflected beam aims in the normal direction of the DMD when the pixel is in the on state. In our system, however, lasers are used as the light sources, so the diffraction by the periodic structure of the DMD must be considered [25]. When all pixels are set to the on state, a periodic diffraction pattern appears in the Fourier domain; the reflection from the tilt of the pixels only shapes the envelope of the optical intensity. Therefore, Fourier filtering is very important to avoid color dispersion of the projected image on the ADHOE. Figure 9 shows the Fourier filter in the optical layout and the diffraction pattern in the Fourier domain. The Fourier filter is placed at the aperture stop of the projection lens, which is the Fourier domain of the DMD plane. The lens between the Fourier filter and the DMD functions as the Fourier transform lens, and its focal length is 40 mm, as shown in Fig. 9(b). The incident angle of the illumination is set to 23.3 degrees instead of 24 degrees. In this condition, the diffracted light of a specific order is aligned normal to the DMD plane at the wavelength of 638 nm. The angles of -0.7 degrees and 47.2 degrees indicate the propagation directions of the envelopes when the DMD is in the on state and off state, respectively.

Fig. 9. (a) Fourier filter in optical layout and (b) diffraction light pattern on Fourier domain.

The system expresses colors by alternately switching between the light sources of the two wavelengths. Since the incident beam illuminates the DMD along its diagonal direction, the period of the pixel arrangement becomes $\sqrt 2$ times smaller than the pixel pitch p. So, the diffraction angle of the m-th order is determined as

$${\theta _o} = {\sin ^{ - 1}}\left( { - \sin {\theta_i} + {{\sqrt 2 m\lambda } / p}} \right), $$
where ${\theta _i}$ is the incident angle and $\lambda$ is the wavelength. In our system, the incident angle is set to 23.3 degrees so that the sixth-order diffraction exits normal to the DMD at the wavelength of 638 nm. On the other hand, the seventh-order diffraction at the wavelength of 520 nm exits obliquely at -1.1 degrees.
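The sketch below evaluates Eq. (9) for both wavelengths. The pixel pitch of the DMD is not stated in the text; 13.68 um (the pitch commonly quoted for the DLP7000 chip used in the V-7001) is assumed here, so the result only approximates the reported angles:

import math

p = 13.68e-3                    # mm, assumed DMD pixel pitch (not given in the text)
theta_i = math.radians(23.3)    # incident angle of the illumination

def order_angle(m, wavelength_mm):
    # Eq. (9): exit angle of the m-th order along the DMD diagonal
    return math.degrees(math.asin(-math.sin(theta_i) + math.sqrt(2) * m * wavelength_mm / p))

print(f"638 nm, m = 6: {order_angle(6, 638e-6):+.2f} deg")   # ~0 deg (normal to the DMD)
print(f"520 nm, m = 7: {order_angle(7, 520e-6):+.2f} deg")   # ~-1.1 deg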

Figure 10 shows the simulation of the diffraction pattern of the DMD, where the diffraction orders at 638 nm and 520 nm are colored red and green, respectively. The rectangles are the Nyquist regions, whose sizes differ depending on the wavelength. The difference between the diffraction angles of the sixth order at 638 nm, ${R_{6,6}}$, and the seventh order at 520 nm, ${G_{7,7}}$, is -1.1 degrees, which corresponds to a distance of 0.77 mm in the Fourier domain. Therefore, the two Nyquist regions of the red and green wavelengths are slightly mismatched, as shown in Fig. 10(a). This mismatch results in color dispersion of the projected image on the ADHOE, since the angular distributions of the red and green light are not identical. To solve this color dispersion problem, a carrier frequency is applied to the elemental image uploaded to the DMD. Figure 10(b) shows the simulation result with a carrier frequency of 36.1 lp/mm applied to the green pattern. The diffraction peak of ${G_{7,7}}$ is shifted exactly to the position of the diffraction peak of ${R_{6,6}}$; this shifted peak is denoted ${G^{\prime}_{7,7}}$. The Fourier filter passes both peaks, ${R_{6,6}}$ and ${G^{\prime}_{7,7}}$, for color matching.
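The Fourier-plane offset and the order of magnitude of the required carrier frequency can be estimated from the -1.1-degree mismatch and the 40 mm Fourier-transform focal length. This is only a rough sketch assuming a simple grating relation, lambda * nu = sin(delta_theta); it gives roughly 37 lp/mm, close to but not exactly the 36.1 lp/mm used in the experiment, the residual depending on the exact angles and pixel pitch:

import math

f_ft = 40.0                        # mm, focal length of the Fourier-transform lens
lam_g = 520e-6                     # mm, green wavelength
delta_theta = math.radians(1.1)    # angular offset of G_7,7 from R_6,6

# Lateral offset of the green peak in the Fourier plane (~0.77 mm, as reported)
delta_x = f_ft * math.tan(delta_theta)

# Carrier frequency that steers the green order by delta_theta (assumed grating model)
nu_carrier = math.sin(delta_theta) / lam_g
print(f"peak offset = {delta_x:.2f} mm, carrier ~ {nu_carrier:.0f} lp/mm")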

Fig. 10. Simulation of diffraction pattern on the Fourier domain. Diffraction patterns when the carrier frequency (a) is not applied and (b) is applied.

Figure 11 shows captured images of a checkered cylinder, where the yellow checkered pattern is composed of red and green images at 638 nm and 520 nm, respectively. Color dispersion at the edges of the checkered pattern is observed when the carrier frequency is not applied, as shown in Fig. 11(a). On the other hand, this color dispersion disappears when the carrier frequency is applied to only the green image, as shown in Fig. 11(b).

Fig. 11. Captured images of a checkered cylinder (a) without and (b) with a carrier frequency.

Figure 12 shows the experimental results for the virtual content of a Christmas ornament at different viewing directions: (a) 0 degrees, (b) 90 degrees, (c) 180 degrees, and (d) 270 degrees. Here, the visualization is taken around the system while changing the position of the camera. Since the virtual content is composed of 512 viewpoint images, the 360-degree view is observed naturally with negligible discontinuity. In addition, the surrounding real background scene is seen through the ADHOE.

Fig. 12. Captured images of Christmas ornament with different viewing directions: (a) 0 degrees, (b) 90 degrees, (c) 180 degrees, and (d) 270 degrees (see Visualization 1).

Figure 13 shows the captured images when the camera is focused on the virtual content and on the real background scene. The background scene looks blurred when the camera is focused on the virtual content, and the virtual content becomes blurred when the camera is focused on the background scene. The observer perceives that the virtual object is positioned inside the cylinder. Therefore, it is experimentally confirmed that the virtual content is naturally mixed with the real scene.

Fig. 13. Captured images of Christmas ornament when focused (a) at the virtual contents and (b) at the real background scene (see Visualization 2).

In our display, the diffraction efficiency is not uniform over the entire image. The ADHOE is recorded under the condition that, in the horizontal direction, both the reference light and the object light are perpendicular to the photopolymer. Considering the light forming the image on the ADHOE, the light from the central part of the image has zero horizontal diffraction angle, but the light from the side parts of the image must be diffracted horizontally to reach the viewer. It is well known that the diffraction efficiency is reduced when the incident angle of the reference light in the reconstruction condition differs from that in the recording condition. In our display, the incident angle of the reference light ranges from -14.3 degrees to 14.3 degrees in the horizontal direction, and the ratio of the diffraction efficiency at 14.3 degrees to that at 0 degrees is 0.592. This difference is not trivial, so the brightness of the image is pre-tuned to compensate for the variation in diffraction efficiency.
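As a simple illustration of this pre-compensation, the sketch below builds a per-view brightness gain from the two measured efficiency values; the quadratic roll-off between them is a hypothetical model of ours, not the profile measured by the authors:

# Relative diffraction efficiency: 1.0 at 0 deg, 0.592 at +/-14.3 deg (from the text)
ETA_CENTER, ETA_EDGE, MAX_ANGLE = 1.0, 0.592, 14.3

def brightness_gain(angle_deg):
    # Assumed quadratic interpolation of the efficiency between the two measured points
    t = min(abs(angle_deg) / MAX_ANGLE, 1.0) ** 2
    eta = ETA_CENTER + t * (ETA_EDGE - ETA_CENTER)
    return 1.0 / eta   # scale the elemental-image brightness by this factor

print([round(brightness_gain(a), 2) for a in (-14.3, -7.0, 0.0, 7.0, 14.3)])
# [1.69, 1.11, 1.0, 1.11, 1.69]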

5. Conclusion

In this paper, a new design for a 360-degree mixed reality volumetric display is proposed. The design applies an asymmetric diffusive holographic optical element (ADHOE) to display mixed reality by exploiting the wavelength selectivity of the ADHOE. A high-speed projector and a scanning mirror generate a cylindrical view volume for horizontal parallax with 512 viewpoints per revolution. In this system, the ADHOE needs a large vertical diffusing angle to obtain a reasonable viewing-zone size but a small horizontal diffusing angle to remove the dark region without crosstalk. To fabricate the ADHOE, 2f optics is used, where the angular spectral bandwidth is determined by adjusting the width of the incident object light. We implement the ADHOE with proper diffusing angles of 0.7 degrees in the horizontal direction and 17.8 degrees in the vertical direction. The coherent light sources with two wavelengths and the periodic pixel structure of the DMD cause a mismatch in the diffraction peaks, which results in color dispersion. A carrier frequency is therefore applied to the green image so that the red and green diffraction peaks are matched. A 360-degree see-through display with the ADHOE is implemented and its feasibility is verified successfully. In the future, we plan to modify the scanner with a free-form mirror so that the view volume can be increased.

Funding

Ministry of Science and ICT, South Korea (2020-0-00929).

Acknowledgments

This work was supported by Institute of Information & communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2020-0-00929, Development of Authoring Tool for Digital HOE Hologram Generation Using Optical Simulation).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. M. Gately, Y. Zhai, M. Yeary, E. Petrich, and L. Sawalha, “A three-dimensional swept volume display based on LED arrays,” IEEE J. Displ. Technol. 7(9), 503–514 (2011). [CrossRef]  

2. K. Kumagai, I. Yamaguchi, and Y. Hayasaki, “Three-dimensionally structured voxels for volumetric display,” Opt. Lett. 43(14), 3341–3344 (2018). [CrossRef]  

3. K. Kumagai, S. Miura, and Y. Hayasaki, “Colour volumetric display based on holographic-laser-excited graphics using drawing space separation,” Sci. Rep. 11(1), 22728–22729 (2021). [CrossRef]  

4. D. E. Smalley, E. Nygaard, K. Squire, J. Van Wagoner, J. Rasmussen, S. Gneiting, K. Qaderi, J. Goodsell, W. Rogers, M. Lindsey, K. Costner, A. Monk, M. Pearson, B. Haymore, and J. Peatross, “A photophoretic-trap volumetric display,” Nature 553(7689), 486–490 (2018). [CrossRef]  

5. B. G. Blundell, “Volumetric 3D displays,” in Handbook of Visual Display Technology (Springer, 2012), Vol. 3, pp. 1917–1929.

6. G. E. Favalora, J. Napoli, D. M. Hall, R. K. Dorval, M. G. Giovinco, M. J. Richmond, and W. S. Chun, “100 million-voxel volumetric display,” Proc. SPIE 4712, 300–312 (2002). [CrossRef]  

7. J.-H. Jung, B.-S. Song, and S.-W. Min, “Depth cube display using depth map,” Proc. SPIE 7863, 78630W (2011). [CrossRef]  

8. R. Asahina, T. Nomoto, T. Yoshida, and Y. Watanabe, “Realistic 3D swept-volume display with hidden-surface removal using physical materials,” Proc. IEEE, 113–121 (2021).

9. O. S. Cossairt, J. Napoli, S. L. Hill, R. K. Dorval, and G. E. Favalora, “Occlusion-Capable Multiview Volumetric Three-Dimensional Display,” Appl. Opt. 46(8), 1244–1250 (2007). [CrossRef]  

10. A. Jones, I. McDowall, H. Yamada, M. Bolas, and P. Debevec, “Rendering for an interactive 360° light field display,” ACM Trans. Graph. 26(3), 40–49 (2007). [CrossRef]  

11. X. Xia, Z. Zheng, X. Liu, H. Li, and C. Yan, “Omnidirectional-view three-dimensional display system based on cylindrical selective-diffusing screen,” Appl. Opt. 49(26), 4915–4920 (2010). [CrossRef]  

12. S. Yoshida, “fVisiOn: 360-degree viewable glasses-free tabletop 3D display composed of conical screen and modular projector arrays,” Opt. Express 24(12), 13194–13203 (2016). [CrossRef]  

13. M. Makiguchi, D. Sakamoto, H. Takada, K. Honda, and T. Ono, “Interactive 360-degree glasses-free tabletop 3D display,” in Proc. ACM Sympo. User Interf. Soft. and Technol., 625–637 (2019).

14. T. Yendo, “The Seelinder: Cylindrical 3D display viewable from 360 degrees,” J. Vis. Commun. Image Representation 21(5-6), 586–594 (2010). [CrossRef]  

15. G. Choi, H. Jeon, H. Kim, and J. Hahn, “Horizontal-parallax-only light-field display with cylindrical symmetry,” Proc. SPIE 10556, 4 (2018). [CrossRef]  

16. N. Kim, Y. L. Piao, and H. Y. Wu, “Holographic Optical Elements and Application,” in Holographic Materials and Optical Systems, (InTech, 2017).

17. Y. Takaki, M. Tokoro, and K. Hirabayashi, “Tiled large-screen three-dimensional display consisting of frameless multi-view display modules,” Opt. Express 22(6), 6210–6221 (2014). [CrossRef]  

18. C. Su, Y. Peng, Q. Zhong, H. Li, R. Wang, W. Heidrich, and X. Liu, “Towards VR and AR: 360° Light Field Display with Mid-air Interaction,” Proc. SIGGRAPH ASIA, 1–2 (2016).

19. M. Kawakita, S. Iwasawa, R. Lopez-Gulliver, and N. Inoue, “Glasses-free large-screen three-dimensional display and super multiview camera for highly realistic communication,” Opt. Eng. 57(06), 1 (2018). [CrossRef]  

20. H. Watanabe, N. Okaichi, T. Omura, M. Kano, H. Sasaki, and M. Kawakita, “Aktina Vision: Full-parallax three-dimensional display with 100 million light rays,” Sci. Rep. 9(1), 17688–17689 (2019). [CrossRef]  

21. A. Jones, K. Nagano, J. Liu, J. Busch, X. Yu, M. Bolas, and P. Debevec, “Interpolating vertical parallax for an autostereoscopic three-dimensional projector array,” J. Electron. Imaging 23(1), 011005 (2014). [CrossRef]  

22. J. W. Goodman, Introduction to Fourier Optics (McGraw-Hill, 1968).

23. H. M. Ozaktas and H. Urey, “Space-bandwidth product of conventional Fourier transforming systems,” Opt. Commun. 104(1-3), 29–31 (1993). [CrossRef]  

24. A. W. Lohmann, R. G. Dorsch, D. Mendlovic, C. Ferreira, and Z. Zalevsky, “Space–bandwidth product of optical signals and systems,” J. Opt. Soc. Am. A 13(3), 470–473 (1996). [CrossRef]  

25. M. Kim, S. Lim, G. Choi, Y. Kim, H. Kim, and J. Hahn, “Expanded exit-pupil holographic head-mounted display with high-speed digital micromirror device,” ETRI J. 40(3), 366–375 (2018). [CrossRef]  

Supplementary Material (2)

Visualization 1: 3D contents of Christmas ornament captured by a camera with different viewing directions.
Visualization 2: 3D contents of Christmas ornament captured by a camera with changing its focus.
