Abstract

A compact see-through three-dimensional head-mounted display (3D-HMD) is proposed and investigated in this paper. Two phase holograms are analytically extracted from the object wavefront and uploaded onto different zones of a spatial light modulator (SLM). A holographic grating is then used as a frequency filter to couple the separated holograms together for wavefront modulation. The developed preliminary prototype has a simple optical layout and a compact structure (133.8 mm × 40.4 mm × 35.4 mm, with a 47.7 mm long viewing accessory). Optical experiments demonstrate that the proposed system can present 3D images to the human eye with full depth cues; it is therefore free of the accommodation-vergence conflict and the associated visual fatigue. The dynamic display ability is also tested in the experiments, which indicates promising potential for true 3D interactive display.

© 2017 Optical Society of America

1. Introduction

The augmented reality (AR) technique superimposes virtual signals onto the viewer’s real-world scene. This new viewing experience allows AR to be applied in various fields, such as navigation, biomedicine, entertainment, and military operations [1,2]. Among existing AR systems, the see-through head-mounted display (HMD) is the most widely used owing to its compact and portable design. However, in conventional see-through HMDs, the micro-displays only deliver two-dimensional (2D) images to the human eyes, and the stereoscopic three-dimensional (3D) perception relies mainly on binocular parallax [3–6]. When wearing such HMDs, the viewer’s eyes focus on the plane of the presented 2D image, while the optical axes of the eyes converge to the simulated 3D depths created by the binocular disparity [5]. The focusing distance and the converging distance therefore differ in these conventional see-through HMDs. This mismatch produces the accommodation-vergence (A-V) conflict, which may cause visual discomfort (eye fatigue, eye irritation, blurry vision, headache, nausea, etc.) to the viewer [6].

In recent years, 3D-HMDs with sufficient depth cues to prevent the A-V conflict have attracted increasing attention. Methods beyond binocular parallax have been introduced into 3D-HMD design, such as integral imaging (II) [7–10], multi-focal display [11], holographic optical elements (HOEs) [12–15], and computer-generated holography (CGH) [16–19]. Hong et al. proposed an integral floating system with a concave half mirror to obtain AR performance [7]. Hua et al. combined microscopic integral imaging with a freeform optical lens [8]. Although II can present multi-view images for 3D display, its modulated depth range trades off against the lateral resolution because of the limitations of II theory. As for the multi-focal display, Lee et al. proposed a compact 3D-HMD that used the birefringence of a Savart plate to produce two projection planes [11]. The 3D effect is realized by adopting depth-fused display or compressive light-field display, but it is difficult to modulate the focused depth cues continuously over a wide range. Owing to the Bragg matching condition, the HOE coupler offers a good see-through effect for AR applications, and it can be combined with the II technique. Hong et al. fabricated an HOE with a micro-lens-array function and designed a 3D see-through AR system [12]. Yeom et al. developed a 3D-HMD with two HOEs as input and output couplers [13], and also studied the astigmatism of waveguide propagation in the glass substrate. Li et al. employed two exposure steps to record an HOE with combined mirror and Fourier-lens functions for see-through AR display [14]. However, the fabrication of high-quality, large-size HOEs remains a major challenge. Since holographic display can present a 3D object in the natural viewing way, it is a promising technique for designing a 3D-HMD without the A-V conflict, and holography-based 3D-HMDs have drawn much interest from researchers.
In such systems, the traditional micro-display is replaced by a spatial light modulator (SLM). Moon et al. utilized an RGB light-emitting diode (LED) light source and CGH to realize a colorful HMD design [16]. Chen et al. improved the layer-based CGH method and then built their HMD [17]. Although the existing holographic 3D-HMD systems have achieved much progress, some problems remain to be overcome: (1) phase-only SLM modulation degrades the reconstructed image quality because of the information loss in the hologram encoding process; (2) iterative computation is time-consuming, so it is hard to achieve the dynamic display needed for interactive AR.

In our previous work, the complex amplitude modulation (CAM) technique was first introduced to the design of a see-through 3D-HMD [20]. Compared with phase-only and iterative methods, CAM uses both amplitude and phase information to reconstruct the object wavefront, so high image quality can be ensured. Moreover, it is free of iterations, which enables real-time modulation. The prior experimental results showed that a CAM-based system can reconstruct the complex wavefront of a 3D object and present high-quality 3D images in real time with full depth cues, making it well suited to true 3D interactive AR display. However, that system employed two amplitude SLMs to accomplish the CAM modulation, so the optical structure was not simple and precise alignment was unavoidable. In addition, the amplitude-mode holograms decreased the energy efficiency.

In this paper, we further propose and investigate a see-through 3D-HMD with a simplified optical layout and a compact structure. The 3D image reconstruction is based on wavefront modulation with a holographic grating filter. Two phase distributions are analytically extracted from the object wavefront and uploaded to different zones of a single SLM. The holographic grating acts as a frequency filter that couples them together to rebuild the complex object wavefront in real time. The proposed 3D-HMD inherits the CAM modulation of our previous system, so it can present true 3D signals with sufficient depth cues to eliminate the A-V conflict. Iteration is also avoided, so the real-time reconstruction offers the potential for interactive display. Moreover, the new system uses only a holographic grating, a 4-f lens system, and one phase SLM; the optical structure can therefore be kept simple and stable for a wearable design, and the phase holograms also improve the energy efficiency.

2. System and principle

For an HMD system to realize 3D wearable and dynamic display, on one hand the system should be compact, light, and efficient; on the other hand, the 3D image should be generated in real time, continuously adjustable in depth, and highly accurate.

The complex wavefront contains all the depth cues of a 3D object, so wavefront modulation methods can be used to design a 3D-HMD system without the A-V conflict. In addition, methods that avoid iterative computation enable dynamic display, an essential requirement for AR applications. Many wavefront modulation methods have been reported in the literature; they can be broadly classified into several categories, such as two-hologram interference [21–24], superpixel methods [25,26], cascaded holograms [27–29], grating filters [30–33], birefringent crystal coupling [34], modified iteration algorithms [35,36], optical-holography-referred methods [37,38], and anamorphic optical transformation [39]. Among them, two-hologram interference, cascaded holograms, and modified iteration algorithms need at least two SLMs to obtain the complex wavefront, so their optical structures are complicated and alignment is difficult. The superpixel and optical-holography-referred methods use extra filtering systems, which also works against the goal of a compact design. The birefringent crystal coupling approach requires a special crystal, making the system expensive, bulky, and heavy. The anamorphic optical transformation method utilizes amplitude holograms and thus lowers the energy efficiency. In contrast, the grating filter method only uses a grating and a 4-f lens system to achieve complex wavefront reconstruction, so the optical structure can be easily simplified and optimized for a compact design. We therefore apply the grating filter method to design the see-through 3D-HMD system.

2.1 The designed see-through 3D head-mounted display

The schematic of the designed see-through 3D-HMD is shown in Fig. 1(a). The laser beam illuminates the SLM through a beam splitter (BS) and then passes to the 4-f system constituted by two lenses, L1 and L2. A grating (G) is placed at the back focal plane of L1 as the frequency filter. The mirror (M) reflects the beam to another BS, which combines the modulated signals with the viewer’s real-world scene for the see-through function. The SLM has 1920 × 1080 pixels with an 8 μm pixel pitch, so the modulation panel is 15.36 mm × 8.64 mm. Based on these fundamental data, the other optical elements are selected to match the SLM, and the size of the 3D-HMD is just 133.8 mm × 40.4 mm × 35.4 mm, with a 47.7 mm long observation accessory. As Fig. 1(a) shows, the proposed system has a simple optical structure, so it can be made compact for wearable applications. Figure 1(b) presents the viewing effect for the AR application. When wearing the designed 3D-HMD, the wearer sees additional 3D instruction signals for different real objects located at distances d1 and d2. As the wearer shifts focus from the Rubik’s cube to the car, the corresponding instructions ‘Rubic’ and ‘Car’ become clear in turn. Because the modulated signals are true 3D with sufficient depth cues, the wearer is free of the A-V conflict problem.


Fig. 1 (a) Schematic of the proposed see-through 3D-HMD system; (b) the wearing effect for the AR application. The two real objects ‘Car’ and ‘Rubic’ are located at distances d1 and d2, together with their modulated 3D instruction signals, respectively.


Fig. 2 Illustration of the wavefront modulation with grating filter.


2.2 The wavefront modulation with grating filter

The grating filter method uses a grating (amplitude or phase) as an additional filter in the frequency domain [30], and Fig. 2 shows its detailed schematic. Two Fourier lenses (FL1 and FL2), each with focal length f, constitute a 4-f system. On the input plane, two holograms are uploaded to different zones of the SLM with a separation distance dh. The grating is placed at the back focal plane of FL1. If the grating period and dh satisfy a certain matching condition, the complex field is obtained at the output plane of the 4-f system. The reconstructed complex wavefront then propagates a distance zo, where the object is imaged with high quality.

The process can be explained mathematically as follows. To begin, we denote the target object wavefront as A exp(iθ), with A and θ representing the amplitude and phase, respectively. The two holograms can be calculated by dividing A exp(iθ) into real and imaginary distributions or into two pure phase terms. Here we adopt the latter approach because it has higher diffraction efficiency and a simpler implementation. The division can be expressed as:

\[
\exp(i\theta_1) + \exp(i\theta_2) = A\exp(i\theta) \tag{1}
\]
where θ1 and θ2 are the two phase holograms, and i = √(−1) is the imaginary unit. After some mathematical deduction, Eq. (1) can be solved for θ1 and θ2:
\[
\begin{cases}
\theta_1 = \theta + \cos^{-1}(A/2)\\
\theta_2 = \theta - \cos^{-1}(A/2)
\end{cases} \tag{2}
\]
where cos−1 represents the arccosine function; the amplitude is normalized so that A ≤ 2, keeping the arccosine argument within its domain.
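As a sanity check, the double-phase split of Eqs. (1) and (2) can be verified numerically. The sketch below (our illustration, not code from the paper) builds a hypothetical 1-D test wavefront and confirms that the sum of the two phase-only holograms reproduces A exp(iθ) to machine precision:

```python
import numpy as np

# Hypothetical 1-D test wavefront (not from the paper): amplitude in (0, 2], phase in radians.
N = 1024
y = np.linspace(-1, 1, N)
A = 0.2 + 0.7 * np.exp(-(y / 0.3) ** 2)        # target amplitude
theta = 0.8 * np.sin(2 * np.pi * y)            # target phase

# Eq. (2): analytic split into two pure-phase holograms (requires A <= 2).
delta = np.arccos(A / 2.0)
theta1 = theta + delta
theta2 = theta - delta

# Eq. (1): the coherent sum of the two phase-only terms rebuilds A*exp(i*theta).
recon = np.exp(1j * theta1) + np.exp(1j * theta2)
target = A * np.exp(1j * theta)
print(np.max(np.abs(recon - target)))          # ~0 to machine precision
```

Because the split is a closed-form per-pixel operation, no iteration is involved, which is the basis of the real-time claim below.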

The input of the 4-f system is the composition of the two holograms. It propagates to the back focal plane of FL1 and is multiplied by the grating. This process is described by:

\[
U(x_g,y_g) = \frac{1}{i\lambda f}\,\mathcal{F}\!\left\{h_1\!\left(x_s,\,y_s-\frac{d_h}{2}\right) + h_2\!\left(x_s,\,y_s+\frac{d_h}{2}\right)\right\} G(x_g,y_g)
= \frac{1}{i\lambda f}\,H\!\left(\frac{x_g}{\lambda f},\frac{y_g}{\lambda f}\right) G(x_g,y_g) \tag{3}
\]
where (xs, ys) and (xg, yg) stand for the coordinates of the SLM plane and the grating plane, respectively; 𝓕 denotes the Fourier transform; h1 and h2 are the two phase holograms separated by the distance dh; H is the frequency spectrum of the holograms; λ is the light wavelength; and G represents the grating. For brevity, we further assume the grating is an amplitude cosine grating, given by:
\[
G(x_g,y_g) = \frac{1}{2} + \frac{m}{2}\cos\!\left(\frac{2\pi y_g}{\Delta}\right) \tag{4}
\]
where m is the modulation depth of the grating and Δ denotes the grating period; the aperture function of the grating is omitted for brevity. The product in Eq. (3) then diffracts to the back focal plane of FL2, as described by:
\[
\begin{aligned}
U(x_c,y_c) &= \frac{1}{i\lambda f}\,\mathcal{F}\!\left\{\frac{1}{i\lambda f} H\!\left(\frac{x_g}{\lambda f},\frac{y_g}{\lambda f}\right)\left[\frac{1}{2} + \frac{m}{2}\cos\!\left(\frac{2\pi y_g}{\Delta}\right)\right]\right\}\\
&= \frac{1}{2}\left[h_1\!\left(x_c,\,y_c-\frac{d_h}{2}\right)+h_2\!\left(x_c,\,y_c+\frac{d_h}{2}\right)\right]\\
&\quad + \frac{m}{4}\left[h_1\!\left(x_c,\,y_c-\frac{d_h}{2}+\frac{\lambda f}{\Delta}\right)+h_2\!\left(x_c,\,y_c+\frac{d_h}{2}-\frac{\lambda f}{\Delta}\right)\right]\\
&\quad + \frac{m}{4}\left[h_1\!\left(x_c,\,y_c-\frac{d_h}{2}-\frac{\lambda f}{\Delta}\right)+h_2\!\left(x_c,\,y_c+\frac{d_h}{2}+\frac{\lambda f}{\Delta}\right)\right] \tag{5}
\end{aligned}
\]
where (xc, yc) represents the coordinates of the output plane. In Eq. (5), if the hologram separation dh and the grating period Δ satisfy the condition:
\[
d_h = \frac{2\lambda f}{\Delta} \tag{6}
\]
then the second term of Eq. (5) recombines the two holograms on axis, and Eq. (5) can be rewritten using Eq. (1) as:
\[
U(x_c,y_c) = \cdots + \frac{m}{4}\left[h_1(x_c,y_c)+h_2(x_c,y_c)\right] + \cdots
= \cdots + \frac{m}{4}\left[\exp(i\theta_1)+\exp(i\theta_2)\right] + \cdots
= \cdots + \frac{m}{4}A\exp(i\theta) + \cdots \tag{7}
\]
where the second term is the reconstructed target wavefront A exp(iθ); the other terms are omitted for simplicity. The reconstructed A exp(iθ) then propagates a distance zo, and the object distribution O(x, y) is given by the Fresnel diffraction integral [40]:
\[
O(x,y) = \frac{\exp(ikz_o)}{i\lambda z_o}\iint U(x_c,y_c)\exp\!\left\{\frac{ik}{2z_o}\left[(x-x_c)^2+(y-y_c)^2\right]\right\}\,dx_c\,dy_c \tag{8}
\]
where k = 2π/λ is the wave number.
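The grating-filter pipeline of Eqs. (3)–(7) can be sketched in one dimension with FFTs. In this illustration (ours, with dimensionless hypothetical parameters), the cosine grating in the Fourier plane is modeled through the shift theorem: multiplying the spectrum by cos(2π f s) creates copies of the input displaced by ±s, so choosing s = dh/2 (the discrete analogue of the matching condition in Eq. (6)) overlaps h1 and h2 on the optical axis:

```python
import numpy as np

N = 4096
x = np.arange(N) - N // 2

# Hypothetical target wavefront, nonzero only in a central window.
win = np.abs(x) < 256
A = np.where(win, 0.3 + 0.6 * np.exp(-(x / 120.0) ** 2), 0.0)
theta = np.where(win, 0.7 * np.sin(2 * np.pi * x / 311.0), 0.0)

# Eq. (2): the two pure-phase holograms, restricted to the window.
h1 = np.where(win, np.exp(1j * (theta + np.arccos(A / 2))), 0)
h2 = np.where(win, np.exp(1j * (theta - np.arccos(A / 2))), 0)

# Input plane: holograms uploaded side by side, separated by dh samples.
dh = 1200
u_in = np.roll(h1, +dh // 2) + np.roll(h2, -dh // 2)

# Fourier plane: cosine grating sampled on the FFT frequency grid (Eq. (4)).
m = 1.0
freq = np.fft.fftfreq(N)
G = 0.5 + (m / 2) * np.cos(2 * np.pi * freq * (dh / 2))

# Output plane (Eq. (5)): a DC term plus shifted copies; the +/- dh/2 shifts
# bring h1 and h2 back on axis, where they sum to (m/4) * A * exp(i*theta).
u_out = np.fft.ifft(np.fft.fft(u_in) * G)
target = (m / 4) * A * np.exp(1j * theta)
err = np.max(np.abs(u_out - target)[win])
```

Inside the central window, `u_out` matches the target up to numerical precision; the unwanted DC and higher-order copies land outside the window, corresponding to the omitted terms of Eq. (7).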

2.3 Fabrication of the holographic grating

The cosine grating is a key element in the proposed 3D-HMD system; it is employed as the frequency filter that couples the separated holograms. There are several ways to fabricate such a grating, but here we use a holographic grating because of its simple production process. Figure 3(a) illustrates the interference fabrication process. Two plane-wave beams illuminate a glass substrate coated with a silver-halide emulsion. The angle between the two beams is β, and their wave vectors are k1 and k2, respectively. The interference exposure can be expressed by:

\[
I = \left|\exp(i\mathbf{k}_1\cdot\mathbf{r}) + \exp(i\mathbf{k}_2\cdot\mathbf{r})\right|^2 = 2 + 2\cos\!\left[(\mathbf{k}_1-\mathbf{k}_2)\cdot\mathbf{r}\right] \tag{9}
\]
where I is the intensity and r denotes the position vector. The grating period is the distance between adjacent fringe peaks, given by:


Fig. 3 (a) Illustration of the holographic grating fabrication, (b) a fabricated example.


\[
\Delta = \frac{\lambda}{2\sin(\beta/2)} \tag{10}
\]

From Eqs. (6) and (10), we can see that the hologram separation dh, the grating period Δ, and the interference angle β are linked together, which provides degrees of freedom to optimize the proposed system. Figure 3(b) shows a fabricated cosine grating example; its parameters are Δ = 0.01 mm, λ = 532.8 nm, and β = 3.1°.
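Equation (10) can be checked against the quoted fabrication parameters. A minimal sketch (ours):

```python
import math

lam = 532.8e-9                      # exposure wavelength (m)
beta = math.radians(3.1)            # angle between the two plane-wave beams

# Eq. (10): fringe period of the two-beam interference pattern.
delta = lam / (2 * math.sin(beta / 2))
print(f"grating period = {delta * 1e6:.2f} um")   # about 9.85 um, i.e. ~0.01 mm
```

The result is consistent with the Δ = 0.01 mm example of Fig. 3(b).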

3. Optical experiments and discussions

3.1 Prototype of the designed see-through 3D head-mounted display

The monocular prototype of the designed 3D-HMD was developed with the aid of 3D printing; its fabrication material is polylactic acid plastic. As shown in Fig. 4(a), the size of the fabricated prototype is 133.8 mm × 40.4 mm × 35.4 mm, with a 47.7 mm long viewing accessory. The SLM is a Holoeye Pluto with an 8 μm pixel pitch, 1920 × 1080 pixels, and a phase modulation range of [0, 2π]. The other optical elements are off-the-shelf components, assembled as detailed in Fig. 4(a). The SLM is placed close to a beam splitter (BS) of 25.4 mm × 25.4 mm. Two lenses (L1 and L2), both with 45 mm focal length, constitute the 4-f system, and the fabricated grating (G) is arranged at the back focal plane of L1. Another BS serves as the viewing window. As Fig. 4(b) shows, the exit pupil distance is lz = 10 mm and the eyebox is 20.1 mm × 20.1 mm. The module weight, including the inner optics, is 196.3 g, so it is compact enough for wearable use. Figure 4(c) presents the wearing of the designed monocular 3D-HMD. Its simple optical structure provides convenient modulation for displaying true 3D images, eliminating the A-V conflict problem.


Fig. 4 The developed 3D-HMD prototype: (a) its detailed configuration, (b) the assembled module, and (c) the wearing performance.


Optical experiments were performed to test the developed 3D-HMD prototype; the testing setup is shown in Fig. 5. In the experiments, the illuminating beam comes from a green laser at 532.8 nm with a 0.6 nm spectral width to ensure high coherence, and the module is fixed on a platform by two holders. To obtain a better display effect, a band-pass filter is introduced in the optical path behind the grating. Since the focal length of L1 is 45 mm and the SLM modulation panel is 15.36 mm × 8.64 mm, the grating period Δ is chosen as 50 μm according to Eqs. (6) and (10) for better modulation. Two real objects, O1 (sunflower) and O2 (monkey), are located at d1 and d2, respectively, to check the see-through AR performance of the prototype at multiple depths. Since the exit pupil distance lz is 10 mm and the viewing window is 20.1 mm × 20.1 mm, the field of view (FOV) is 31.7° (vertical) × 31.7° (horizontal) for the real-world scene. A Canon D5 camera with a focus range from 16 cm to 35 cm is placed at the exit pupil to record the modulated 3D images.


Fig. 5 The testing experimental facility with two real objects (sunflower and monkey).


The two holograms are uploaded to the SLM with a separation distance dh. According to Eq. (6), the calculated dh is about 9.59 mm (1199 SLM pixels). Because of assembly errors, dh needs to be calibrated in practice, and the final amended dh is 9.408 mm (1176 SLM pixels). Since the SLM panel is 1920 × 1080 pixels, the size of each displayed zone is set to 512 × 1024 pixels (4.096 mm × 8.192 mm). The system FOV is illustrated in Fig. 6(a), and a hologram example is given in Fig. 6(b). In Fig. 6(a), FOVr represents the FOV for the real-world scene, while FOVs is the FOV for the 3D images, which is determined by the pixel size of the SLM: FOVs = ± arcsin(λ/2p) = ± 1.9°, where p = 8 μm is the pixel pitch. It should be pointed out that FOVs is still small for 3D viewing and cannot fully cover FOVr. An SLM with a smaller pixel pitch can be applied to enlarge the FOV of the 3D images; if the pixel size reaches 1.0 μm in a future design, FOVs will be enlarged to about ± 15°, so this limitation can be removed for wide-FOV display.
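The FOV figures quoted above follow directly from the diffraction half-angle of the SLM pixel grid. A small sketch (ours) reproduces both the current ± 1.9° and the projected ± 15° values:

```python
import math

lam = 532.8e-9   # illumination wavelength (m)

def half_fov_deg(pixel_pitch_m):
    """Half-angle of the SLM diffraction cone: arcsin(lambda / (2 p))."""
    return math.degrees(math.asin(lam / (2 * pixel_pitch_m)))

print(half_fov_deg(8e-6))   # ~1.9 deg for the 8 um Holoeye Pluto pitch
print(half_fov_deg(1e-6))   # ~15.5 deg for a hypothetical 1 um pitch
```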


Fig. 6 (a) Illustration of the system FOV; the exit pupil distance and the distance to the virtual SLM are both 10 mm, the BS coupler is 25.4 mm, and the viewing window is 20.1 mm. (b) An example of a calibrated hologram.


The detailed parameters of the prototype are collected in Table 1; they are the premise for the following optical experiments.


Table 1. Parameters of the prototype.

3.2 Experimental results

First, we tested the prototype on 3D image reconstruction. Two characters, ‘3’ and ‘D’, are used as the target 3D images at different depths; their distances to the SLM plane are d1 = 134 mm and d2 = 184 mm, respectively. Figure 7 shows the reconstructed images of the target complex wavefront at different focal depths: the characters ‘3’ and ‘D’ come into focus in succession at 134 mm and 184 mm, respectively. Meanwhile, the two real objects (sunflower and monkey) are placed at the same depths as ‘3’ and ‘D’. From Figs. 7(a) and 7(b), we can see that the reconstructed images focus and blur in the same way as the real objects at the corresponding depths. These results demonstrate the see-through 3D display nature of the proposed system; the prototype can therefore avoid the A-V conflict and visual fatigue.


Fig. 7 Reconstructed 3D images at different focused depths, (a) 3, d1 = 134mm, and (b) D, d2 = 184mm.


We then tested the prototype on continuous depth expression. As Fig. 8(a) shows, five letters, ‘d’, ‘e’, ‘p’, ‘t’, and ‘h’, are used as the displayed 3D images, located successively from 134 mm to 214 mm at 20 mm intervals. The two real objects are again included in the scene, at depths of 134 mm and 184 mm, to better show the see-through 3D performance. As the camera focus is zoomed, the reconstructed complex field comes into focus sequentially from letter ‘d’ to letter ‘h’. At the same time, Figs. 8(b)-8(f) show that the ‘sunflower’ becomes more and more blurred, while the ‘monkey’ first sharpens and then blurs. These results demonstrate that the proposed system can present signals with continuous depths; the adjustable depths cover 134 mm to 214 mm, matching the camera zoom range. Technically speaking, the modulation depth range is mainly determined by the coherence length of the laser source: the more monochromatic the source, the wider the achievable depth range, so a sufficiently narrow-linewidth laser provides a depth range that is quite sufficient for human vision. The zooming process was checked in real time and recorded in a video, attached as Visualization 1, from which it can be seen that the reconstructed signals have continuous depths.


Fig. 8 Focused images at different depths (Visualization 1), (a) spatial distribution of the 3D signals with 20mm intervals, and (b)-(f) are the focused images at 134mm, 154mm, 174mm, 194mm and 214mm, respectively.


Since the wavefront modulation method is free of iteration, the holograms can be calculated in real time, which gives the designed prototype dynamic 3D see-through display ability. Optical experiments were performed to test the dynamic display effect. The dynamic 3D signal is a rotating dice; as it rotates, the corresponding holograms are calculated on the fly. We set the refresh frame rate to 24 fps, within the 60 fps rate of the SLM. The reconstruction distance is set to 144 mm. To better present the see-through effect, a real dice is also placed at the same focal depth as the 3D signal. The dynamic scenes are recorded by the camera; six frames extracted from the video (Visualization 2) are shown in Figs. 9(a)-9(f). The rotating dice is reconstructed with good quality, in spite of some background noise mainly produced by the random phases. The proposed system is thus well suited to interactive display devices.


Fig. 9 Dynamic 3D see-through display (Visualization 2), (a)-(f) are the extracted frames.


3.3 Discussions

The test results showed that the prototype can present 3D images with continuous depth cues, which is the essential condition for eliminating the A-V conflict. However, the system FOV is still small, especially for the 3D images (FOV = ± arcsin(λ/2p) = ± 1.9°), and this limits the 3D viewing perception to a certain degree. Decreasing the SLM pixel size can enlarge the system FOV: if the pixel size is reduced to about 1 μm, the 3D FOV can be increased to ± 15°. We can also use 4-f lenses with different focal lengths to obtain a suitable image magnification, which may enlarge the FOV further. In addition, the clarity of the 3D images still needs improvement. Several factors may contribute: first, although the holographic grating produces the duplications that are coupled together for wavefront reconstruction, it also brings other unwanted diffraction orders that degrade the image quality; second, the random phase used in hologram generation to enhance the depth effect introduces computational speckle noise; third, the lens aberrations of the 4-f system degrade the imaging quality; finally, ambient dust is unavoidable during the recording of the 3D signals.

Considering potential AR applications, interactivity is one of the most important aspects, and the signal generation rate and display frame rate should be as high as possible. In the proposed system, the holograms are extracted analytically from the target object wavefront according to Eq. (2), so a high generation rate can be achieved for dynamic display. We computed holograms at different resolutions (computing platform: Intel i7-4470k CPU, 3.5 GHz, 16 GB RAM); the measured computing times and frame rates are listed in Table 2. When the hologram resolution is 1200 × 2400 × 2 pixels (one computation produces two holograms), the frame rate is still 24 fps. In our experiments, the image resolution used is 512 × 1024 pixels, so the theoretically supported frame rate reaches 143 fps, much faster than the SLM refresh rate. If a higher-performance CPU or a GPU is employed for hologram generation, the rate can be increased further. The proposed 3D-HMD is therefore well suited to dynamic and interactive display.
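The analytic extraction of Eq. (2) is a closed-form per-pixel operation, which is why no iteration-related bottleneck appears. The sketch below (ours; timings are hardware-dependent and only the phase-split step is measured, not any propagation to the SLM plane) estimates the achievable frame rate on the experimental resolution:

```python
import time
import numpy as np

H, W = 512, 1024                          # resolution used in the experiments
rng = np.random.default_rng(0)
A = rng.uniform(0.0, 2.0, (H, W))         # hypothetical amplitude field, A <= 2
theta = rng.uniform(-np.pi, np.pi, (H, W))

t0 = time.perf_counter()
n = 20
for _ in range(n):
    d = np.arccos(A / 2)                  # Eq. (2): one arccos per pixel
    theta1, theta2 = theta + d, theta - d # the two phase holograms
dt = (time.perf_counter() - t0) / n
print(f"{dt * 1e3:.2f} ms per frame, ~{1 / dt:.0f} fps")
```

Because every pixel is independent, the same computation maps directly onto a GPU, supporting the speed-up claim above.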


Table 2. Computing times and Frame rates for different resolutions.

In the current prototype, collimation optics is not included, so external illumination is needed for display, which weakens the system integrity to a certain degree. A miniature laser source and optimized resin optical elements can be employed in the future to improve the prototype, and better optical components can further refine the system structure. The prototype can thus be made more lightweight and compact, preventing the above problems.

4. Summary

We have proposed and developed a see-through true 3D-HMD system based on wavefront modulation with a holographic grating filter. Two phase holograms are analytically extracted from the 3D object wavefront and uploaded to different zones of one SLM. The holographic grating, produced by the interference exposure of two plane-wave beams at a specific angle, is employed as a frequency filter to couple the duplications of the separated holograms and display the 3D object. The prototype was fabricated with the aid of 3D printing and has a compact, wearable configuration. Optical experiments demonstrated that true 3D images can be reconstructed in real time with full depth information. The designed 3D-HMD is free of the A-V conflict and visual fatigue problems. This investigation provides a promising route toward designing and realizing true 3D interactive display devices, and the proposed method and system may open up new applications for see-through 3D-HMD research.

This preliminary study encourages us to continue improving the optical configuration and optimizing the modulation algorithm. A partially coherent algorithm will be investigated to improve the 3D-HMD design. We will also address the engineering issues to refine the assembly and finally achieve real-time, full-color, interactive display in future work.

Funding

National Natural Science Foundation of China (NSFC) (61575024, 61235002, 61420106014); Program 973 (2013CB328801, 2013CB328806); Program 863 (2015AA015905).

Acknowledgments

The authors thank the reviewers for giving pertinent comments, valuable questions and constructive suggestions on this work.

References and Links

1. J. Carmigniani, B. Furht, M. Anisetti, P. Ceravolo, E. Damiani, and M. Ivkovic, “Augmented reality technologies, system and applications,” Multimedia Tools Appl. 51(1), 341–377 (2011). [CrossRef]  

2. I. Rabbi and S. Ullah, “A survey on augmented reality challenges and tracking,” Acta Graph. 24(1–2), 29–46 (2013).

3. S. J. Watt, K. Akeley, M. O. Ernst, and M. S. Banks, “Focus cues affect perceived depth,” J. Vis. 5(10), 834–862 (2005). [CrossRef]   [PubMed]  

4. D. M. Hoffman, A. R. Girshick, K. Akeley, and M. S. Banks, “Vergence-accommodation conflicts hinder visual performance and cause visual fatigue,” J. Vis. 8(3), 33 (2008). [CrossRef]   [PubMed]  

5. J. Hong, Y. Kim, H. J. Choi, J. Hahn, J. H. Park, H. Kim, S. W. Min, N. Chen, and B. Lee, “Three-dimensional display technologies of recent interest: principles, status, and issues [Invited],” Appl. Opt. 50(34), H87–H115 (2011). [CrossRef]   [PubMed]  

6. P. V. Johnson, J. A. Parnell, J. Kim, C. D. Saunter, G. D. Love, and M. S. Banks, “Dynamic lens and monovision 3D displays to improve viewer comfort,” Opt. Express 24(11), 11808–11827 (2016). [CrossRef]   [PubMed]  

7. J. Hong, S. W. Min, and B. Lee, “Integral floating display systems for augmented reality,” Appl. Opt. 51(18), 4201–4209 (2012). [CrossRef]   [PubMed]  

8. H. Hua and B. Javidi, “A 3D integral imaging optical see-through head-mounted display,” Opt. Express 22(11), 13484–13491 (2014). [CrossRef]   [PubMed]  

9. Y. Takaki and Y. Yamaguchi, “Flat-panel see-through three-dimensional display based on integral imaging,” Opt. Lett. 40(8), 1873–1876 (2015). [CrossRef]   [PubMed]  

10. H. Deng, Q. Wang, Z. Xiong, H. Zhang, and Y. Xing, “Magnified augmented reality 3D display based on integral imaging,” Optik (Stuttg.) 127(10), 4250–4253 (2016). [CrossRef]  

11. C. K. Lee, S. Moon, S. Lee, D. Yoo, J. Y. Hong, and B. Lee, “Compact three-dimensional head-mounted display system with Savart plate,” Opt. Express 24(17), 19531–19544 (2016).

12. K. Hong, J. Yeom, C. Jang, J. Hong, and B. Lee, “Full-color lens-array holographic optical element for three-dimensional optical see-through augmented reality,” Opt. Lett. 39(1), 127–130 (2014).

13. H. J. Yeom, H. J. Kim, S. B. Kim, H. Zhang, B. Li, Y. M. Ji, S. H. Kim, and J. H. Park, “3D holographic head mounted display using holographic optical elements with astigmatism aberration compensation,” Opt. Express 23(25), 32025–32034 (2015).

14. G. Li, D. Lee, Y. Jeong, J. Cho, and B. Lee, “Holographic display for see-through augmented reality using mirror-lens holographic optical element,” Opt. Lett. 41(11), 2486–2489 (2016).

15. C. Jang, C. K. Lee, J. Jeong, G. Li, S. Lee, J. Yeom, K. Hong, and B. Lee, “Recent progress in see-through three-dimensional displays using holographic optical elements [Invited],” Appl. Opt. 55(3), A71–A85 (2016).

16. E. Moon, M. Kim, J. Roh, H. Kim, and J. Hahn, “Holographic head-mounted display with RGB light emitting diode light source,” Opt. Express 22(6), 6526–6534 (2014).

17. J. S. Chen and D. P. Chu, “Improved layer-based method for rapid hologram generation and real-time interactive holographic display applications,” Opt. Express 23(14), 18143–18155 (2015).

18. Z. Chen, X. Sang, Q. Liu, J. Li, X. Yu, X. Gao, B. Yan, K. Wang, C. Yu, and S. Xie, “A see-through holographic head-mounted display with the large viewing angle,” Opt. Commun. 384, 125–129 (2017).

19. Z. Chen, X. Sang, Q. Lin, J. Li, X. Yu, X. Gao, B. Yan, C. Yu, W. Dou, and L. Xiao, “Acceleration for computer-generated hologram in head-mounted display with effective diffraction area recording method for eyes,” Chin. Opt. Lett. 14(8), 080901 (2016).

20. Q. Gao, J. Liu, J. Han, and X. Li, “Monocular 3D see-through head-mounted display via complex amplitude modulation,” Opt. Express 24(15), 17372–17383 (2016).

21. J. Amako, H. Miura, and T. Sonehara, “Wave-front control using liquid-crystal devices,” Appl. Opt. 32(23), 4323–4329 (1993).

22. R. Shi, J. Liu, J. Xu, D. Liu, Y. Pan, J. Xie, and Y. Wang, “Designing and fabricating diffractive optical elements with a complex profile by interference,” Opt. Lett. 36(20), 4053–4055 (2011).

23. H. Zhao, J. Liu, R. Xiao, X. Li, R. Shi, P. Liu, H. Zhong, B. Zou, and Y. Wang, “Modulation of optical intensity on curved surfaces and its application to fabricate DOEs with arbitrary profile by interference,” Opt. Express 21(4), 5140–5148 (2013).

24. Y. Qi, C. Chang, and J. Xia, “Speckleless holographic display by complex modulation based on double-phase method,” Opt. Express 24(26), 30368–30378 (2016).

25. P. M. Birch, R. Young, D. Budgett, and C. Chatwin, “Two-pixel computer-generated hologram with a zero-twist nematic liquid-crystal spatial light modulator,” Opt. Lett. 25(14), 1013–1015 (2000).

26. V. Bagnoud and J. D. Zuegel, “Independent phase and amplitude control of a laser beam by use of a single-phase-only spatial light modulator,” Opt. Lett. 29(3), 295–297 (2004).

27. A. Jesacher, C. Maurer, A. Schwaighofer, S. Bernet, and M. Ritsch-Marte, “Near-perfect hologram reconstruction with a spatial light modulator,” Opt. Express 16(4), 2597–2603 (2008).

28. A. Jesacher, C. Maurer, A. Schwaighofer, S. Bernet, and M. Ritsch-Marte, “Full phase and amplitude control of holographic optical tweezers with high efficiency,” Opt. Express 16(7), 4479–4486 (2008).

29. L. Zhu and J. Wang, “Arbitrary manipulation of spatial amplitude and phase using phase-only spatial light modulators,” Sci. Rep. 4, 7441 (2014).

30. J. P. Liu, W. Y. Hsieh, T. C. Poon, and P. Tsang, “Complex Fresnel hologram display using a single SLM,” Appl. Opt. 50(34), H128–H135 (2011).

31. H. Song, G. Sung, S. Choi, K. Won, H. S. Lee, and H. Kim, “Optimal synthesis of double-phase computer generated holograms using a phase-only spatial light modulator with grating filter,” Opt. Express 20(28), 29844–29853 (2012).

32. S. Choi, J. Roh, H. Song, G. Sung, J. An, W. Seo, K. Won, J. Ungnapatanin, M. Jung, Y. Yoon, H. S. Lee, C. H. Oh, J. Hahn, and H. Kim, “Modulation efficiency of double-phase hologram complex light modulation macro-pixels,” Opt. Express 22(18), 21460–21470 (2014).

33. T. Le, Y. Piao, J. Kim, and N. Kim, “Quality enhancement of a complex holographic display using a single spatial light modulator and a circular grating,” J. Opt. Soc. Korea 20(1), 70–77 (2016).

34. S. Reichelt, R. Häussler, G. Fütterer, N. Leister, H. Kato, N. Usukura, and Y. Kanbayashi, “Full-range, complex spatial light modulator for real-time holography,” Opt. Lett. 37(11), 1955–1957 (2012).

35. C. Chang, J. Xia, L. Yang, W. Lei, Z. Yang, and J. Chen, “Speckle-suppressed phase-only holographic three-dimensional display based on double-constraint Gerchberg-Saxton algorithm,” Appl. Opt. 54(23), 6994–7001 (2015).

36. Y. Qi, C. Chang, and J. Xia, “Accurate complex modulation by the iterative spatial cross-modulation method,” Chin. Opt. Lett. 15(2), 020901 (2017).

37. X. Li, J. Liu, J. Jia, Y. Pan, and Y. Wang, “3D dynamic holographic display by modulating complex amplitude experimentally,” Opt. Express 21(18), 20577–20587 (2013).

38. G. Xue, J. Liu, X. Li, J. Jia, Z. Zhang, B. Hu, and Y. Wang, “Multiplexing encoding method for full-color dynamic 3D holographic display,” Opt. Express 22(15), 18473–18482 (2014).

39. H. Kim, C. Y. Hwang, K. S. Kim, J. Roh, W. Moon, S. Kim, B. R. Lee, S. Oh, and J. Hahn, “Anamorphic optical transformation of an amplitude spatial light modulator to a complex spatial light modulator with square pixels [Invited],” Appl. Opt. 53(27), G139–G146 (2014).

40. J. W. Goodman, Introduction to Fourier Optics (Roberts & Company Publishers, 2005).

Supplementary Material (2)

» Visualization 1: MOV (807 KB), focused images at different depths
» Visualization 2: MOV (622 KB), dynamic 3D see-through display


Figures (9)

Fig. 1 (a) Schematic of the proposed see-through 3D-HMD system, (b) the wearing effect for the AR application. The two real objects ‘Car’ and ‘Rubic’ are located at distances d1 and d2, respectively, together with the modulated 3D instruction signals.
Fig. 2 Illustration of the wavefront modulation with the grating filter.
Fig. 3 (a) Illustration of the holographic grating fabrication, (b) a fabricated example.
Fig. 4 The developed 3D-HMD prototype: (a) the detailed configuration, (b) the assembled module, and (c) the wearing performance.
Fig. 5 The experimental test facility with two real objects (sunflower and monkey).
Fig. 6 (a) Illustration of the system FOV; the exit pupil and the distance to the virtual SLM are both 10 mm, the BS coupler is 25.4 mm, and the viewing window is 20.1 mm. (b) An example of a calibrated hologram.
Fig. 7 Reconstructed 3D images at different focused depths: (a) ‘3’ at d1 = 134 mm, and (b) ‘D’ at d2 = 184 mm.
Fig. 8 Focused images at different depths (Visualization 1): (a) spatial distribution of the 3D signals with 20 mm intervals; (b)-(f) the focused images at 134 mm, 154 mm, 174 mm, 194 mm and 214 mm, respectively.
Fig. 9 Dynamic 3D see-through display (Visualization 2): (a)-(f) the extracted frames.

Tables (2)

Table 1 Parameters of the prototype.

Table 2 Computing times and frame rates for different resolutions.

Equations (10)


\[ \exp(i\theta_1) + \exp(i\theta_2) = A\exp(i\theta) \tag{1} \]
\[ \begin{cases} \theta_1 = \theta + \cos^{-1}(A/2) \\ \theta_2 = \theta - \cos^{-1}(A/2) \end{cases} \tag{2} \]
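As a concrete illustration of the double-phase decomposition above, the following NumPy sketch (an assumption here: the field is normalized so its peak amplitude is 2, which keeps the arccosine defined) extracts the two phase holograms and verifies that their phase-only sum reproduces the complex field:

```python
import numpy as np

# Double-phase decomposition: A*exp(i*theta) = exp(i*theta1) + exp(i*theta2),
# with theta_{1,2} = theta +/- arccos(A/2), valid for 0 <= A <= 2.
def double_phase(field):
    amp = np.abs(field)
    amp = 2.0 * amp / amp.max()      # normalization assumption: peak amplitude -> 2
    theta = np.angle(field)
    delta = np.arccos(amp / 2.0)
    return theta + delta, theta - delta

# Check on a random complex wavefront: the sum of the two phase-only
# components equals the normalized target field.
rng = np.random.default_rng(0)
target = rng.standard_normal((8, 8)) + 1j * rng.standard_normal((8, 8))
theta1, theta2 = double_phase(target)
recon = np.exp(1j * theta1) + np.exp(1j * theta2)
```

Each of `theta1` and `theta2` is phase-only, so both can be displayed on separate zones of a phase SLM; the recombination of the two components is what the grating filter performs optically.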
\[ U(x_g, y_g) = \frac{1}{i\lambda f}\,\mathrm{Ft}\!\left\{ h_1\!\left(x_s,\, y_s - \frac{d_h}{2}\right) + h_2\!\left(x_s,\, y_s + \frac{d_h}{2}\right) \right\} G(x_g, y_g) = \frac{1}{i\lambda f}\, H\!\left(\frac{x_g}{\lambda f}, \frac{y_g}{\lambda f}\right) G(x_g, y_g) \tag{3} \]
\[ G(x_g, y_g) = \frac{1}{2} + \frac{m}{2}\cos\!\left(\frac{2\pi y_g}{\Delta}\right) \tag{4} \]
\[ \begin{aligned} U(x_c, y_c) &= \frac{1}{i\lambda f}\,\mathrm{Ft}\!\left\{ \frac{1}{i\lambda f}\, H\!\left(\frac{x_g}{\lambda f}, \frac{y_g}{\lambda f}\right) \left[ \frac{1}{2} + \frac{m}{2}\cos\!\left(\frac{2\pi y_g}{\Delta}\right) \right] \right\} \\ &= \frac{1}{2}\left[ h_1\!\left(x_c,\, y_c - \frac{d_h}{2}\right) + h_2\!\left(x_c,\, y_c + \frac{d_h}{2}\right) \right] \\ &\quad + \frac{m}{4}\left[ h_1\!\left(x_c,\, y_c - \frac{d_h}{2} + \frac{\lambda f}{\Delta}\right) + h_2\!\left(x_c,\, y_c + \frac{d_h}{2} - \frac{\lambda f}{\Delta}\right) \right] \\ &\quad + \frac{m}{4}\left[ h_1\!\left(x_c,\, y_c - \frac{d_h}{2} - \frac{\lambda f}{\Delta}\right) + h_2\!\left(x_c,\, y_c + \frac{d_h}{2} + \frac{\lambda f}{\Delta}\right) \right] \end{aligned} \tag{5} \]
\[ d_h = \frac{2\lambda f}{\Delta} \tag{6} \]
\[ U(x_c, y_c) = \cdots + \frac{m}{4}\left[ h_1(x_c, y_c) + h_2(x_c, y_c) \right] + \cdots = \cdots + \frac{m}{4}\left[ \exp(i\theta_1) + \exp(i\theta_2) \right] + \cdots = \cdots + \frac{m}{4} A \exp(i\theta) + \cdots \tag{7} \]
\[ O(x, y) = \frac{1}{i\lambda z_o} \exp(ik z_o) \iint U(x_c, y_c) \exp\!\left\{ \frac{ik}{2 z_o}\left[ (x - x_c)^2 + (y - y_c)^2 \right] \right\} \mathrm{d}x_c\, \mathrm{d}y_c \tag{8} \]
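The Fresnel diffraction integral above is commonly evaluated numerically in its FFT-based transfer-function form. The sketch below uses that equivalent formulation; the sampling step and propagation distance are assumed example values, not parameters taken from the prototype:

```python
import numpy as np

# Fresnel propagation via the transfer-function method: multiply the field's
# spectrum by H(fx, fy) = exp(i*k*z) * exp(-i*pi*lambda*z*(fx^2 + fy^2)).
def fresnel_propagate(u0, wavelength, z, dx):
    n = u0.shape[0]
    fx = np.fft.fftfreq(n, d=dx)          # spatial-frequency grid
    fxx, fyy = np.meshgrid(fx, fx)
    k = 2.0 * np.pi / wavelength
    h = np.exp(1j * k * z) * np.exp(-1j * np.pi * wavelength * z * (fxx**2 + fyy**2))
    return np.fft.ifft2(np.fft.fft2(u0) * h)

# Example: propagate a random modulated wavefront by 134 mm (the d1 depth
# used in the experiments); 8 um sampling is an assumption.
rng = np.random.default_rng(1)
u0 = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
u1 = fresnel_propagate(u0, 532e-9, 0.134, 8e-6)
```

Because the transfer function has unit magnitude, the propagation is energy-conserving, which is a convenient sanity check on the implementation.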
\[ I = \left| \exp(i \mathbf{k}_1 \cdot \mathbf{r}) + \exp(i \mathbf{k}_2 \cdot \mathbf{r}) \right|^2 = 2 + 2\cos\!\left[ (\mathbf{k}_1 - \mathbf{k}_2) \cdot \mathbf{r} \right] \tag{9} \]
\[ \Delta = \frac{\lambda}{2 \sin(\beta/2)} \tag{10} \]
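The grating-period relation above fixes the fabrication geometry. A minimal numeric sketch, with the hologram separation taken as d_h = 2λf/Δ as the derivation implies; the wavelength, recording angle β, and focal length f are assumed example values, not the prototype's actual parameters:

```python
import math

wavelength = 532e-9        # recording wavelength (assumption)
beta = math.radians(3.0)   # full angle between the two interfering beams (assumption)
f = 0.1                    # Fourier-lens focal length, 100 mm (assumption)

# Grating period from the two-beam interference geometry.
grating_period = wavelength / (2.0 * math.sin(beta / 2.0))

# Required separation of the two phase holograms on the SLM plane,
# d_h = 2*lambda*f/Delta, which simplifies to 4*f*sin(beta/2).
hologram_separation = 2.0 * wavelength * f / grating_period

print(f"period = {grating_period * 1e6:.1f} um, "
      f"separation = {hologram_separation * 1e3:.2f} mm")
```

Note that because Δ scales as 1/sin(β/2), the separation d_h depends only on the recording angle and the focal length, not on the wavelength itself.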
