A novel time-multiplexed multi-view three-dimensional (3D) display has been implemented using a projector array as the image source and an angular steering-screen module to generate multiple high-density horizontal views. The liquid crystal (LC)-based steering screen was specially developed to deflect light beams over a small angular range and to operate in synchronism with the projector array via a customized FPGA driver. The prototype produces vivid color 3D scenes with smooth parallax for multiple viewers. The experimental results verify the proposed multi-projection time-multiplexed multi-view 3D display method, which uses a steering screen to produce dense views. Both static and dynamic 3D content are displayed on the implemented 36-view prototype. Crosstalk measurements are presented and analyzed to evaluate the display performance.
© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
1. Introduction
Current developments in information technology and computer science have driven an increasing demand for 3D displays in recent years. Most current commercial 3D displays are glasses-based; these include 3D movies, 3D TV and virtual reality (VR)/augmented reality (AR) headsets. However, as these products require special goggles to direct the stereoscopic images to the left and right eyes separately, they are generally inconvenient for users. Prolonged viewing may also lead to discomfort due to the accommodation/convergence (A/C) conflict, where the eyes converge at the apparent distance of an object but focus at the screen. To overcome these drawbacks, many kinds of glasses-free 3D displays have been reported [1,2], such as volumetric displays [3], multi-view displays [4], holographic displays [5] and light field displays [6–9].
One type of glasses-free 3D display is the multi-view display, which usually employs a 2D display with attached optical devices to direct multi-view images to several discrete viewing regions, providing both binocular parallax and motion parallax for the viewers. In earlier displays, a parallax barrier or lenticular sheet was placed in front of a flat panel display to redirect the sub-images to different views. Parallax-barrier-based displays suffer from low brightness since the front barrier blocks most of the light from the display panel. This problem does not occur when a lenticular sheet replaces the parallax barrier. However, both parallax-barrier and lenticular-sheet-based displays reduce the resolution by a factor of the number of views, and the lower angular resolution also produces a poor depth of field (DOF).
To overcome these drawbacks, various approaches have been reported in which crosstalk is reduced and the view density is increased to provide much smoother parallax. This is implemented by using high-frame-rate spatial light modulators (SLMs) combined with different special optical system designs [9–12]. The basic concept is to take advantage of high-speed SLMs to spatially scan images for time-sequential multiple views. Another way is to redirect multiple 2D displays into different views in parallel, which requires special optical systems. Takaki proposed multi-projector-based 72-view [13] and 128-view [14] 3D displays which utilized a 2D SLM array with a 2D lens array as the image source. A special diffuser and optics were employed to redistribute the 2D-arranged multi-view images in the horizontal direction only. By replacing the 2D SLM array with a lenticular-based 3D display panel as the image source, a super multi-view 3D display with 256 views [15] was implemented with a similar optical system. It has also been reported that in this kind of super multi-view 3D display, multiple views entering one eye can overcome the A/C conflict problem, making viewing more natural and comfortable.
In recent years, with the development of portable projector technology, compact micro-projector arrays have also been applied as image sources to build various types of 3D displays. Micro-projector arrays with different configurations are employed to achieve high-resolution or large-scale horizontal-parallax-only light field 3D displays [17–21]. A multi-layer arrangement of the projectors was utilized to construct a projector array with dense angular pitch [18,19]. However, projectors at different heights produce different incident angles in the vertical direction, which may cause the corresponding narrow strips to deviate from the vertical. In this arrangement, both the horizontal and vertical pitches need to be well optimized based on the optical specification of the directional diffuser. The mechanical design is therefore rather difficult and requires very precise calibration to achieve a good display effect. Other research has mainly focused on improving the optical and electronic devices for 3D displays [22–27].
An LC-based steering screen [26] has been developed to change ray directions over a small angular range. It takes advantage of the LC's birefringence to control the steered ray directions accurately with different driving voltages. Combining this kind of steering screen with a portable projector array, we propose a novel multi-view 3D display in this paper. An experimental 36-view prototype with the LC-based steering screen has been demonstrated, displaying good static and dynamic 3D scenes. The display performance is analyzed and discussed in detail in Section 3. The angular steering screen removes the need for the complicated multi-layered mechanical configurations of previous projector arrays, and this approach can be directly extended to implement hundreds of views by increasing the number of projectors.
2. Configuration and principle of this projection-based multi-view 3D display
In this section, the system configuration of the display is introduced, and the operating principle is described in detail. The function and structure of the angular steering screen, designed specifically to improve the view density of the multi-projection-based 3D display, are presented.
2.1 System configuration
The principal components of the display, which are a projector array, an optical screen and an angular steering screen, are shown in Fig. 1. The projector array consists of multiple pico projectors which are arranged horizontally. All the lens pupils of the projectors are located at the same height as the center of the front screen and the projectors are put in portrait orientation to make the neighboring projectors’ pupils closer to each other. This improves the view density and provides smoother parallax for the viewer.
The optical screen is composed of an optical lens (Fresnel lens) and vertical diffuser. The lens is used to make the rays from each projector converge at an eye position in front of the screen. The vertical diffuser is located close to the lens and is utilized to diffuse rays vertically so that viewers at different heights can observe an image over the complete height of the screen. The specially developed LC steering screen is positioned in front of the optical screen to deflect the ray directions through a small angle that is determined by its driving voltage. If there are NS directions generated by the steering screen, each projector could correspondingly create NS views in time sequence by an active time-multiplexing device, enabling viewers in front of the screen to watch 3D images with smooth motion parallax.
2.2 Display principle and image mapping method
The display principle is described in this subsection. Figure 2 shows the top view of the proposed 3D display system. The rays from the projector array pass through the screen lens, vertical diffuser and angular steering screen in sequence. The screen Fresnel lens images the projector pupils to the view positions, with the vertical diffuser only diffusing light in the vertical direction. The angular steering screen is employed to change the direction of the emergent rays over a small range to create more closely spaced viewpoints in the viewing zone. The steering angle of the steering screen depends on the design and driving voltage of the LC devices. As shown in Fig. 2, a real image of the lens of an arbitrary projector Pi in the array is formed at viewpoint Vi in the viewing zone when the steering screen has no deflection angle. Therefore, an eye located at Vi observes an image from projector Pi over the complete width of the screen.
The distance between the projector and the screen is DP, and the distance between the viewing zone and the screen is DV. They satisfy the lens imaging relationship

1/DP + 1/DV = 1/fs, (1)

where fs is the focal length of the screen lens. Therefore, when the projector pitch is p, the view pitch is pDV/DP and the corresponding angular pitch is θP = arctan(p/DP). When NP projectors are employed in the projector array, the total field of view (FOV) is NPθP and the horizontal viewing range is NPpDV/DP.
Assuming the steering screen has NS steering angles, it can generate NS viewpoints per projector with driving voltages that are synchronized to the projectors. The deflection angle pitch θS is obtained by

θS = θP/NS. (2)
As shown in Fig. 2, when the steering screen deflects the rays by the angle θS, the viewpoint Vi moves to Vi+1, which is equivalent to a virtual projector Pi+1 located adjacent to the actual projector Pi. Similarly, when the steering screen deflects the rays by −θS, the virtual projector Pi-1 is created, corresponding to the viewpoint Vi-1. The system can therefore generate NPNS views in the same horizontal viewing range, with the view pitch decreased from pDV/DP to pDV/(NSDP). The view density is significantly improved by the angular steering screen, which also avoids the complicated multi-layered mechanical configurations of previous projection-based multi-view 3D displays.
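As a numerical sanity check of the imaging and pitch relations above, the short sketch below evaluates them with the prototype parameters quoted in Section 3 (DP = 500 mm, fs = 180 mm, θP = 3°, NP = 12, NS = 3); the function name and structure are ours, not part of the original system software.

```python
import math

def viewing_geometry(D_P, f_s, theta_P_deg, N_P, N_S):
    """Viewing-zone geometry of the steering-screen display."""
    # Lens imaging relation: 1/D_P + 1/D_V = 1/f_s
    D_V = 1.0 / (1.0 / f_s - 1.0 / D_P)
    # Projector pitch that subtends an angular pitch theta_P at distance D_P
    p = D_P * math.tan(math.radians(theta_P_deg))
    view_pitch = p * D_V / D_P          # view pitch without steering
    dense_pitch = view_pitch / N_S      # view pitch with N_S steering states
    viewing_range = N_P * view_pitch    # total horizontal viewing range
    return D_V, dense_pitch, viewing_range

D_V, dense_pitch, viewing_range = viewing_geometry(500.0, 180.0, 3.0, 12, 3)
print(round(D_V, 2), round(dense_pitch, 2), round(viewing_range, 1))
# 281.25 4.91 176.9
```

The computed values agree with the ~281 mm viewing distance, ~4.8 mm view width and ~180 mm viewing range reported for the prototype in Section 3.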
For rendering, we usually capture images at multiple viewing positions and then re-map them to the display plane with further processing. Different system designs of multi-view 3D displays may need different input image formats. In our display, we derive a direct mapping relationship between the 3D scene and the projected images of the projector array, which simplifies the intermediate computation. Since each real or virtual projector corresponds to its own viewing position in the viewing zone, we can consider the scenario with one projector, as shown in Fig. 3. The lens pupil of the projector is located at P(xP, yP, zP), and the screen is positioned at the xoy plane. According to the imaging relations, the x and z values of the corresponding viewpoint V are obtained by

xV = −(DV/DP)xP, zV = DV, (3)

where DP = −zP is the projection distance and DV = fsDP/(DP − fs) is the viewing distance.
With the utilization of the vertical diffuser, the viewpoint is free over a certain range vertically. In Fig. 3, an arbitrary spatial point Q(x, y, z) in the 3D scene emits rays in all directions, but only the ray QV reaches the viewpoint V. This means the point QS on the screen needs to display the information of the ray QV to reconstruct the spatial point Q for this view. Based on the imaging geometry, QS is the intersection of the line QV with the screen plane and can be expressed as

xS = (zVx − zxV)/(zV − z), yS = (zVy − zyV)/(zV − z), zS = 0, (4)

where yV is the nominal vertical position of the viewpoint.
Then the projector P is required to project the ray PQS to display the point QS on the screen and reconstruct the ray QV for this view. Based on the 3D location and pose of projector P, we can obtain the transform matrix Mt, which maps spatial points to the projected image. The corresponding projected image IP for projector P can be computed from the point QS and the transform matrix Mt. It should be noted that Mt is a simplified expression: it mainly comprises two parts, the transformation from the point QS to the projector's coordinate system, and the perspective projection onto the image plane of the projector.
Assuming the optical axis of projector P points towards the center of the screen O, we define this optical axis as the z' axis of the projector's coordinate system, with P as its origin, as shown in Fig. 3. The unit direction vectors of the projector's coordinate system x'y'z' can be expressed as

z' = (O − P)/|O − P| = −P/|P|, x' = (ŷ × z')/|ŷ × z'|, y' = z' × x', (5)

where ŷ = (0, 1, 0) is the vertical unit vector of the world coordinate system.
Therefore, the point QS in the projector's coordinate system can be expressed as

Q'S = T(QS − P), (6)
where T is a 3 × 3 coordinate transformation matrix from the world coordinate system to the projector's coordinate system, whose rows are the unit vectors x', y' and z'.
The next step is to map the point QS from the projector's coordinate system to the image plane of the projector. The resolution of the projector is Mproj × Nproj, its horizontal FOV is 2θx and its vertical FOV is 2θy. According to the geometric perspective relationship, we can map the point QS to the image plane and obtain the corresponding pixel (u, v) in the projected image IP:

u = round((Mproj/2)(1 + x'S/(z'S tanθx))), v = round((Nproj/2)(1 + y'S/(z'S tanθy))), (7)

where (x'S, y'S, z'S) are the coordinates of QS in the projector's coordinate system.
Here, round() returns the integer closest to its argument. Following the above transformation process, we obtain the pixel (u, v) in the projected image IP that reconstructs the ray QV for this view.
Therefore, we could get a direct mapping relationship between the reconstructed 3D scenes and each projector’s image, which benefits the 3D scene rendering process.
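To make the mapping concrete, the sketch below chains the steps of this subsection (viewpoint imaging, screen intersection, look-at transform and perspective mapping) for a single scene point. It is an illustrative implementation of the geometry above, not the authors' rendering code; the function name, the NumPy formulation and the choice of the pupil height as the nominal viewpoint height are our assumptions.

```python
import math
import numpy as np

def project_point(Q, P, f_s, M_proj, N_proj, theta_x, theta_y):
    """Map a scene point Q to a projector pixel (u, v).

    World frame: screen lens in the z = 0 plane, origin O at the screen
    centre, projector pupil P behind the screen (z_P < 0), viewer at z > 0.
    theta_x, theta_y are the half-FOVs of the projector in radians.
    """
    P = np.asarray(P, dtype=float)
    Q = np.asarray(Q, dtype=float)
    D_P = -P[2]
    # Viewpoint V: image of the projector pupil formed by the screen lens;
    # the vertical diffuser leaves the eye height free, so the pupil height
    # is used as the nominal vertical position of the viewpoint (assumption).
    D_V = 1.0 / (1.0 / f_s - 1.0 / D_P)
    V = np.array([-P[0] * D_V / D_P, P[1], D_V])
    # Q_S: intersection of the ray QV with the screen plane z = 0.
    t = Q[2] / (Q[2] - V[2])
    Q_S = Q + t * (V - Q)
    # Projector "look-at" frame: z' axis from P towards the screen centre O.
    z_axis = -P / np.linalg.norm(P)
    x_axis = np.cross([0.0, 1.0, 0.0], z_axis)
    x_axis /= np.linalg.norm(x_axis)
    y_axis = np.cross(z_axis, x_axis)
    T = np.vstack([x_axis, y_axis, z_axis])  # world -> projector rotation
    q = T @ (Q_S - P)                        # Q_S in projector coordinates
    # Perspective mapping into the M_proj x N_proj image.
    u = round(M_proj / 2 * (1 + q[0] / (q[2] * math.tan(theta_x))))
    v = round(N_proj / 2 * (1 + q[1] / (q[2] * math.tan(theta_y))))
    return u, v

# Sanity check: for an on-axis projector 500 mm behind the screen with
# fs = 180 mm, the screen centre should map to the image centre.
u, v = project_point((0.0, 0.0, 0.0), (0.0, 0.0, -500.0), 180.0,
                     640, 360, math.radians(12.8), math.radians(7.3))
print(u, v)  # 320 180
```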
2.3 Steering screen
In our group’s previous device research, we developed an LC-based steering screen whose structure comprised two cascaded layers of twisted nematic (TN) LC cell and birefringent LC micro-prism array (LCMPA) pairs [26]. Broadband light over a large aperture can be controlled with a precise steering angle. The screen functions by rotating the polarization with the LC cells and exploiting the two different refractive indices of the birefringent prism material. As reported in [26], the response time of the in-house TN LC is about 100 ms, so the highest switching frequency of that steering screen is about 10 Hz. For a pi cell LC [28], the response time is usually about 3 ms, corresponding to a highest switching frequency of about 333 Hz, which is much faster than the TN LC. Since the frame rate of commercial projectors is around 60 Hz with time-sequential RGB, the frame rate of each color channel is 180 Hz, still well below the highest frame rate of the pi cell LC; the pi cell LC is therefore suitable for our angular steering screen. We designed the new pi-cell-based angular steering screen with a similar cascaded structure. With time multiplexing, the steering angle of the screen changes with the driving voltage. In our experiments, the frame rate of the angular steering screen is 60 Hz, the same as the frame rate of the projectors. The number of different steering angles is determined by the number of layers of pi cell LC and LCMPA. Assuming the steering angle of each layer is the same, n layers of pi cell LC and LCMPA create 2^n switching states with n + 1 different steering angles in general. Considering the frame rate of the display, the number of steering angles NS cannot be set too high in actual usage; since the commercial projector frame rate is around 60 Hz, we set NS to 3 in our system.
The principle of the angular steering screen is illustrated in Fig. 4, where two pairs of pi cell and LCMPA are utilized to steer the rays passing through the screen. Incident light from the projector array is first converted to linearly polarized light with a linear polarizer. The pi cell is employed to change the polarization state of the incident linearly polarized light. The LCMPA is composed of a micro-prism array substrate and an LC layer. Owing to the birefringence of the LC in the LCMPA, the deflection angle is determined by the polarization state of the incident light, the apex angle of the prism array and the birefringence of the LC material. The pair of identical LCMPAs are antiparallel to create symmetrical steering angles, and there are four output conditions, shown in Figs. 4(a)-4(d), obtained with different driving voltages for the two pi cell LC layers. Since both prism layers have the same facet angle, the deflection angles of conditions (a) and (c) are both 0° and the ray direction is not changed, apart from a small horizontal offset. The absolute value of the deflection angle βb in condition (b) is also equal to the deflection angle βd in condition (d). Therefore, there are three different output states with the deflection angles −β, 0 and β. From Eq. (2), the deflection angle β can be expressed as β = θS = θP/3. Therefore, the angular steering screen enables a projector array with NP projectors to provide 3NP views over the same horizontal viewing range.
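The state counting above (n pi-cell/LCMPA pairs giving 2^n switching states but only n + 1 distinct deflection angles when antiparallel pairs alternate sign) can be sketched as follows; the function name and the on/off encoding are our own illustrative assumptions.

```python
from itertools import product

def steering_angles(n, beta):
    """Enumerate the 2**n switching states of n pi-cell/LCMPA pairs.

    Antiparallel pairs alternate sign, so pair k contributes either 0 or
    (+beta if k is even else -beta); the distinct totals number n + 1.
    """
    angles = set()
    for state in product((0, 1), repeat=n):  # each pi cell on/off
        total = sum(bit * (beta if k % 2 == 0 else -beta)
                    for k, bit in enumerate(state))
        angles.add(total)
    return sorted(angles)

# Two pairs, as in Fig. 4: four states but only three angles,
# matching the -beta, 0, +beta output states described above.
print(steering_angles(2, 1.0))  # [-1.0, 0.0, 1.0]
```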
3. Experiments and results
In our experimental prototype, the horizontally aligned projector array comprises twelve 640 × 360 DLP pico projectors (Philips PPX4350) with a maximum brightness of 50 lumens. The FOV of each projector is 25.6°(H) × 14.6°(V), the aspect ratio is 16:9, the physical size is 97 mm × 54 mm × 17 mm and the projection distance DP is 500 mm. All the pico projectors are mounted in portrait orientation with their lens pupils at the same height as the center of the screen. The angular pitch θP of the projector array is 3°, the screen lens is a 180 mm focal length Fresnel lens and the vertical diffuser has an elliptical output with a 1° FWHM minor axis and a 40° FWHM major axis. The dimensions of the LCMPA limit the effective display region to 130 mm × 130 mm, giving an effective resolution of 360 × 360 for each projector. The viewing distance DV is therefore about 281 mm and the horizontal viewing range is about 180 mm. Since the steering screen has three deflection angles β of −1.0°, 0° and 1.0°, each view width is about 4.8 mm, which is comparable to the diameter of the human eye pupil. The specifications of the prototype are summarized in Table 1.
The system control is illustrated in Fig. 5(a). Two professional multi-display graphics cards (NVIDIA NVS 810) are utilized to drive the 12 pico projectors. Each card has 8 Mini DisplayPort outputs, each supporting up to 4K resolution. Since the pico projectors only accept HDMI input, the Mini DisplayPort outputs are converted to HDMI in actual usage. In our configuration, six video outputs of each graphics card are connected to six pico projectors through HDMI interfaces. With time multiplexing, the steering screen must be precisely synchronized with all projectors, so an FPGA-based synchronization controller is used to control the steering screen. One of the projector-array HDMI cables is routed through the FPGA controller to grab the sync signal from the video output of the graphics card in the PC. This sync signal is then transformed into the digital drive signals required for the LCMPA pi cells. The latency between the projector and the steering screen is measured and controlled precisely by the FPGA controller. While grabbing the sync signal, the FPGA controller also passes the same HDMI signal through to the corresponding pico projector. Specific test patterns are used to determine the relationship between the steering angle and the projection images.
The experimental setup of the prototype is shown in Fig. 5(b). All the rear pico projectors face the effective region of the front steering screen. For coarse manual adjustment, a crosshair is displayed at the center of each projector's image and aligned with the center marker on the screen. Because of the limited accuracy of manual adjustment, all the pico projectors are then calibrated to obtain their 3D locations and poses following the traditional calibration method [29]. These values are input to the scene rendering software. An FPGA board synchronizes the projector array and the steering screen, and the refresh rate for each view is 20 Hz due to the 3× time multiplexing and the 60 Hz operation of the steering screen.
By using the GPUs of the graphics cards, the images for all 12 pico projectors can be rendered in one frame. Angular steering enables 3 frames to render the complete 36 views using only 12 projectors, so with a capable GPU the system can display dynamic 3D scenes. We designed two 3D scenes to demonstrate the rendering technique: a simple scene with a cube and a volleyball, and a scene with three cartoon toys. Photos taken from different views are shown in Fig. 6: Fig. 6(a) shows the displayed cube and volleyball photographed from left to right, and Fig. 6(b) shows the three displayed toys photographed from left to right. The displayed 3D scenes exhibit a good occlusion effect, and models at different depths can be readily distinguished. The proposed 3D display also provides smooth parallax when viewers move their heads across the viewing zones.
We also designed a dynamic 3D model of a running girl, as shown in Fig. 7(a). The running girl is rendered on the PC and displayed in our system. The video (see Visualization 1) was captured from left to right and then right to left: the camera first moved from the left side to the right side and out of the viewing range, where the frames are dark, and then moved back to the left into the viewing range. The recorded video presents a good 3D effect with smooth parallax. Figures 7(b) and 7(c) are screenshots from the recorded video.
Crosstalk [30] is a key performance factor in multi-view 3D display evaluation and was measured with the setup shown in Fig. 8(a). An industrial CCD camera located in the viewing zone is utilized to simulate the human eye. The camera is mounted on a two-dimensional stage that can move in both horizontal and vertical directions; it is located at the same height as the center of the screen and only moved horizontally at the viewing distance. Figure 8(b) illustrates the measured normalized crosstalk of the 36 views. Curves with the same color are the intensities measured at different horizontal positions for the same projector, and the dashed, solid and dash-dot lines correspond to the three states of the angular steering screen, respectively. For an N-view multi-view 3D display, the image at the ith view position comes mostly from the correct view, with small contributions from the other N − 1 views. We therefore use the ratio of the total intensity from the N − 1 incorrect views to that from the correct view to represent the crosstalk. The crosstalk Ci for the ith view position can be expressed as

Ci = (Σj≠i Bj)/Bi × 100%, (8)
where Bj is the intensity observed at the ith view position when only the jth view image is displayed. From our measured results, the average crosstalk is about 20%. More specifically, the crosstalk between neighboring views generated by the steering screen from the same projector is slightly larger than that between neighboring views from adjacent projectors. This is because the measured steering angle of the steering screen is 0.95°, slightly smaller than the designed 1° steering angle. For this type of multi-view 3D display with high angular resolution, a certain overlap (crosstalk) between neighboring views is required to avoid black gaps between adjacent views and to obtain much smoother motion parallax when the user's eyes move horizontally in the viewing range. When the angular resolution of the 3D display is high, the difference between neighboring views is very small, which also reduces the visible effect of crosstalk. In the future, we plan to improve the steering screen and develop image processing algorithms to further reduce the crosstalk and achieve better 3D display performance.
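The crosstalk definition above reduces to a simple matrix computation. The sketch below uses a hypothetical 3 × 3 intensity matrix B (not measured data) in which each view leaks 10% into its neighbours, and evaluates the ratio as a fraction rather than a percentage.

```python
import numpy as np

def crosstalk(B):
    """Per-view crosstalk from an N x N intensity matrix.

    B[i, j] is the intensity observed at view position i while only the
    j-th view image is displayed; C_i is the sum of the N-1 wrong-view
    leaks divided by the correct-view intensity (a fraction, not %).
    """
    B = np.asarray(B, dtype=float)
    correct = np.diag(B)              # intensity from the correct view
    leak = B.sum(axis=1) - correct    # total intensity from wrong views
    return leak / correct

# Hypothetical measurement: 10% leak into each neighbouring view.
B = np.array([[1.0, 0.1, 0.0],
              [0.1, 1.0, 0.1],
              [0.0, 0.1, 1.0]])
print(crosstalk(B))  # ~ [0.1, 0.2, 0.1]
```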
4. Conclusion
This paper presents a novel multi-view 3D display that combines a pico projector array with an angular steering screen operated by time multiplexing. The angular steering screen provides a high viewing-zone density, significantly improves the angular resolution and simplifies the hardware relative to multi-layer projector arrays. The 36-view prototype provides smooth horizontal parallax and good 3D display performance. In future work, the crosstalk will be reduced and the contrast increased by improving the performance of the steering screen. To enhance the visual experience, a greater number of projectors with higher resolution will be used to improve both the spatial and angular resolution of the 3D display.
Funding
National Research Foundation of Singapore (NRF-CRP11-2012-01).
References and links
2. N. S. Holliman, N. A. Dodgson, G. E. Favalora, and L. Pockett, “Three-dimensional displays: A review and applications analysis,” IEEE Trans. Broadcast 57(2), 362–371 (2011). [CrossRef]
3. G. E. Favalora, “Volumetric 3D displays and application infrastructure,” Computer 38(8), 37–44 (2005). [CrossRef]
4. N. A. Dodgson, “Autostereoscopic 3D displays,” Computer 38(8), 31–36 (2005). [CrossRef]
5. C. Slinger, C. Cameron, and M. Stanley, “Computer-Generated Holography as a Generic Display Technology,” Computer 38(8), 46–53 (2005). [CrossRef]
6. A. Jones, I. McDowall, H. Yamada, M. Bolas, and P. Debevec, “Rendering for an interactive 360° light field display,” ACM Trans. Graph. 26(3), 40 (2007). [CrossRef]
8. T. Balogh and P. T. Kovács, “Real-time 3D light field transmission,” Proc. SPIE 7724, 772406 (2010). [CrossRef]
9. D. Lanman, G. Wetzstein, M. Hirsch, W. Heidrich, and R. Raskar, “Polarization Fields: Dynamic Light Field Display using Multi-Layer LCDs,” ACM Trans. Graph. 30(6), 186 (2011). [CrossRef]
10. O. S. Cossairt, J. Napoli, S. L. Hill, R. K. Dorval, and G. E. Favalora, “Occlusion-capable multiview volumetric three-dimensional display,” Appl. Opt. 46(8), 1244–1250 (2007). [CrossRef] [PubMed]
11. T. Yendo, T. Fujii, M. Tanimoto, and M. P. Tehrani, “The Seelinder: Cylindrical 3D display viewable from 360 degrees,” J. Vis. Commun. Image. 21(5–6), 586–594 (2010). [CrossRef]
12. X. Xia, Z. Zheng, X. Liu, H. Li, and C. Yan, “Omnidirectional-view three-dimensional display system based on cylindrical selective-diffusing screen,” Appl. Opt. 49(26), 4915–4920 (2010). [CrossRef] [PubMed]
13. Y. Takaki, “High-Density Directional Display for Generating Natural Three-Dimensional Images,” Proc. IEEE 94(3), 654–663 (2006). [CrossRef]
14. H. Nakanuma, H. Kamei, and Y. Takaki, “Natural 3D display with 128 directional images used for human-engineering evaluation,” Proc. SPIE 5664, 28–35 (2005). [CrossRef]
16. B.-R. Lee, J.-Y. Son, H. Lee, S. Yano, H. K. Son, and I. Jeong, “A simulator for a light field display,” Proc. SPIE 10219, 102190D (2017).
17. K. Nagano, A. Jones, J. Liu, J. Busch, X. Yu, M. Bolas, and P. Debevec, “An autostereoscopic projector array optimized for 3D facial display,” ACM SIGGRAPH 2013 Emerging Technologies, July 21–25, 2013, Anaheim, California. [CrossRef]
18. Y. Peng, H. Li, Q. Zhong, X. Xia, and X. Liu, “Large-sized light field three-dimensional display using multi-projectors and directional diffuser,” Opt. Eng. 52(1), 017402 (2013). [CrossRef]
19. J.-H. Lee, J. Park, D. Nam, S. Y. Choi, D.-S. Park, and C. Y. Kim, “Optimal projector configuration design for 300-Mpixel multi-projection 3D display,” Opt. Express 21(22), 26820–26835 (2013). [CrossRef] [PubMed]
20. W. Song, Q. Zhu, Y. Liu, and Y. Wang, “Omnidirectional-view three-dimensional display based on rotating selective-diffusing screen and multiple mini-projectors,” Appl. Opt. 54(13), 4154–4160 (2015). [CrossRef]
21. D. Fattal, Z. Peng, T. Tran, S. Vo, M. Fiorentino, J. Brug, and R. G. Beausoleil, “A multi-directional backlight for a wide-angle, glasses-free three-dimensional display,” Nature 495(7441), 348–351 (2013). [CrossRef] [PubMed]
22. J. Geng, “Design of a single projector multiview 3D display system,” Proc. SPIE 8979, 89790K (2014). [CrossRef]
23. H. Chen, Y. Weng, D. Xu, N. V. Tabiryan, and S. T. Wu, “Beam steering for virtual/augmented reality displays with a cycloidal diffractive waveplate,” Opt. Express 24(7), 7287–7298 (2016). [CrossRef] [PubMed]
25. W. Song, X. Liu, P. Lu, Y. Huang, D. Weng, Y. Zheng, Y. Liu, and Y. Wang, “Design and assessment of a 360° panoramic and high-performance capture system with two tiled catadioptric imaging channels,” Appl. Opt. 57(13), 3429–3437 (2018).
26. H. Wang, O. Yaroshchuk, X. Zhang, Z. Zhuang, P. Surman, X. W. Sun, and Y. Zheng, “Large-aperture transparent beam steering screen based on LCMPA,” Appl. Opt. 55(28), 7824–7829 (2016). [CrossRef]
27. X. Zhang, W. Song, H. Wang, Z. Zhuang, P. Surman, X. W. Sun, and Y. Zheng, “A spatio-temporal multiplexing multi-view display using a lenticular lens and a beam steering screen,” Opt. Commun. 420, 168–173 (2018).
28. P. J. Bos and K. R. Koehler-Beran, “The Pi-Cell: A fast new liquid crystal switching device,” Mol. Cryst. Liq. Cryst. 113, 329–339 (1984).
29. G. Falcao, N. Hurtos, J. Massich, and D. Fofi, “Projector-camera calibration toolbox,” Technical Report, 2009. Available at http://code.google.com/p/procamcalib.
30. X. F. Li, Q. H. Wang, D. H. Li, and A. H. Wang, “Image processing to eliminate crosstalk between neighboring view images in three-dimensional lenticular display,” IEEE J. Display Technol. 7(8), 443–447 (2011).