Abstract
In integral imaging, the quality of a reconstructed image degrades with increasing viewing angle due to the wavefront aberrations introduced by the lens-array. A wavefront aberration correction method is proposed to enhance the image quality with a pre-filtering function array (PFA). To derive the PFA for an integral imaging display, the wavefront aberration characteristic of the lens-array is analyzed and the intensity distribution of the reconstructed image is calculated based on the wave optics theory. The minimum mean square error method is applied to manipulate the elemental image array (EIA) with a PFA. The validity of the proposed method is confirmed through simulations as well as optical experiments. A 45-degree viewing angle integral imaging display with enhanced image quality is achieved.
© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement
1. Introduction
Three-dimensional (3D) imaging techniques, based on parallax barriers, lenticular lenses, integral imaging or multi-projectors, have attracted great interest in research and applications [1–15]. Integral imaging has drawn much attention owing to its ability to reconstruct continuously viewable, colorful 3D images with full parallax and all depth cues [5–7]. In a conventional integral imaging system, the light field of a 3D object is captured by a lens-array and an electronic image sensor, thus creating an elemental image array (EIA). To optically reconstruct the 3D object, a second lens-array is employed to project the captured EIA. Various contributions have been made to improve the performance of integral imaging in many aspects, including depth range, resolution and viewing angle [8–15]. However, most demonstrated integral imaging displays were analyzed based on paraxial optics relations, without consideration of the lens-array's aberrations, which degrade the quality of reconstructed 3D scenes. In addition, these displays suffer from a more serious aberration effect with increasing viewing angle, because it is difficult to correct aberrations with a single lens. To resolve this problem, an integral imaging display with three microlens arrays was proposed to decrease the aberrations, and the lens-arrays were optimized with geometrical optics methods and the OSLO optical design software [16]. However, that system provides a limited viewing angle, and the microlens arrays require a complex assembling process (especially the control of lens alignment). Zhang et al. designed a monocentric lens-array (MoLA) with fiber bundles to eliminate most of the off-axis aberrations in integral imaging [17]. Still, because the manufacturability of such a MoLA remains a challenge, this kind of display is difficult to implement.
To our knowledge, previous studies either modified the lens-arrays or changed the system configurations to reduce the aberration effect. To reconstruct a clear 3D image over a wide viewing angle, a more complicated lens-array is required, thus making the system harder to manufacture and assemble.
Here, a pre-filtering function array (PFA) is proposed to reduce the aberration effect in the optical reconstruction process of integral imaging. Eighty-one field sections are chosen to express the wavefront aberration of the lens-array in the Zernike polynomial mode. The light intensity distribution of the reconstructed image is analyzed based on the Fresnel diffraction theory. The captured EIA is pre-corrected with the PFA to counter the effect of the residual wavefront aberrations introduced by the lens-array. The use of the PFA does not increase the complexity of the display. With the proposed wavefront aberration correction approach, a full-parallax 3D scene with high image quality is perceived over a wide viewing angle.
2. Proposed wavefront aberration correction method
As shown in Fig. 1, the reconstruction process of an enhanced 3D image based on the proposed method can be separated into three steps: the capturing stage, the pre-filtering stage and the optical reconstruction stage. In the first step, the light field of the 3D object is captured in the form of a 2D EIA through a camera array. In the second step, the EIA is pre-corrected with a PFA, which creates a pre-filtering elemental image array (PEIA). During the pre-filtering stage, the field of each elemental image (EI) is divided into 81 sections based on the representative field positions to calculate the aberrations of the lens. With the wavefront aberrations of each field section, the pre-filtering function (PF) is derived from optical analysis. Then, a pre-filtering elemental image (PEI) is generated by convolving the EI with the PF. In the third step, the PEIA is loaded into an optical integral imaging display and a 3D image with high quality is reconstructed.
2.1 Wavefront aberration characteristic of the lens-array
In an integral imaging display, an aberration-free EIA can be obtained by the computer-generated technique [18,19], in which a virtual camera array picks up the EIA of a 3D object. An aberration-free EIA is required in our proposed method so that the quality of the reconstructed 3D scene is not degraded by blurred captured images. When presented on a display panel, each aberration-free EI is imaged by its corresponding lens for optical reconstruction. An ideal lens allows all light rays originating from a pixel, which can be regarded as an object point, to meet again at exactly one point on the reconstructed image. In practice, however, the light rays emanating from one point of the EI are refracted by an imperfect lens and distributed over an area around the ideal reconstructed point due to the wavefront aberrations of the lens. Consequently, it is essential to analyze the wavefront aberration characteristic of the lens-array in integral imaging. Figure 2(a) shows the optical reconstruction of a volume pixel in the integral imaging display. The lens-array is assumed to be composed of (2M + 1) × (2N + 1) lenses. The spherical wavefronts (red lines) produced by a set of object points on the EIA turn into aberrated wavefronts after being refracted by the corresponding lenses. Figures 2(b) and 2(c) show the ideal wavefront map and the actual wavefront map, respectively. The deviation of the actual wavefront (red line) from the desired ideal wavefront (black dotted line) is called the wavefront aberration, whose curve is plotted in Fig. 2(d). Generally, the wavefront aberrations increase as the field angle becomes greater. When a set of object points is imaged with imperfect lenses, the reconstructed volume pixel turns into a blurred spot. To calculate the wavefront aberrations of the lens-array, we denote the central lens and the central EI as the 0th lens L0,0 and the 0th EI E0,0, respectively.
A coordinate system O-xyz, in which the z axis coincides with the optical axis of L0,0, is set up, and the origin of the coordinate system is further assumed to be at the center of L0,0. Plane coordinate systems xl-yl and xe-ye are used to describe the lens-array plane and the EIA plane, respectively. Since the lenses in the lens-array are identical, L0,0 is taken as an example to express the wavefront aberrations of one lens mathematically. The wavefront aberrations of L0,0 are given by

W(x0, y0; x, y) = W200(x0² + y0²) + W111(x0x + y0y) + W020(x² + y²) + W040(x² + y²)² + W131(x0x + y0y)(x² + y²) + W222(x0x + y0y)² + W220(x0² + y0²)(x² + y²) + W311(x0² + y0²)(x0x + y0y),   (1)
where (x, y) denotes the pupil coordinate of the lens, and x0 and y0, which are referred to as object field coordinates, represent the lateral and vertical coordinates of the object point on E0,0, respectively. W200, W111, W020, W040, W131, W222, W220 and W311 are the coefficients of the third-order aberrations: W200, W111 and W020 denote the defocus aberrations, W040 represents spherical aberration, W131 represents coma, the sum of the W222 and W220 terms represents astigmatism and curvature of field, and W311 denotes distortion. According to Eq. (1), W is parameterized by the object field coordinates, which means that the wavefront aberrations vary over the object field. To characterize these spatially varying aberrations of the lens, the field of E0,0 is divided into a number of sections so that the aberrations within each section can be treated as shift invariant [20]. Here, nine representative field positions are sampled along each of the x0 and y0 directions. Accordingly, the field of E0,0 is divided into 81 equal sections, denoted by Sj (j = 1, 2, …, 81), as shown in Fig. 3. Since the change of the aberrations within each section is small, the aberrations are regarded as shift-invariant, and they are calculated at the corresponding representative field position; for example, the wavefront aberration in section Sj is calculated at the representative position (xj, yj). Under this assumption, the mapping relation between the field positions and the section Sj is given as:
(x0, y0) ∈ Sj,  j = 9⌊9(x0 + h)/pe⌋ + ⌊9(y0 + h)/pe⌋ + 1,   (2)

where h is the maximum field, pe represents the length of an EI, and Sj denotes the field section domain. After the two object coordinate variables (x0 and y0) are transformed into the fixed representative positions, the Zernike polynomial mode is used to express the wavefront aberrations of E0,0:

Wj(x, y) = Σ(i=1 to 9) ci^j Zi(x, y),   (3)

where j represents the index of the field sections S1~S81. c1^j, c2^j, …, c9^j denote the Zernike polynomial coefficients in field section Sj, and the items correspond to the coefficients of the constant term, y-tilt, x-tilt, defocus, y-astigmatism, x-astigmatism, y-coma, x-coma and spherical aberration, respectively. Here, the Zernike polynomial coefficients contain the object field information and can be measured by wavefront sensing techniques [21]. (x0, y0) denotes the object field coordinates of a random pixel in E0,0. Each field section has its own set of Zernike polynomial coefficients, which means ci^j varies with the field section. Further, the wavefront aberrations of an arbitrary lens in the lens-array can be expressed as
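As a concrete sketch, the per-section wavefront of Eq. (3) can be evaluated numerically. The Cartesian Zernike terms below follow the ordering listed above (constant, y-tilt, x-tilt, defocus, y-astigmatism, x-astigmatism, y-coma, x-coma, spherical aberration); the exact normalization convention is an assumption, since it is not specified here.

```python
import numpy as np

# Cartesian Zernike terms in the order used in Eq. (3).
# (x, y) are pupil coordinates normalized to the lens semi-aperture.
ZERNIKE_TERMS = [
    lambda x, y: np.ones_like(x),              # c1: constant (piston)
    lambda x, y: y,                            # c2: y-tilt
    lambda x, y: x,                            # c3: x-tilt
    lambda x, y: 2 * (x**2 + y**2) - 1,        # c4: defocus
    lambda x, y: 2 * x * y,                    # c5: y-astigmatism
    lambda x, y: x**2 - y**2,                  # c6: x-astigmatism
    lambda x, y: (3 * (x**2 + y**2) - 2) * y,  # c7: y-coma
    lambda x, y: (3 * (x**2 + y**2) - 2) * x,  # c8: x-coma
    lambda x, y: 6 * (x**2 + y**2)**2 - 6 * (x**2 + y**2) + 1,  # c9: spherical
]

def wavefront(coeffs, x, y):
    """Evaluate W_j(x, y) = sum_i c_i^j Z_i(x, y) for one field section."""
    return sum(c * z(x, y) for c, z in zip(coeffs, ZERNIKE_TERMS))
```

One coefficient vector per field section Sj reproduces the shift-invariant aberration model of that section; measured Zernike coefficients (e.g. from wavefront sensing [21]) would simply be substituted for `coeffs`.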
Wm,n(x0, y0; x, y) = W(x0 − mp, y0 − np; x, y),   (4)

where m and n (−M ≤ m ≤ M, −N ≤ n ≤ N) are the indexes of the lens in the lens-array and p represents the lens pitch.

2.2 Light intensity distribution of the reconstructed image
After the wavefront aberration expression of the lens-array is obtained, the light intensity distributions of an image point and a reconstructed image in the integral imaging display can be analyzed.
Figure 4(a) demonstrates the integral imaging display, which includes a liquid crystal display (LCD), a lens-array and a holographic functional screen (HFS). The LCD is used to display the EIA, and the lens-array projects the EIA onto the HFS. The HFS is a reception screen that diffuses the incident light within a solid diffusing angle, and the light distribution from the lens-array is re-modulated by the HFS to guarantee a uniform 3D image [22,23]. Plane xe-ye, plane xl-yl and plane xh-yh represent the LCD, the lens-array and the HFS, respectively. Assume that a single pixel E (orange spot in Fig. 4(a)) is situated at (xe0, ye0) on the LCD (green rectangle in Fig. 4(a)). It can be regarded as a point source and expressed as the Dirac delta function δ(xe − xe0, ye − ye0). The spherical waves emanating from point E are refracted by the corresponding lens Lm,n (red circle in Fig. 4(a)) and then emerge as aberrated waves converging to the image point E′. During the wave propagation process, Lm,n introduces a phase delay and wavefront aberrations. Based on the Fresnel diffraction theory, the intensity distribution of the image point E′ on the HFS can therefore be given by:

I(xh, yh) ∝ |∬Σe ∬Σl δ(xe − xe0, ye − ye0) exp{(ik/2g)[(xl − xe)² + (yl − ye)²]} exp[−(ik/2f)(xl² + yl²)] P(xl, yl) exp{(ik/2l)[(xh − xl)² + (yh − yl)²]} dxedye dxldyl|²,   (5)
where Σe and Σl are the integration domains, which are limited by the size of the EI and the size of the lens, respectively. g denotes the distance between the LCD and the lens-array, l represents the distance between the lens-array and the HFS, and f is the focal length of the lens. The values of g, l and f are related by the lens law 1/g + 1/l = 1/f. k = 2π/λ is the wave number and λ is the wavelength. P(xl, yl) represents the pupil function of the lens. Characterized by its wavefront aberrations W, P is given as

P(xl, yl) = circ(2√(xl² + yl²)/d) exp[ikW(xe0, ye0; xl, yl)],   (6)

where d denotes the pitch of the lens, whose pupil is limited to a circular area due to its round shape. Since it is the intensity that will be perceived by the viewer, the residual phase terms in Eq. (5) can be omitted. Thus, Eq. (5) simplifies to

h(xh, yh) = |∬Σl P(xl, yl) exp{−(ik/l)[(xh − βxe0)xl + (yh − βye0)yl]} dxldyl|²,   (7)

where β denotes the magnification of the system and is given by β = −l/g. Here, Eq. (7) implies that the intensity distribution of the image point is affected by the pupil function P, which is determined by the wavefront aberrations W of the lens. Therefore, the calculation of h is related to W. If W = 0, the image point is an Airy disk. If W ≠ 0, the image point becomes a blurred spot. Figures 4(b) and 4(c) plot the intensity distribution and the RMS spot radius of spot E′. As illustrated in Fig. 4(b), the intensity distribution of spot E′ is much broader than that of an ideal spot. Figure 4(c) also shows that the RMS spot radius of E′ is larger than that of the ideal spot. The intensity distribution of an EI on the HFS can be represented through a convolution operation:

Im,n(xh, yh) = Ie_m,n(xh/β, yh/β) ⊗ hm,n(xh, yh),   (8)
where Ie_m,n represents the intensity distribution of the EI and Im,n denotes the intensity distribution of the corresponding image on the HFS. The symbol ⊗ denotes convolution. Further, the intensity distribution of the resulting 3D image can be obtained as

I(xh, yh) = Σ(m=−M to M) Σ(n=−N to N) Im,n(xh, yh).   (9)

According to Eqs. (7) and (9), the quality of the reconstructed 3D image is determined by the wavefront aberrations of the lens-array.

2.3 EIA pre-filtered with PFA
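The chain from pupil function to blurred image can be prototyped with a discrete Fourier transform. The following sketch (not the authors' code) builds an aberrated circular pupil, takes |FT{P}|² as the incoherent point spread function in the spirit of Eqs. (6) and (7), and blurs an elemental image with it per Eq. (8); the grid size and aberration values are illustrative assumptions.

```python
import numpy as np

def psf_from_pupil(w_defocus=0.0, w_spherical=0.0, n=128):
    """Incoherent PSF h = |FT{P}|^2 for a circular pupil carrying
    defocus and spherical aberration, both given in waves."""
    x = np.linspace(-1.0, 1.0, n)
    X, Y = np.meshgrid(x, x)
    r2 = X**2 + Y**2
    W = w_defocus * r2 + w_spherical * r2**2          # wavefront in waves
    P = (r2 <= 1.0) * np.exp(1j * 2.0 * np.pi * W)    # pupil function, cf. Eq. (6)
    h = np.abs(np.fft.fftshift(np.fft.fft2(P)))**2
    return h / h.sum()                                # normalize total energy

def blur_elemental_image(ei, h):
    """Image of one EI as the convolution of Eq. (8): linear convolution
    via zero-padded FFTs, cropped back to the EI size ('same' crop)."""
    s = (ei.shape[0] + h.shape[0] - 1, ei.shape[1] + h.shape[1] - 1)
    full = np.real(np.fft.ifft2(np.fft.fft2(ei, s=s) * np.fft.fft2(h, s=s)))
    r0, c0 = (h.shape[0] - 1) // 2, (h.shape[1] - 1) // 2
    return full[r0:r0 + ei.shape[0], c0:c0 + ei.shape[1]]
```

With the aberration coefficients set to zero the PSF approaches an Airy pattern; adding a few waves of defocus visibly lowers its peak and spreads the blur, mirroring the comparison in Figs. 4(b) and 4(c).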
This subsection aims to reduce the aberration effect in integral imaging by pre-filtering the EIA. In the optical reconstruction, the process in which the EIA is imaged by the lens-array is a convolution calculation between the array of EIs and the array of point spread functions hm,n. During this process, wavefront aberrations are introduced by the lens-array. Thus, the aberration effect can be reduced by convolving every EI with the corresponding PF, which is defined as the inverse function of hm,n. The EIA is manipulated with the PFA and then imaged by the lens-array, which is given by:

I(xh, yh) = Σ(m=−M to M) Σ(n=−N to N) Ip_m,n(xh/β, yh/β) ⊗ hm,n(xh, yh).   (10)
In Eq. (10), Ip_m,n is the intensity of a PEI. With the minimum mean square error method [24], the intensity of the PEIA is computed as

FT{Ip_m,n} = [FT{hm,n}* / (|FT{hm,n}|² + K)] FT{Ie_m,n},   (11)

where FT denotes the Fourier transform of a variable, the asterisk denotes the complex conjugate, and K is a constant which is related to the estimation noise. Therefore, through the inverse filtering operation between the EIA and the PFA, the EIA is pre-corrected in a way that is opposite to the effect of the residual wavefront aberrations introduced by the lens-array.

3. Simulation and experiment
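A minimal sketch of the Eq. (11) pre-filtering step, assuming the PSF h is sampled on the same grid as the EI with its peak at the grid centre; the noise constant K is a tuning parameter, as in the text.

```python
import numpy as np

def pre_filter(ei, h, K=1e-3):
    """Minimum-mean-square-error (Wiener) pre-filtering of one EI, Eq. (11):
    FT{PEI} = conj(FT{h}) / (|FT{h}|^2 + K) * FT{EI}."""
    H = np.fft.fft2(np.fft.ifftshift(h))   # PSF re-centred at index [0, 0]
    G = np.conj(H) / (np.abs(H)**2 + K)    # pre-filtering function (PF)
    return np.real(np.fft.ifft2(np.fft.fft2(ei) * G))
```

Applying `pre_filter` to every EI yields the PEIA; after the optical blur h acts on it, the net transfer function |H|²/(|H|² + K) is close to unity wherever the lens passes signal, so the displayed image approaches the aberration-free EI while the regularizer K prevents noise amplification at frequencies the lens suppresses.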
In order to confirm the validity of the proposed method, the corresponding simulation and optical experiment are implemented.
The parameters of the lens unit, which consists of two lenses, in the employed lens-array are listed in Table 1. Because the lens unit is rotationally symmetric, its spot diagrams in 25 field sections (the same field sections as in Fig. 3(b)) are shown in Fig. 5. We can see that serious wavefront aberrations of the lens unit occur. The Zernike coefficients of field sections S1~S25 are given in Table 2. With these Zernike coefficients, the PFA is calculated and the pre-corrected image is created according to Eq. (11). Figure 6 shows the wavefront aberrations without correction and with correction at different field positions. After the correction, the wavefront aberrations over the whole field zone are decreased. Notably, the wavefront aberration in the marginal field (22.5 degrees) is reduced from 24.6 waves to 11.6 waves, which makes it possible to implement an integral imaging display with a large viewing angle.
The display process of an EI is simulated to confirm the validity of the proposed method. The original EIs are shown in Fig. 7(a), and Figs. 7(b) and 7(c) show the displayed effect without correction and with correction, respectively. Comparing Fig. 7(a) with Fig. 7(b), the image quality is degraded due to the wavefront aberrations of the lens unit. From Figs. 7(b) and 7(c), we can see that the image quality is noticeably improved because of the decreased wavefront aberrations. The image quality measured in PSNR for the images without and with correction is shown in Fig. 7 (bottom). The PSNR values of the enhanced images are increased (first row: from 31.3 dB to 34.1 dB; second row: from 29.7 dB to 32.3 dB), which indicates that the aberration effect is reduced with the proposed method.
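The PSNR figures quoted above follow the standard definition; a small helper (illustrative, not the authors' evaluation script) makes the computation explicit.

```python
import numpy as np

def psnr(reference, test, peak=255.0):
    """Peak signal-to-noise ratio in dB for 8-bit images (peak = 255)."""
    ref = np.asarray(reference, dtype=float)
    mse = np.mean((ref - np.asarray(test, dtype=float))**2)
    if mse == 0.0:
        return float("inf")                 # identical images
    return 10.0 * np.log10(peak**2 / mse)
```

Since PSNR is logarithmic in the mean squared error, a gain of about 3 dB, as between Figs. 7(b) and 7(c), corresponds to roughly halving the MSE of the displayed image.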
The optical integral imaging system for the reconstructed 3D image is experimentally demonstrated. The prototype of the display system, consisting of an LCD, a lens-array and an HFS, is shown in Fig. 8. The lens-array is mounted 17.39 mm away from the LCD, and the HFS is located at a distance of 200 mm in front of the lens-array. The EIA on the 23.6-inch LCD with a resolution of 3840 × 2160 is captured by the virtual cameras in the Blender software. Each EI is composed of 88 × 88 pixels in a matrix format from 88 × 88 viewpoint perspectives. The different perspectives are integrated into a 3D image by the 46 × 23 lens-array assisted by the HFS. Figure 9(a) shows the EIA captured by the virtual camera array, and Fig. 9(b) is the PEIA which is pre-corrected by the PFA.
Figure 10 shows the reconstructed 3D images at different viewing angles. The top row shows the reconstructed 3D image without correction, and the bottom row shows the enhanced 3D image obtained by displaying the PEIA. Two accompanying movies show, frame by frame, the sets of images captured in each viewing direction. We can see that, compared with the top images, the bottom images offer more detailed information about the 3D scene and present clearer reconstructed images within the 45-degree viewing angle. A clear and full-parallax 3D image with a 30-cm displayed depth and 7744 viewpoints is perceived.
4. Conclusion
In summary, a wavefront aberration correction method for integral imaging is presented. The method analyzes the wavefront aberration characteristic of the lens-array and the intensity distribution of the reconstructed 3D scene so as to obtain the PFA, which is used to pre-correct the EIA before it is displayed on the integral imaging optical display. Simulations demonstrate that the RMS wavefront aberration is reduced and the image quality measured in PSNR is increased by more than 2 dB. With our approach, the integral imaging experimental system presents full-parallax 3D images with high image quality within a 45-degree viewing angle.
Funding
The National Key Research and Development Program (2017YFB1002900); National Natural Science Foundation of China (61575025); Fund of the State Key Laboratory of Information Photonics and Optical Communications (IPOC2017ZZ02); Fundamental Research Funds for the Central Universities (2018PTB-00-01).
References
1. J. Y. Luo, Q. H. Wang, W. X. Zhao, and D. H. Li, “Autostereoscopic three-dimensional display based on two parallax barriers,” Appl. Opt. 50(18), 2911–2915 (2011). [CrossRef] [PubMed]
2. W. X. Zhao, Q. H. Wang, A. H. Wang, and D. H. Li, “Autostereoscopic display based on two-layer lenticular lenses,” Opt. Lett. 35(24), 4127–4129 (2010). [CrossRef] [PubMed]
3. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications [Invited],” Appl. Opt. 52(4), 546–560 (2013). [CrossRef] [PubMed]
4. D. Fattal, Z. Peng, T. Tran, S. Vo, M. Fiorentino, J. Brug, and R. G. Beausoleil, “A multi-directional backlight for a wide-angle, glasses-free three-dimensional display,” Nature 495(7441), 348–351 (2013). [CrossRef] [PubMed]
5. G. Lippmann, “La photographie integrale,” C. R. Acad. Sci. 146, 446–451 (1908).
6. H. E. Ives, “Optical properties of a Lippmann lenticulated sheet,” J. Opt. Soc. Am. 21(3), 171–176 (1931). [CrossRef]
7. J. Geng, “Three-dimensional display technologies,” Adv. Opt. Photonics 5(4), 456–535 (2013). [CrossRef] [PubMed]
8. A. Stern and B. Javidi, “Three-dimensional image sensing, visualization, and processing using integral imaging,” Proc. IEEE 94(3), 591–607 (2006). [CrossRef]
9. Á. Tolosa, R. Martinez-Cuenca, H. Navarro, G. Saavedra, M. Martínez-Corral, B. Javidi, and A. Pons, “Enhanced field-of-view integral imaging display using multi-Köhler illumination,” Opt. Express 22(26), 31853–31863 (2014). [CrossRef] [PubMed]
10. C. W. Chen, M. Cho, Y. P. Huang, and B. Javidi, “Improved viewing zones for projection type integral imaging 3D display using adaptive liquid crystal prism array,” J. Disp. Technol. 10(3), 198–203 (2014). [CrossRef]
11. J. Y. Jang, D. Shin, and E. S. Kim, “Optical three-dimensional refocusing from elemental images based on a sifting property of the periodic δ-function array in integral-imaging,” Opt. Express 22(2), 1533–1550 (2014). [CrossRef] [PubMed]
12. J. S. Jang and B. Javidi, “Improved viewing resolution of three-dimensional integral imaging by use of nonstationary micro-optics,” Opt. Lett. 27(5), 324–326 (2002). [CrossRef] [PubMed]
13. X. B. Dong, L. Y. Ai, and E. S. Kim, “Integral imaging-based large-scale full-color 3-D display of holographic data by using a commercial LCD panel,” Opt. Express 24(4), 3638–3651 (2016). [CrossRef] [PubMed]
14. H. H. Kang, J. H. Lee, and E. S. Kim, “Enhanced compression rate of integral images by using motion-compensated residual images in three-dimensional integral-imaging,” Opt. Express 20(5), 5440–5459 (2012). [CrossRef] [PubMed]
15. Y. Kim, J. Kim, J. M. Kang, J. H. Jung, H. Choi, and B. Lee, “Point light source integral imaging with improved resolution and viewing angle by the use of electrically movable pinhole array,” Opt. Express 15(26), 18253–18267 (2007). [CrossRef] [PubMed]
16. A. Karimzadeh, “Integral imaging system optical design with aberration consideration,” Appl. Opt. 54(7), 1765–1769 (2015). [CrossRef]
17. J. Zhang, X. Wang, X. Wu, C. Yang, and Y. Chen, “Wide-viewing integral imaging using fiber-coupled monocentric lens array,” Opt. Express 23(18), 23339–23347 (2015). [CrossRef] [PubMed]
18. Y. Igarashi, H. Murata, and M. Ueda, “3D display system using a computer generated integral photography,” Jpn. J. Appl. Phys. 17(9), 1683–1684 (1978). [CrossRef]
19. S. Xing, X. Sang, X. Yu, C. Duo, B. Pang, X. Gao, S. Yang, Y. Guan, B. Yan, J. Yuan, and K. Wang, “High-efficient computer-generated integral imaging based on the backward ray-tracing technique and optical reconstruction,” Opt. Express 25(1), 330–338 (2017). [CrossRef] [PubMed]
20. H. Gross, W. Singer, M. Totzeck, F. Blechinger, and B. Achtner, Handbook of Optical Systems (Wiley Online Library, 2005).
21. L. Jiang, X. Zhang, F. Fang, X. Liu, and L. Zhu, “Wavefront aberration metrology based on transmitted fringe deflectometry,” Appl. Opt. 56(26), 7396–7403 (2017). [CrossRef] [PubMed]
22. C. Yu, J. Yuan, F. C. Fan, C. C. Jiang, S. Choi, X. Sang, C. Lin, and D. Xu, “The modulation function and realizing method of holographic functional screen,” Opt. Express 18(26), 27820–27826 (2010). [CrossRef] [PubMed]
23. X. Sang, X. Gao, X. Yu, S. Xing, Y. Li, and Y. Wu, “Interactive floating full-parallax digital three-dimensional light-field display based on wavefront recomposing,” Opt. Express 26(7), 8883–8889 (2018). [CrossRef] [PubMed]
24. R. C. Gonzalez and R. E. Woods, Digital Image Processing (Prentice Hall, 2002).