## Abstract

In integral imaging, the quality of a reconstructed image degrades with increasing viewing angle due to the wavefront aberrations introduced by the lens-array. A wavefront aberration correction method is proposed to enhance the image quality with a pre-filtering function array (PFA). To derive the PFA for an integral imaging display, the wavefront aberration characteristic of the lens-array is analyzed and the intensity distribution of the reconstructed image is calculated based on the wave optics theory. The minimum mean square error method is applied to manipulate the elemental image array (EIA) with a PFA. The validity of the proposed method is confirmed through simulations as well as optical experiments. A 45-degree viewing angle integral imaging display with enhanced image quality is achieved.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

## 1. Introduction

Three-dimensional (3D) imaging techniques, based on parallax barriers, lenticular lenses, integral imaging or multi-projectors, have attracted great interest for research and applications [1–15]. Integral imaging has received much attention owing to its ability to reconstruct continuous-viewing, colorful 3D images with full parallax and all depth cues [5–7]. In a conventional integral imaging system, the light field of a 3D object is captured by a lens-array and an electronic image sensor, thus creating an elemental image array (EIA). To optically reconstruct the 3D object, a second lens-array is employed to project the captured EIA. Various contributions have been made to improve the performance of integral imaging in many aspects, including depth range, resolution and viewing angle [8–15]. However, most demonstrated integral imaging displays have been analyzed based on paraxial optics relations, without consideration of the lens-array's aberrations, which degrade the quality of the reconstructed 3D scenes. In addition, these displays suffer from a more serious aberration effect with increasing viewing angle, because it is difficult to correct aberrations with a single lens. To resolve this problem, an integral imaging display with three microlens arrays was proposed to decrease the aberrations, and the lens-arrays were optimized with geometrical optics methods and the OSLO optical design software [16]. However, that system provides a limited viewing angle, and the microlens arrays require a complex assembling process (especially the control of lens alignment). Zhang et al. designed a monocentric lens-array (MoLA) with a fiber bundle to eliminate most of the off-axis aberrations in integral imaging [17]. Still, because the manufacturability of such a MoLA remains a challenge, this kind of display is difficult to implement.
To our knowledge, previous studies either modified the lens-arrays or changed the system configurations to reduce the aberration effect. To reconstruct a clear 3D image over a wide viewing angle, a more complicated lens-array is required, making the system harder to manufacture and assemble.

Here, a pre-filtering function array (PFA) is proposed to reduce the aberration effect in the optical reconstruction process of integral imaging. Eighty-one field sections are chosen to express the wavefront aberration of the lens-array in terms of Zernike polynomials. The light intensity distribution of the reconstructed image is analyzed based on the Fresnel diffraction theory. The captured EIA is pre-corrected with the PFA to counter the effect of the residual wavefront aberrations introduced by the lens-array. The use of the PFA does not increase the complexity of the display. With the proposed wavefront aberration correction approach, a full-parallax 3D scene with high image quality is perceived over a wide viewing angle.

## 2. Proposed wavefront aberration correction method

As shown in Fig. 1, the reconstruction process of an enhanced 3D image based on the proposed method can be separated into three steps: the capturing stage, the pre-filtering stage and the optical reconstruction stage. In the first step, the light field of a 3D object is captured in the form of a 2D EIA through a camera array. In the second step, the EIA is pre-corrected with a PFA, creating a pre-filtering elemental image array (PEIA). During the pre-filtering stage, the field of each elemental image (EI) is divided into 81 sections based on representative field positions to calculate the aberrations of the lens. With the wavefront aberrations of each field section, the pre-filtering function (PF) is derived from optical analysis. Then, a pre-filtering elemental image (PEI) is generated by convolving the EI with the PF. In the third step, the PEIA is loaded into the optical integral imaging display and a high-quality 3D image is reconstructed.

#### 2.1 Wavefront aberration characteristic of the lens-array

In an integral imaging display, an aberration-free EIA can be obtained by computer-generated techniques [18,19], where a virtual camera array is used to pick up the EIA of a 3D object. An aberration-free EIA is required in our proposed method so that the quality of the reconstructed 3D scenes is not degraded by blurred captured images. When presented on a display panel, each aberration-free EI is imaged by its corresponding lens for optical reconstruction. An ideal lens allows all light rays originating from a pixel, which can be regarded as an object point, to meet again at exactly one point on the reconstructed image. In practice, however, the light rays emanating from one point of the EI are refracted by an imperfect lens and distributed over an area around the ideal reconstructed point due to the wavefront aberrations of the lens. Consequently, it is essential to analyze the wavefront aberration characteristic of the lens-array in integral imaging. Figure 2(a) shows the optical reconstruction of the volume pixel $A'$ in the integral imaging display. The lens-array is assumed to be composed of $(2M+1)\times(2N+1)$ lenses. The spherical wavefronts (red line) produced by a set of object points $\{A_{M0},\cdots,A_{10},A_{00},A_{-10},\cdots,A_{-M0}\}$ on the EIA turn into aberrated wavefronts after being refracted by the corresponding lenses. Figures 2(b) and 2(c) show the ideal wavefront map and the actual wavefront map, respectively. The deviation of the actual wavefront (red line) from the desired ideal wavefront (black dotted line) is called the wavefront aberration, whose curve is plotted in Fig. 2(d). Generally, the wavefront aberrations increase as the field angle becomes greater. When the set of object points $\{A_{M0},\cdots,A_{10},A_{00},A_{-10},\cdots,A_{-M0}\}$ is imaged with imperfect lenses, the reconstructed volume pixel $A'$ turns into a blurred spot.
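To make the wavefront-aberration description above concrete, the deviation $W(\rho,\theta)$ can be evaluated numerically as a weighted sum of Zernike terms. The sketch below is illustrative only (hypothetical function name, a handful of Noll-normalized low-order terms in waves), not the paper's implementation:

```python
import numpy as np

def zernike_wavefront(rho, theta, coeffs):
    """Evaluate a wavefront aberration W(rho, theta) as a weighted sum of a
    few low-order Zernike terms (Noll normalization, units of waves).
    `coeffs` maps a term name to its coefficient."""
    terms = {
        "defocus":   np.sqrt(3) * (2 * rho**2 - 1),
        "astig_x":   np.sqrt(6) * rho**2 * np.cos(2 * theta),
        "coma_x":    np.sqrt(8) * (3 * rho**3 - 2 * rho) * np.cos(theta),
        "spherical": np.sqrt(5) * (6 * rho**4 - 6 * rho**2 + 1),
    }
    W = np.zeros_like(rho, dtype=float)
    for name, c in coeffs.items():
        W += c * terms[name]  # superpose each aberration contribution
    return W
```

Because the Zernike modes are orthonormal over the unit pupil, fitting such coefficients per field section gives a compact, per-section description of the spatially varying aberrations.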
To calculate the wavefront aberrations of the lens-array, we denote the central lens and the central EI as the 0th lens ($L_{00}$) and the 0th EI ($EI_{00}$), respectively. A coordinate system $\xi\eta z$ is set up, in which the $z$ axis coincides with the optical axis of $L_{00}$ and the origin is the center of $L_{00}$. Plane coordinate systems $\xi\eta$ and $x_0 y_0$ are used to describe the lens-array plane ($z=0$) and the EIA plane ($z=-l$), respectively. Since the lenses in the lens-array are identical, $L_{00}$ is taken as an example to express the wavefront aberrations of one lens mathematically. The wavefront aberrations of $L_{00}$ are given by

According to Eq. (1), $W_{00}$ is parameterized by the object field coordinate $(x_0, y_0)$, which means that the wavefront aberrations vary across object fields. To characterize these spatially varying aberrations of the lens $L_{00}$, the field of $EI_{00}$ is divided into a number of sections so that the aberrations within each section can be treated as shift invariant [20]. Here, the representative field positions $0$, $\pm 0.3H_m$, $\pm 0.5H_m$, $\pm 0.7H_m$, $\pm 0.9H_m$ are sampled along the $x_0$ and $y_0$ directions. Accordingly, the field of $EI_{00}$ is divided into 81 equal subsections ($9\times 9$ sections), denoted by $S_n\ (n=1,2,\dots,81)$, as shown in Fig. 3. Since the aberrations change little within each section, they are regarded as shift invariant and can be calculated at the corresponding representative field positions. For example, the wavefront aberration in section $S_5$ is calculated at the position $(0, 0.9H_m)$. Under this assumption, the mapping relation between the field positions and the sections $S_1$–$S_{81}$ is given as:
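The field-sectioning rule described above can be sketched as a lookup from a field point to its section index by snapping each coordinate to the nearest representative position. The numbering convention below (row-major) is a hypothetical choice; the paper's Fig. 3 fixes the actual ordering:

```python
import numpy as np

# Representative field positions along each axis, as fractions of H_m:
# 0, +/-0.3, +/-0.5, +/-0.7, +/-0.9 (nine positions -> 9 x 9 = 81 sections).
REP = np.array([-0.9, -0.7, -0.5, -0.3, 0.0, 0.3, 0.5, 0.7, 0.9])

def section_index(x0, y0, H_m):
    """Map a field point (x0, y0) to its section S_n, n = 1..81, by snapping
    to the nearest representative position on each axis (row-major numbering,
    an assumed convention)."""
    ix = int(np.argmin(np.abs(REP - x0 / H_m)))
    iy = int(np.argmin(np.abs(REP - y0 / H_m)))
    return iy * 9 + ix + 1
```

Each section then reuses the Zernike coefficients computed at its representative position, which is what makes the shift-invariance approximation computationally cheap.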

Further, the wavefront aberrations of an arbitrary lens in the lens-array can be expressed as

where $m=-M,-(M-1),\ldots,0,\ldots,M-1,M$ and $n=-N,-(N-1),\ldots,0,\ldots,N-1,N$ are the indices of the lens in the lens-array, and $p$ represents the lens pitch.

#### 2.2 Light intensity distribution of the reconstructed image

After the wavefront aberration expression of the lens-array is obtained, the light intensity distributions of an image point and a reconstructed image in the integral imaging display can be analyzed.

Figure 4(a) demonstrates the integral imaging display, which includes a liquid crystal display (LCD), a lens-array and a holographic functional screen (HFS). The LCD is used to display the EIA and the lens-array is used to project the EIA onto the HFS. The HFS is a reception screen that diffuses the incident light within a solid diffusing angle; the light distribution from the lens-array is re-modulated by the HFS to guarantee a uniform 3D image [22,23]. Plane $x_0 y_0$, plane $\xi\eta$ and plane $xy$ represent the LCD, the lens-array and the HFS, respectively. Assume that a single pixel $A_{mn}$ (orange spot in Fig. 4(a)) is situated at $(x_0^{A_{mn}}, y_0^{A_{mn}})$ on $EI_{mn}$ (green rectangle in Fig. 4(a)). It can be regarded as a point source and expressed as a Dirac delta function. Spherical waves emanating from point $A_{mn}$ are refracted through the corresponding lens $L_{mn}$ (red circle in Fig. 4(a)) and then emerge as aberrated waves converging toward the image point $A'$. During the wave propagation process, $L_{mn}$ introduces a phase delay and wavefront aberrations. Based on the Fresnel diffraction theory, the intensity distribution of the image point $A'$ on the HFS is therefore given by:
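Numerically, the intensity spot produced by such a point source follows from the standard Fourier-optics relation: the intensity point-spread function (PSF) is the squared magnitude of the Fourier transform of the aberrated pupil. A minimal sketch (hypothetical function name; the image-plane sampling and scaling factors of the full Fresnel integral are omitted):

```python
import numpy as np

def aberrated_psf(W_waves, n=256):
    """Intensity PSF of a circular-pupil lens with wavefront aberration
    W (in waves), via PSF = |FT{ P * exp(i*2*pi*W) }|^2.
    `W_waves` is an (n, n) map sampled over the pupil grid."""
    y, x = np.mgrid[-1:1:1j * n, -1:1:1j * n]
    rho = np.hypot(x, y)
    pupil = (rho <= 1.0).astype(float)          # circular aperture P
    field = pupil * np.exp(1j * 2 * np.pi * W_waves)
    psf = np.abs(np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field)))) ** 2
    return psf / psf.sum()                      # normalize to unit energy
```

With zero aberration this reproduces an Airy-like spot centered on the ideal image point; feeding in the per-section Zernike wavefronts yields the blurred, field-dependent spots that the pre-filter must counteract.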

The intensity distribution of $EI_{mn}$ on the HFS can be represented through a convolution operation:

where $I_{mn}(x_0, y_0; z)$ represents the intensity distribution of $EI_{mn}$ and $RI_{mn}(x, y; z)$ denotes the intensity distribution of the corresponding image. The symbol $\ast$ denotes convolution. Further, the intensity distribution of the resulting 3D image can be obtained by superposing the contributions of all the elemental images.

#### 2.3 EIA pre-filtered with PFA

This subsection aims to reduce the aberration effect in integral imaging by pre-filtering the EIA. In the optical reconstruction, the imaging of the EIA by the lens-array is a convolution between the array of $I_{mn}(x_0, y_0; z)$ and the array of $h(x, y; z)$. During this process, wavefront aberrations are introduced by $h(x, y; z)$. Thus, the aberration effect can be reduced by convolving every EI with the corresponding PF, $h^{-1}(x, y; z)$, which is defined as the inverse function of $h(x, y; z)$. The EIA is manipulated with the PFA and then imaged by the lens-array, which is given by:
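A direct inverse of $h$ is ill-conditioned wherever its spectrum is small, which is why the abstract invokes the minimum-mean-square-error criterion; in practice this corresponds to Wiener-type inverse filtering in the frequency domain. The sketch below uses hypothetical names and an assumed noise-to-signal ratio, and is not the authors' exact PFA derivation:

```python
import numpy as np

def wiener_prefilter(ei, psf, nsr=1e-3):
    """Pre-correct an elemental image with the MMSE (Wiener) inverse of the
    lens PSF, so the subsequent optical convolution with the PSF
    approximately restores the intended image. `nsr` is an assumed
    noise-to-signal ratio that regularizes the inversion."""
    H = np.fft.fft2(np.fft.ifftshift(psf), s=ei.shape)  # PSF spectrum
    G = np.conj(H) / (np.abs(H) ** 2 + nsr)             # MMSE inverse filter
    pei = np.real(np.fft.ifft2(np.fft.fft2(ei) * G))
    return pei
```

The regularizer `nsr` trades residual blur against noise amplification: as `nsr` tends to zero, `G` approaches the raw inverse filter $1/H$, which diverges at the zeros of the PSF spectrum.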

## 3. Simulation and experiment

In order to confirm the validity of the proposed method, the corresponding simulation and optical experiment are implemented.

The parameters of the lens unit, which consists of two lenses, in the employed lens-array are listed in Table 1. Because the lens unit is rotationally symmetric, its spot diagrams in 25 field sections (the same field sections as in Fig. 3(b)) are shown in Fig. 5. We can see that the lens unit suffers from serious wavefront aberrations. The Zernike coefficients of the field sections $S_1$–$S_{81}$ are given in Table 2. With these Zernike coefficients, $h(x, y; z)$ is calculated and the pre-corrected image is created according to Eq. (11). Figure 6 shows the wavefront aberrations without and with correction at different field positions. After the correction, the wavefront aberrations over the whole field zone are decreased. Notably, the wavefront aberration in the marginal field (22.5 degrees) is reduced from 24.6 waves to 11.6 waves, which makes it possible to implement an integral imaging display with a large viewing angle.

The display process of an EI is simulated to confirm the validity of the proposed method. The original EI is shown in Fig. 7(a), and Figs. 7(b) and 7(c) show the displayed results without and with correction, respectively. Comparing Fig. 7(b) with Fig. 7(a), the image quality is degraded by the wavefront aberrations of the lens unit. From Figs. 7(b) and 7(c), we can see that the image quality is noticeably improved because of the decreased wavefront aberrations. The image quality, measured in PSNR, for the images without and with correction is shown in Fig. 7 (bottom). The PSNR values of the enhanced images are increased (first row: from 31.3 dB to 34.1 dB; second row: from 29.7 dB to 32.3 dB), which indicates that the aberration effect is reduced with the proposed method.
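The PSNR figures quoted above follow the standard definition over the mean squared error; a minimal reference implementation (hypothetical function name, assuming 8-bit images with peak value 255):

```python
import numpy as np

def psnr(ref, test, peak=255.0):
    """Peak signal-to-noise ratio in dB between a reference image and a test
    image: 10*log10(peak^2 / MSE)."""
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(peak**2 / mse)
```

A gain of roughly 3 dB, as reported here, corresponds to halving the mean squared error between the displayed image and the original EI.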

The optical integral imaging system for the reconstructed 3D image is experimentally demonstrated. The prototype of the display system, consisting of an LCD, a lens-array, and an HFS, is shown in Fig. 8. The lens-array is mounted 17.39 mm away from the LCD. The HFS is located at a distance of 200 mm in front of the lens-array. The EIA on the 23.6-inch LCD with a resolution of 3840 $\times$ 2160 is captured by the virtual cameras in the Blender software. Each EI is composed of 88 $\times$ 88 pixels in a matrix format, providing 88 $\times$ 88 viewpoint perspectives. The different perspectives are integrated into a 3D image with the 46 $\times$ 23 lens-array assisted by the HFS. Figure 9(a) shows the EIA captured by the virtual camera array, and Fig. 9(b) is the PEIA pre-corrected by the PFA.

Figure 10 shows the reconstructed 3D images at different viewing angles. The top row shows the reconstructed 3D image without correction, and the bottom row shows the enhanced 3D image obtained by displaying the PEIA. For this figure, we have prepared two movies that show, frame by frame, the sets of images captured in each direction. We can see that the bottom images offer more detailed information about the 3D scene and present clearer reconstructed images within the 45-degree viewing angle compared with the top images. A clear, full-parallax 3D image with a 30-cm displayed depth and 7744 viewpoints is perceived.

## 4. Conclusion

In summary, a wavefront aberration correction method for integral imaging is presented. The method analyzes the wavefront aberration characteristic of the lens-array and the intensity distribution of the reconstructed 3D scenes to obtain the PFA, which is used to pre-correct the EIA displayed on the integral imaging optical display. Simulations demonstrate that the RMS wavefront aberration is reduced and the image quality measured in PSNR is increased by more than 2 dB. With our approach, the integral imaging experimental system presents full-parallax 3D images with high image quality within a 45-degree viewing angle.

## Funding

The National Key Research and Development Program (2017YFB1002900); National Natural Science Foundation of China (61575025); Fund of the State Key Laboratory of Information Photonics and Optical Communications (IPOC2017ZZ02); Fundamental Research Funds for the Central Universities (2018PTB-00-01).

## References

**1. **J. Y. Luo, Q. H. Wang, W. X. Zhao, and D. H. Li, “Autostereoscopic three-dimensional display based on two parallax barriers,” Appl. Opt. **50**(18), 2911–2915 (2011). [CrossRef] [PubMed]

**2. **W. X. Zhao, Q. H. Wang, A. H. Wang, and D. H. Li, “Autostereoscopic display based on two-layer lenticular lenses,” Opt. Lett. **35**(24), 4127–4129 (2010). [CrossRef] [PubMed]

**3. **X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications [Invited],” Appl. Opt. **52**(4), 546–560 (2013). [CrossRef] [PubMed]

**4. **D. Fattal, Z. Peng, T. Tran, S. Vo, M. Fiorentino, J. Brug, and R. G. Beausoleil, “A multi-directional backlight for a wide-angle, glasses-free three-dimensional display,” Nature **495**(7441), 348–351 (2013). [CrossRef] [PubMed]

**5. **G. Lippmann, “La photographie integrale,” C. R. Acad. Sci. **146**, 446–451 (1908).

**6. **H. E. Ives, “Optical properties of a Lippmann lenticulated sheet,” J. Opt. Soc. Am. **21**(3), 171–176 (1931). [CrossRef]

**7. **J. Geng, “Three-dimensional display technologies,” Adv. Opt. Photonics **5**(4), 456–535 (2013). [CrossRef] [PubMed]

**8. **A. Stern and B. Javidi, “Three-dimensional image sensing, visualization, and processing using integral imaging,” Proc. IEEE **94**(3), 591–607 (2006). [CrossRef]

**9. **Á. Tolosa, R. Martinez-Cuenca, H. Navarro, G. Saavedra, M. Martínez-Corral, B. Javidi, and A. Pons, “Enhanced field-of-view integral imaging display using multi-Köhler illumination,” Opt. Express **22**(26), 31853–31863 (2014). [CrossRef] [PubMed]

**10. **C. W. Chen, M. Cho, Y. P. Huang, and B. Javidi, “Improved viewing zones for projection type integral imaging 3D display using adaptive liquid crystal prism array,” J. Disp. Technol. **10**(3), 198–203 (2014). [CrossRef]

**11. **J. Y. Jang, D. Shin, and E. S. Kim, “Optical three-dimensional refocusing from elemental images based on a sifting property of the periodic δ-function array in integral-imaging,” Opt. Express **22**(2), 1533–1550 (2014). [CrossRef] [PubMed]

**12. **J. S. Jang and B. Javidi, “Improved viewing resolution of three-dimensional integral imaging by use of nonstationary micro-optics,” Opt. Lett. **27**(5), 324–326 (2002). [CrossRef] [PubMed]

**13. **X. B. Dong, L. Y. Ai, and E. S. Kim, “Integral imaging-based large-scale full-color 3-D display of holographic data by using a commercial LCD panel,” Opt. Express **24**(4), 3638–3651 (2016). [CrossRef] [PubMed]

**14. **H. H. Kang, J. H. Lee, and E. S. Kim, “Enhanced compression rate of integral images by using motion-compensated residual images in three-dimensional integral-imaging,” Opt. Express **20**(5), 5440–5459 (2012). [CrossRef] [PubMed]

**15. **Y. Kim, J. Kim, J. M. Kang, J. H. Jung, H. Choi, and B. Lee, “Point light source integral imaging with improved resolution and viewing angle by the use of electrically movable pinhole array,” Opt. Express **15**(26), 18253–18267 (2007). [CrossRef] [PubMed]

**16. **A. Karimzadeh, “Integral imaging system optical design with aberration consideration,” Appl. Opt. **54**(7), 1765–1769 (2015). [CrossRef]

**17. **J. Zhang, X. Wang, X. Wu, C. Yang, and Y. Chen, “Wide-viewing integral imaging using fiber-coupled monocentric lens array,” Opt. Express **23**(18), 23339–23347 (2015). [CrossRef] [PubMed]

**18. **Y. Igarashi, H. Murata, and M. Ueda, “3D display system using a computer generated integral photography,” Jpn. J. Appl. Phys. **17**(9), 1683–1684 (1978). [CrossRef]

**19. **S. Xing, X. Sang, X. Yu, C. Duo, B. Pang, X. Gao, S. Yang, Y. Guan, B. Yan, J. Yuan, and K. Wang, “High-efficient computer-generated integral imaging based on the backward ray-tracing technique and optical reconstruction,” Opt. Express **25**(1), 330–338 (2017). [CrossRef] [PubMed]

**20. **H. Gross, W. Singer, M. Totzeck, F. Blechinger, and B. Achtner, *Handbook of Optical Systems* (Wiley Online Library, 2005).

**21. **L. Jiang, X. Zhang, F. Fang, X. Liu, and L. Zhu, “Wavefront aberration metrology based on transmitted fringe deflectometry,” Appl. Opt. **56**(26), 7396–7403 (2017). [CrossRef] [PubMed]

**22. **C. Yu, J. Yuan, F. C. Fan, C. C. Jiang, S. Choi, X. Sang, C. Lin, and D. Xu, “The modulation function and realizing method of holographic functional screen,” Opt. Express **18**(26), 27820–27826 (2010). [CrossRef] [PubMed]

**23. **X. Sang, X. Gao, X. Yu, S. Xing, Y. Li, and Y. Wu, “Interactive floating full-parallax digital three-dimensional light-field display based on wavefront recomposing,” Opt. Express **26**(7), 8883–8889 (2018). [CrossRef] [PubMed]

**24. **R. C. Gonzales and R. E. Wood, *Digital Image Processing* (Prentice Hall, 2002).