Optica Publishing Group

Wavefront aberration correction for integral imaging with the pre-filtering function array

Open Access

Abstract

In integral imaging, the quality of a reconstructed image degrades with increasing viewing angle due to the wavefront aberrations introduced by the lens-array. A wavefront aberration correction method is proposed to enhance the image quality with a pre-filtering function array (PFA). To derive the PFA for an integral imaging display, the wavefront aberration characteristic of the lens-array is analyzed and the intensity distribution of the reconstructed image is calculated based on the wave optics theory. The minimum mean square error method is applied to manipulate the elemental image array (EIA) with a PFA. The validity of the proposed method is confirmed through simulations as well as optical experiments. A 45-degree viewing angle integral imaging display with enhanced image quality is achieved.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Three-dimensional (3D) imaging techniques, based on parallax barriers, lenticular lenses, integral imaging or multi-projectors, have attracted great research and application interest [1–15]. Integral imaging has drawn particular attention owing to its ability to reconstruct continuous-viewing, colorful 3D images with full parallax and all depth cues [5–7]. In a conventional integral imaging system, the light field of a 3D object is captured by a lens-array and an electronic image sensor, creating an elemental image array (EIA). To optically reconstruct the 3D object, a second lens-array is employed to project the captured EIA. Various contributions have been made to improve the performance of integral imaging in many aspects, including depth range, resolution and viewing angle [8–15]. However, most demonstrated integral imaging displays were analyzed with paraxial optics relations, without considering the lens-array's aberrations, which degrade the quality of reconstructed 3D scenes. Moreover, these displays suffer from more serious aberration effects as the viewing angle increases, because it is difficult to correct aberrations with a single lens. To resolve this problem, an integral imaging display with three microlens-arrays was proposed to decrease the aberrations, with the lens-arrays optimized by geometrical optics and the OSLO optical design software [16]. However, that system provides a limited viewing angle, and the microlens-arrays require a complex assembling process (especially the control of lens alignment). Zhang et al. designed a monocentric lens-array (MoLA) with a fiber bundle to eliminate most of the off-axis aberrations in integral imaging [17]. Still, because the manufacturability of such a MoLA remains a challenge, this kind of display is difficult to implement.
To our knowledge, previous studies either modified the lens-arrays or changed the system configurations to reduce the aberration effect. Reconstructing a clear 3D image over a wide viewing angle then requires a more complicated lens-array, making the system harder to manufacture and assemble.

Here, a pre-filtering function array (PFA) is proposed to reduce the aberration effect in the optical reconstruction process of integral imaging. Eighty-one field sections are chosen to express the wavefront aberration of the lens-array in the Zernike polynomial mode. The light intensity distribution of the reconstructed image is analyzed based on the Fresnel diffraction theory. The captured EIA is pre-corrected with the PFA to counter the residual wavefront aberrations introduced by the lens-array. The use of the PFA does not increase the complexity of the display. With the proposed wavefront aberration correction approach, a full-parallax 3D scene with high image quality is perceived over a wide viewing angle.

2. Proposed wavefront aberration correction method

As shown in Fig. 1, the reconstruction process of an enhanced 3D image based on the proposed method can be separated into three steps: the capturing stage, the pre-filtering stage and the optical reconstruction stage. In the first step, the light field of a 3D object is captured in the form of a 2D EIA through a camera array. In the second step, the EIA is pre-corrected with a PFA, creating a pre-filtering elemental image array (PEIA). During the pre-filtering stage, the field of each elemental image (EI) is divided into 81 sections based on representative field positions to calculate the aberrations of the lens. With the wavefront aberrations of each field section, the pre-filtering function (PF) is derived from optical analysis. Then, a pre-filtering elemental image (PEI) is generated by convolving the EI with the PF. In the third step, the PEIA is loaded into an optical integral imaging display and a high-quality 3D image is reconstructed.

Fig. 1 Block diagram of the proposed system: (a) capturing stage, (b) pre-filtering stage, (c) optical reconstruction stage.

2.1 Wavefront aberration characteristic of the lens-array

In an integral imaging display, an aberration-free EIA can be obtained by a computer-generated technique [18,19], where a virtual camera array is used to pick up the EIA of a 3D object. An aberration-free EIA is required in the proposed method so that the quality of the reconstructed 3D scenes is not degraded by captured blurred images. When presented on a display panel, each aberration-free EI is imaged by its corresponding lens for optical reconstruction. An ideal lens makes all light rays originating from a pixel, regarded as an object point, meet again at exactly one point of the reconstructed image. In practice, however, the light rays emanating from one point of the EI are refracted by an imperfect lens and distributed over an area around the ideal reconstructed point due to the wavefront aberrations of the lens. Consequently, it is essential to analyze the wavefront aberration characteristic of the lens-array in integral imaging. Figure 2(a) shows the optical reconstruction of the volume pixel $A'$ in the integral imaging display. The lens-array is assumed to be composed of $(2M+1)\times(2N+1)$ lenses. The spherical wavefronts (red line) produced by a set of object points $\{A_{-M0},\ldots,A_{-10},A_{00},A_{10},\ldots,A_{M0}\}$ on the EIA turn into aberrated wavefronts after being refracted by the corresponding lenses. Figures 2(b) and 2(c) show the ideal wavefront map and the actual wavefront map, respectively. The deviation of the actual wavefront (red line) from the desired ideal wavefront (black dotted line) is called the wavefront aberration, whose curve is plotted in Fig. 2(d). Generally, the wavefront aberrations increase as the field angle becomes greater. When the object points $\{A_{-M0},\ldots,A_{M0}\}$ are imaged by imperfect lenses, the reconstructed volume pixel $A'$ becomes a blurred spot. To calculate the wavefront aberrations of the lens-array, the central lens and the central EI are denoted as the 0th lens ($L_{00}$) and the 0th EI ($EI_{00}$), respectively.
A coordinate system $\xi\eta z$, whose $z$ axis coincides with the optical axis of $L_{00}$, is set up, with its origin at the center of $L_{00}$. The plane coordinate systems $\xi\eta$ and $x_0y_0$ are used to describe the lens-array plane ($z=0$) and the EIA plane ($z=l$), respectively. Since the lenses in the lens-array are identical, $L_{00}$ is taken as an example to express the wavefront aberrations of one lens mathematically. The wavefront aberrations of $L_{00}$ are given by

$$\begin{aligned}W_{00}&=W_{00}(\xi^{2}+\eta^{2},\ \eta y_{0},\ y_{0}^{2},\ \xi x_{0},\ x_{0}^{2})\\&=a_{1}(\xi^{2}+\eta^{2})+a_{2}\eta y_{0}+a_{2}\xi x_{0}+b_{1}(\xi^{2}+\eta^{2})^{2}+b_{2}\eta y_{0}(\xi^{2}+\eta^{2})+b_{2}\xi x_{0}(\xi^{2}+\eta^{2})\\&\quad+b_{3}\eta^{2}y_{0}^{2}+b_{3}\xi^{2}x_{0}^{2}+b_{4}y_{0}^{2}(\xi^{2}+\eta^{2})+b_{4}x_{0}^{2}(\xi^{2}+\eta^{2})+b_{5}\eta y_{0}^{3}+b_{5}\xi x_{0}^{3}\end{aligned}\tag{1}$$
where $(\xi,\eta)$ denotes the pupil coordinate of the lens $L_{00}$; $x_0$ and $y_0$, referred to as object field coordinates, represent the lateral and vertical coordinates of the object point on the $EI_{00}$, respectively. $a_1$, $a_2$, $b_1$, $b_2$, $b_3$, $b_4$ and $b_5$ are the coefficients of the third-order aberrations. $a_1(\xi^2+\eta^2)$, $a_2\eta y_0$ and $a_2\xi x_0$ denote the defocus aberrations. $b_1(\xi^2+\eta^2)^2$ represents spherical aberration; the sum of $b_2\eta y_0(\xi^2+\eta^2)$ and $b_2\xi x_0(\xi^2+\eta^2)$ represents coma; the sum of $b_3\eta^2 y_0^2$, $b_3\xi^2 x_0^2$, $b_4 y_0^2(\xi^2+\eta^2)$ and $b_4 x_0^2(\xi^2+\eta^2)$ represents astigmatism and curvature of field; and the sum of $b_5\eta y_0^3$ and $b_5\xi x_0^3$ denotes distortion.
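As a concrete illustration, the third-order polynomial of Eq. (1) can be evaluated numerically. The coefficient values below are arbitrary placeholders, not the paper's measured values; the sketch only shows how the aberration spread grows from the axial field to the marginal field:

```python
import numpy as np

# Illustrative third-order aberration coefficients (placeholders, not the
# paper's measured values).
a1, a2 = 0.5, 0.2
b1, b2, b3, b4, b5 = 0.8, 0.3, 0.2, 0.1, 0.05

def w00(xi, eta, x0, y0):
    """Third-order wavefront aberration W00 of the central lens, Eq. (1).
    (xi, eta): normalized pupil coordinates; (x0, y0): normalized object
    field coordinates of a point on EI00. Result is in waves."""
    r2 = xi ** 2 + eta ** 2
    return (a1 * r2 + a2 * eta * y0 + a2 * xi * x0          # defocus terms
            + b1 * r2 ** 2                                  # spherical
            + b2 * eta * y0 * r2 + b2 * xi * x0 * r2        # coma
            + b3 * eta ** 2 * y0 ** 2 + b3 * xi ** 2 * x0 ** 2
            + b4 * y0 ** 2 * r2 + b4 * x0 ** 2 * r2         # astig. + field curv.
            + b5 * eta * y0 ** 3 + b5 * xi * x0 ** 3)       # distortion

# The aberration spread grows with the field position, as stated in the text.
xi, eta = np.meshgrid(np.linspace(-1, 1, 65), np.linspace(-1, 1, 65))
pupil = xi ** 2 + eta ** 2 <= 1.0
rms_axis = np.std(w00(xi, eta, 0.0, 0.0)[pupil])   # on-axis field
rms_edge = np.std(w00(xi, eta, 0.9, 0.9)[pupil])   # marginal field
```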

Fig. 2 (a) Reconstruction process of the volume pixel $A'$. (b) Ideal wavefront map; (c) distorted wavefront map. (d) RMS wavefront aberration of the imperfect lens.

According to Eq. (1), $W_{00}$ is parameterized by the object field coordinate $(x_0, y_0)$, which means that the wavefront aberrations vary over the object field. To characterize these spatially varying aberrations of the lens $L_{00}$, the field of $EI_{00}$ is divided into a number of sections so that the aberrations within each section can be treated as shift-invariant [20]. Here, the representative field positions $0, \pm 0.3H_m, \pm 0.5H_m, \pm 0.7H_m, \pm 0.9H_m$ are sampled along the $x_0$ and $y_0$ directions. Accordingly, the field of $EI_{00}$ is divided into 81 equal subsections ($9\times 9$ sections), denoted by $S_n\,(n=1,2,\ldots,81)$, as shown in Fig. 3. Since the aberrations change little within each section, they are regarded as shift-invariant and can be calculated at the corresponding representative field position. For example, the wavefront aberration in section $S_5$ is calculated at the position $(0, 0.9H_m)$. Under this assumption, the mapping between the field positions and the sections $S_1\sim S_{81}$ is given as:

$$\begin{gathered}(x_{0},y_{0})=\begin{cases}(-0.9H_{m},\ 0.9H_{m}), & (x_{0},y_{0})\in\Omega_{S_{1}}\\(-0.7H_{m},\ 0.9H_{m}), & (x_{0},y_{0})\in\Omega_{S_{2}}\\\ \ \vdots & \\(0.7H_{m},\ 0.9H_{m}), & (x_{0},y_{0})\in\Omega_{S_{8}}\\(0.9H_{m},\ 0.9H_{m}), & (x_{0},y_{0})\in\Omega_{S_{9}}\end{cases}\quad(x_{0},y_{0})=\begin{cases}(-0.9H_{m},\ 0.7H_{m}), & (x_{0},y_{0})\in\Omega_{S_{10}}\\(-0.7H_{m},\ 0.7H_{m}), & (x_{0},y_{0})\in\Omega_{S_{11}}\\\ \ \vdots & \\(0.7H_{m},\ 0.7H_{m}), & (x_{0},y_{0})\in\Omega_{S_{17}}\\(0.9H_{m},\ 0.7H_{m}), & (x_{0},y_{0})\in\Omega_{S_{18}}\end{cases}\\\vdots\\(x_{0},y_{0})=\begin{cases}(-0.9H_{m},\ -0.7H_{m}), & (x_{0},y_{0})\in\Omega_{S_{64}}\\(-0.7H_{m},\ -0.7H_{m}), & (x_{0},y_{0})\in\Omega_{S_{65}}\\\ \ \vdots & \\(0.7H_{m},\ -0.7H_{m}), & (x_{0},y_{0})\in\Omega_{S_{71}}\\(0.9H_{m},\ -0.7H_{m}), & (x_{0},y_{0})\in\Omega_{S_{72}}\end{cases}\quad(x_{0},y_{0})=\begin{cases}(-0.9H_{m},\ -0.9H_{m}), & (x_{0},y_{0})\in\Omega_{S_{73}}\\(-0.7H_{m},\ -0.9H_{m}), & (x_{0},y_{0})\in\Omega_{S_{74}}\\\ \ \vdots & \\(0.7H_{m},\ -0.9H_{m}), & (x_{0},y_{0})\in\Omega_{S_{80}}\\(0.9H_{m},\ -0.9H_{m}), & (x_{0},y_{0})\in\Omega_{S_{81}}\end{cases}\end{gathered}\tag{2}$$
where $H_m = H/2$ is the maximum field, $H$ represents the side length of an EI, and $\Omega$ denotes the field section domain. After the two object coordinate variables ($x_0$ and $y_0$) are mapped to the fixed representative positions, the Zernike polynomial mode is used to express the wavefront aberrations of $L_{00}$:
$$\begin{aligned}W_{00}(\xi,\eta)=W^{[x]}(\xi,\eta)&=Z_{0}^{[x]}+Z_{1}^{[x]}\eta+Z_{2}^{[x]}\xi+Z_{3}^{[x]}\left[2(\xi^{2}+\eta^{2})-1\right]+Z_{4}^{[x]}(\eta^{2}-\xi^{2})+Z_{5}^{[x]}\,2\xi\eta\\&\quad+Z_{6}^{[x]}\left[-2\eta+3\eta(\xi^{2}+\eta^{2})\right]+Z_{7}^{[x]}\left[-2\xi+3\xi(\xi^{2}+\eta^{2})\right]+Z_{8}^{[x]}\left[1-6(\xi^{2}+\eta^{2})+6(\xi^{2}+\eta^{2})^{2}\right]\\&\qquad\text{subject to}\ (x_{0}^{A_{00}},y_{0}^{A_{00}})\in\Omega_{x},\ \text{for}\ x=S_{1},S_{2}\ldots S_{80},S_{81}\end{aligned}\tag{3}$$
where $x$ represents the index of the field sections $S_1\sim S_{81}$, and $Z_0^{[x]}\sim Z_8^{[x]}$ denote the Zernike polynomial coefficients in field section $x$; the items correspond to the coefficients of the constant term, y-tilt, x-tilt, defocus, y-astigmatism, x-astigmatism, y-coma, x-coma and spherical aberration, respectively. The Zernike polynomial coefficients contain the object field information and can be measured by wavefront sensing techniques [21]. $(x_0^{A_{00}}, y_0^{A_{00}})$ denotes the object field coordinates of an arbitrary pixel $A_{00}$ in $EI_{00}$. Each field section has its own set of Zernike polynomial coefficients $Z_0\sim Z_8$, which means that $W_{00}$ varies from one field section to another.
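Given a section's coefficients $Z_0\sim Z_8$ (in practice taken from measurements such as Table 2), Eq. (3) can be evaluated directly. A minimal sketch using the standard Zernike terms listed in the text (the function name is ours):

```python
import numpy as np

def w_zernike(xi, eta, Z):
    """Wavefront aberration of L00 in one field section, Eq. (3).
    Z is the 9-vector (Z0..Z8) of that section's Zernike coefficients:
    piston, y-tilt, x-tilt, defocus, y/x-astigmatism, y/x-coma, spherical."""
    r2 = xi ** 2 + eta ** 2
    terms = [np.ones_like(xi),                   # constant (piston)
             eta, xi,                            # y-tilt, x-tilt
             2.0 * r2 - 1.0,                     # defocus
             eta ** 2 - xi ** 2,                 # y-astigmatism
             2.0 * xi * eta,                     # x-astigmatism
             -2.0 * eta + 3.0 * eta * r2,        # y-coma
             -2.0 * xi + 3.0 * xi * r2,          # x-coma
             1.0 - 6.0 * r2 + 6.0 * r2 ** 2]     # spherical aberration
    return sum(z * t for z, t in zip(Z, terms))
```

Each field section supplies its own coefficient vector, so the same routine is evaluated 81 times per lens.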

Fig. 3 (a) 81 representative field sections of the $EI_{00}$. (b) The 25 field sections in the upper-right corner of Fig. 3(a) and the corresponding field positions (only these field sections are plotted because the lens is rotationally symmetric).
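The sectioning of Eq. (2) amounts to snapping each object-field coordinate to the nearest of the nine sampled positions. A minimal sketch (the helper names are ours; the row ordering assumes $S_1$ starts at the top-left corner, with $S_5$ at $(0, 0.9H_m)$ as in the text):

```python
import numpy as np

# The nine sampled field positions, in units of the maximum field Hm.
REPS = np.array([-0.9, -0.7, -0.5, -0.3, 0.0, 0.3, 0.5, 0.7, 0.9])

def representative_field(x0, y0, Hm):
    """Snap an object-field coordinate on EI00 to the representative
    position of its section S1..S81, as in Eq. (2)."""
    snap = lambda c: REPS[np.argmin(np.abs(REPS - c / Hm))] * Hm
    return snap(x0), snap(y0)

def section_index(x0, y0, Hm):
    """Section number n of S_n, counting rows from y0 = +0.9*Hm downward
    (so S5 sits at (0, 0.9*Hm))."""
    col = int(np.argmin(np.abs(REPS - x0 / Hm)))
    row = int(np.argmin(np.abs(-REPS - y0 / Hm)))  # top row first
    return row * 9 + col + 1
```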

Further, the wavefront aberrations of an arbitrary lens in the lens-array can be expressed as

$$W_{mn}(\xi,\eta)=W_{00}(\xi-pm,\ \eta-pn)\tag{4}$$
where $m=-M,-(M-1),\ldots,0,\ldots,M-1,M$ and $n=-N,-(N-1),\ldots,0,\ldots,N-1,N$ are the indices of the lens in the lens-array, and $p$ represents the lens pitch.

2.2 Light intensity distribution of the reconstructed image

After the wavefront aberration expression of the lens-array is obtained, the light intensity distributions of an image point and a reconstructed image in the integral imaging display can be analyzed.

Figure 4(a) illustrates the integral imaging display, which includes a liquid crystal display (LCD), a lens-array and a holographic functional screen (HFS). The LCD displays the EIA, and the lens-array projects the EIA onto the HFS. The HFS is a reception screen that diffuses the incident light within a solid diffusing angle; the light distribution from the lens-array is re-modulated by the HFS to guarantee a uniform 3D image [22,23]. Plane $x_0y_0$, plane $\xi\eta$ and plane $xy$ represent the LCD, the lens-array and the HFS, respectively. Assume that a single pixel $A_{mn}$ (orange spot in Fig. 4(a)) is situated at $(x_0^{A_{mn}}, y_0^{A_{mn}})$ on the $EI_{mn}$ (green rectangle in Fig. 4(a)). It can be regarded as a point source and expressed as a Dirac delta function. The spherical waves emanating from point $A_{mn}$ are refracted by the corresponding lens $L_{mn}$ (red circle in Fig. 4(a)) and emerge as aberrated waves converging to the image point $A'$. During the propagation, $L_{mn}$ introduces a phase delay and wavefront aberrations. Based on the Fresnel diffraction theory, the intensity distribution of the image point $A'$ on the HFS is therefore given by:

$$h(x,y;z)=\left|\frac{1}{\lambda^{2}lg}\iint_{\Omega_{EI}}\iint_{\Omega_{L}}\delta(x_{0}-x_{0}^{A_{mn}},\,y_{0}-y_{0}^{A_{mn}})\times\exp\left\{\frac{ik}{2l}\left[(\xi-x_{0}^{A_{mn}})^{2}+(\eta-y_{0}^{A_{mn}})^{2}\right]\right\}\times P_{mn}(\xi,\eta)\times\exp\left\{-\frac{ik}{2f}\left[(\xi-pm)^{2}+(\eta-pn)^{2}\right]\right\}\times\exp\left\{\frac{ik}{2g}\left[(x-\xi)^{2}+(y-\eta)^{2}\right]\right\}dx_{0}\,dy_{0}\,d\xi\,d\eta\right|^{2}\tag{5}$$
where $\Omega_{EI}$ and $\Omega_{L}$ are the integration domains, limited by the size of the $EI_{mn}$ and of the $L_{mn}$, respectively; $l$ denotes the distance between the LCD and the lens-array, $g$ represents the distance between the lens-array and the HFS, and $f$ is the focal length of the $L_{mn}$. The values of $l$, $g$ and $f$ are related by the lens law $1/g + 1/l = 1/f$. $k = 2\pi/\lambda$ is the wave number and $\lambda$ is the wavelength. $P_{mn}(\xi,\eta)$ represents the pupil function of the $L_{mn}$. Characterized by its wavefront aberrations $W_{mn}$, $P_{mn}$ is given as
$$P_{mn}(\xi,\eta)=\begin{cases}\exp\left[ikW_{mn}(\xi,\eta)\right], & (\xi-pm)^{2}+(\eta-pn)^{2}\le(p/2)^{2}\\0, & \text{otherwise}\end{cases}\tag{6}$$
where $p$ denotes the pitch of the $L_{mn}$; the pupil is limited to a circular area because of the round shape of the lens. Since only the intensity is perceived by the viewer, the phase term in Eq. (5) can be omitted. Thus, Eq. (5) simplifies to
$$h(x,y;z)=\left|M\iint_{-\infty}^{+\infty}\exp\left[ikW_{mn}(\lambda gu,\lambda gv)\right]\exp\left\{-i2\pi\left[(x-Mx_{0}^{A_{mn}})u+(y-My_{0}^{A_{mn}})v\right]\right\}du\,dv\right|^{2}\tag{7}$$
where $u=\xi/(\lambda g)$, $v=\eta/(\lambda g)$, and $M=g/l$ denotes the magnification of the system. Eq. (7) implies that the intensity distribution of the image point is governed by the pupil function $P_{mn}(\lambda gu, \lambda gv)$, which is determined by the wavefront aberrations of the $L_{mn}$; hence the calculation of $h(x,y;z)$ depends on $W_{mn}(\xi,\eta)$. If $W_{mn}=0$, the image point is an Airy disk. If $W_{mn}\neq 0$, the image point becomes a blurred spot. Figures 4(b) and 4(c) plot the intensity distribution and the RMS spot radius of spot $A'$. As illustrated in Fig. 4(b), the intensity distribution of spot $A'$ is much broader than that of an ideal spot, and Fig. 4(c) shows that the RMS spot radius of $A'$ is larger than that of the ideal spot.
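Eq. (7) states that the point-spread intensity is the squared magnitude of the Fourier transform of the aberrated pupil. A discrete sketch of that relation (grid sizes and the sample aberration are arbitrary choices, not the paper's values):

```python
import numpy as np

def psf_from_aberration(W, pupil, pad=4):
    """Intensity PSF of one lens from its pupil wavefront aberration:
    a discrete version of Eq. (7), |FT{ pupil * exp(ik W) }|^2.
    W: aberration map in waves on an N x N pupil grid; pupil: boolean mask."""
    k = 2.0 * np.pi                       # W is expressed in waves
    field = pupil * np.exp(1j * k * W)
    n = pad * W.shape[0]                  # zero-padding refines PSF sampling
    h = np.abs(np.fft.fftshift(np.fft.fft2(field, s=(n, n)))) ** 2
    return h / h.sum()                    # normalize to unit energy

# W = 0 gives the Airy-like diffraction spot; a nonzero W spreads the
# energy into the broader blurred spot described in the text.
xi, eta = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
pupil = xi ** 2 + eta ** 2 <= 1.0
h_ideal = psf_from_aberration(np.zeros_like(xi), pupil)
h_blur = psf_from_aberration(0.5 * (xi ** 2 + eta ** 2) ** 2, pupil)
```

The peak of the aberrated PSF falls below the ideal one (a Strehl-ratio argument), which is exactly the broadening shown in Fig. 4(b).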

Fig. 4 (a) Imaging process of a pixel in the integral imaging display. (b) Intensity distribution of spot $A'$ vs. intensity distribution of an ideal spot. (c) RMS spot of $A'$ (center, RMS radius = 184.8 µm) vs. RMS spot of an ideal spot (top right corner, RMS radius = 0).

The intensity distribution of $EI_{mn}$ on the HFS can be represented through a convolution operation:

$$RI_{mn}(x,y;z)=I_{mn}(x_{0},y_{0};z)\otimes h(x,y;z)\tag{8}$$
where $I_{mn}(x_0,y_0;z)$ represents the intensity distribution of $EI_{mn}$, $RI_{mn}(x,y;z)$ denotes the intensity distribution of the corresponding image, and the symbol $\otimes$ denotes convolution. Further, the intensity distribution of the resulting 3D image can be obtained as
$$RI(x,y;z)=\sum_{m=-M,n=-N}^{m=M,n=N}RI_{mn}(x,y;z)=\sum_{m=-M,n=-N}^{m=M,n=N}I_{mn}(x_{0},y_{0};z)\otimes h(x,y;z)\tag{9}$$
According to Eqs. (7) and (9), the quality of the reconstructed 3D image is determined by the wavefront aberrations of the lens-array.
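Eqs. (8) and (9) are ordinary 2-D convolutions, so the blurring of one elemental image can be simulated with an FFT-based (circular) convolution. The toy image and kernel below are placeholders, not the system's actual PSF:

```python
import numpy as np

def fft_convolve2d(img, h):
    """Circular 2-D convolution via the FFT: the discrete analogue of
    Eq. (8), RI_mn = I_mn convolved with h, for one elemental image."""
    H = np.fft.fft2(h, s=img.shape)
    return np.real(np.fft.ifft2(np.fft.fft2(img) * H))

# A toy elemental image blurred by a small normalized PSF (placeholders).
rng = np.random.default_rng(0)
ei = rng.random((64, 64))
psf = np.ones((5, 5)) / 25.0      # stand-in for the aberrated h(x, y; z)
blurred = fft_convolve2d(ei, psf)
# Eq. (9) would then sum such terms over all (m, n) lenses of the array.
```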

2.3 EIA pre-filtered with PFA

This subsection aims to reduce the aberration effect in integral imaging by pre-filtering the EIA. In the optical reconstruction, imaging the EIA through the lens-array amounts to a convolution between the array of $I_{mn}(x_0,y_0;z)$ and the array of $h(x,y;z)$. During this process, wavefront aberrations are introduced through $h(x,y;z)$. Thus, the aberration effect can be reduced by convolving every EI with the corresponding PF, $h^{-1}(x,y;z)$, defined as the inverse function of $h(x,y;z)$. The EIA is manipulated with the PFA and then imaged by the lens-array, which gives:

$$RI(x,y;z)=\sum_{m=-M,n=-N}^{m=M,n=N}PI_{mn}(x_{0},y_{0};z)\otimes h(x,y;z)=\sum_{m=-M,n=-N}^{m=M,n=N}I_{mn}(x_{0},y_{0};z)\otimes h^{-1}(x,y;z)\otimes h(x,y;z)\tag{10}$$
In Eq. (10), $\sum_{m=-M,n=-N}^{m=M,n=N}PI_{mn}(x_{0},y_{0};z)$ is the intensity of the PEIA. With the minimum mean square error method [24], the intensity of the PEIA is computed as:
$$\sum_{m=-M,n=-N}^{m=M,n=N}PI_{mn}(x,y;z)=\sum_{m=-M,n=-N}^{m=M,n=N}\frac{1}{\mathrm{FT}\left[h(x,y;z)\right]}\,\frac{\left|\mathrm{FT}\left[h(x,y;z)\right]\right|^{2}}{\left|\mathrm{FT}\left[h(x,y;z)\right]\right|^{2}+K}\,\mathrm{FT}\left[I_{mn}(x,y;z)\right]\tag{11}$$
where FT denotes the Fourier transform and $K$ is a constant related to the estimation noise. Therefore, through this inverse filtering operation between the EIA and the PFA, the EIA is pre-corrected in a way that opposes the effect of the residual wavefront aberrations introduced by the lens-array.
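Eq. (11) is a Wiener-style inverse filter. A sketch of the pre-filtering of a single elemental image, with a toy PSF and an arbitrary noise constant $K$ (the function name is ours):

```python
import numpy as np

def prefilter_ei(ei, psf, K=0.01):
    """Pre-filter one elemental image as in Eq. (11): the Wiener-style
    filter (1/H) * |H|^2 / (|H|^2 + K) applied in the frequency domain,
    where H = FT[h]. K trades inversion accuracy against noise gain."""
    H = np.fft.fft2(psf, s=ei.shape)
    wiener = np.conj(H) / (np.abs(H) ** 2 + K)  # == (1/H)*|H|^2/(|H|^2+K)
    return np.real(np.fft.ifft2(wiener * np.fft.fft2(ei)))

# Round trip of Eq. (10): pre-filter the EI, then apply the lens blur again.
rng = np.random.default_rng(1)
ei = rng.random((64, 64))
psf = np.zeros((64, 64))
psf[:3, :3] = 1.0 / 9.0                      # toy 3x3 box blur as h(x, y; z)
H = np.fft.fft2(psf)
pei = prefilter_ei(ei, psf, K=1e-4)
recon = np.real(np.fft.ifft2(np.fft.fft2(pei) * H))    # PEI imaged by the lens
blurred = np.real(np.fft.ifft2(np.fft.fft2(ei) * H))   # EI imaged directly
```

The pre-filtered-then-blurred image stays much closer to the original than the directly blurred one, which is the effect the PFA exploits.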

3. Simulation and experiment

In order to confirm the validity of the proposed method, the corresponding simulation and optical experiment are implemented.

The parameters of the lens unit, which consists of two lenses, in the employed lens-array are listed in Table 1. Because the lens unit is rotationally symmetric, its spot diagrams in 25 field sections (the same field sections as in Fig. 3(b)) are shown in Fig. 5. Serious wavefront aberrations of the lens unit can be observed. The Zernike coefficients of field sections $S_1\sim S_{81}$ are given in Table 2. With these Zernike coefficients, $h(x,y;z)$ is calculated and the pre-corrected image is created according to Eq. (11). Figure 6 shows the wavefront aberrations without and with correction at different field positions. After the correction, the wavefront aberrations over the whole field zone are decreased. Notably, the wavefront aberration in the marginal field (22.5 degrees) is reduced from 24.6 waves to 11.6 waves, which makes it possible to implement an integral imaging display with a large viewing angle.

Table 1. Surface Specifications of the lens unit.

Fig. 5 Spot diagrams of the lens unit in 25 field sections (RMS spot size: “S5”: 3.18mm, “S6”: 3.42mm, “S7”: 3.68mm, “S8”: 3.88mm, “S9”: 0.97mm, “S14”: 2.34mm, “S15”: 2.59mm, “S16”: 3.04mm, “S17”: 3.68mm, “S18”: 4.55mm, “S23”: 1.70mm, “S24”: 1.94mm, “S25”: 2.41mm, “S26”: 3.04mm, “S27”: 3.88mm, “S32”: 1.22mm, “S33”: 1.47mm, “S34”: 1.94mm, “S35”: 2.59mm, “S36”: 3.42mm, “S41”: 0.97mm, “S42”: 1.22mm, “S43”: 1.69mm, “S44”: 2.34mm, “S45”: 3.18mm).

Table 2. Zernike coefficients of field sections S1~S81.

Fig. 6 RMS wavefront aberrations without correction and with correction.

The display process of an EI is simulated to confirm the validity of the proposed method. The original EIs are shown in Fig. 7(a), and Figs. 7(b) and 7(c) show the displayed results without and with correction, respectively. Comparing Fig. 7(a) with Fig. 7(b), the image quality is degraded by the wavefront aberrations of the lens unit. From Figs. 7(b) and 7(c), the image quality is noticeably improved because of the decreased wavefront aberrations. The image quality measured in PSNR for the images without and with correction is shown in Fig. 7 (bottom). The PSNR values of the enhanced images are increased (first row: from 31.3 dB to 34.1 dB; second row: from 29.7 dB to 32.3 dB), which indicates that the aberration effect is reduced with the proposed method.
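The PSNR figures above follow the standard definition; for reference, a minimal implementation (assuming 8-bit images with a peak value of 255):

```python
import numpy as np

def psnr(ref, img, peak=255.0):
    """Peak signal-to-noise ratio in dB: the quality metric used here to
    compare reconstructions without and with correction."""
    mse = np.mean((ref.astype(float) - img.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```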

Fig. 7 The simulation with the proposed method: (a) original images, (b) reconstructed images without correction, (c) reconstructed images with correction.

The optical integral imaging system for the reconstructed 3D image is experimentally demonstrated. The prototype of the display system, consisting of an LCD, a lens-array, and an HFS, is shown in Fig. 8. The lens-array is mounted 17.39 mm away from the LCD. The HFS is located at a distance of 200 mm in front of the lens-array. The EIA on the 23.6-inch LCD with a resolution of 3840×2160 is captured by the virtual cameras in the Blender software. Each EI is composed of 88×88 pixels in a matrix format, corresponding to 88×88 viewpoint perspectives. The different perspectives are integrated into a 3D image by the 46×23 lens-array assisted by the HFS. Figure 9(a) shows the EIA captured by the virtual camera array, and Fig. 9(b) is the PEIA pre-corrected by the PFA.

Fig. 8 Optical configuration of the integral imaging display.

Fig. 9 Two kinds of EIAs: (a) Captured EIA, (b) PEIA.

Figure 10 shows the reconstructed 3D images at different viewing angles. The top row shows the reconstructed 3D image without correction, and the bottom row shows the enhanced 3D image obtained by displaying the PEIA. Two movies showing, frame by frame, the sets of images captured in each direction accompany this figure (Visualization 1 and Visualization 2). The bottom images offer more detail about the 3D scene and present clearer reconstructed images within the 45-degree viewing angle compared with the top images. A clear, full-parallax 3D image with 30 cm of displayed depth and 7744 viewpoints is perceived.

Fig. 10 Image quality of the reconstructed 3D object is improved. The top row shows different views of the displayed 3D image with the captured EIA (see Visualization 1). The bottom row shows different views of the displayed 3D image with the PEIA (see Visualization 2).

4. Conclusion

In summary, a wavefront aberration correction method for integral imaging is presented. The method analyzes the wavefront aberration characteristic of the lens-array and the intensity distribution of the reconstructed 3D scenes to obtain the PFA, which pre-corrects the EIA to be displayed on an integral imaging optical display. Simulations demonstrate that the RMS wavefront aberration is reduced and the image quality measured in PSNR is increased by more than 2 dB. With this approach, the experimental integral imaging system presents full-parallax 3D images with high image quality over a 45-degree viewing angle.

Funding

The National Key Research and Development Program (2017YFB1002900); National Natural Science Foundation of China (61575025); Fund of the State Key Laboratory of Information Photonics and Optical Communications (IPOC2017ZZ02); Fundamental Research Funds for the Central Universities (2018PTB-00-01).

References

1. J. Y. Luo, Q. H. Wang, W. X. Zhao, and D. H. Li, “Autostereoscopic three-dimensional display based on two parallax barriers,” Appl. Opt. 50(18), 2911–2915 (2011).

2. W. X. Zhao, Q. H. Wang, A. H. Wang, and D. H. Li, “Autostereoscopic display based on two-layer lenticular lenses,” Opt. Lett. 35(24), 4127–4129 (2010).

3. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications [Invited],” Appl. Opt. 52(4), 546–560 (2013).

4. D. Fattal, Z. Peng, T. Tran, S. Vo, M. Fiorentino, J. Brug, and R. G. Beausoleil, “A multi-directional backlight for a wide-angle, glasses-free three-dimensional display,” Nature 495(7441), 348–351 (2013).

5. G. Lippmann, “La photographie integrale,” C. R. Acad. Sci. 146, 446–451 (1908).

6. H. E. Ives, “Optical properties of a Lippmann lenticulated sheet,” J. Opt. Soc. Am. 21(3), 171–176 (1931).

7. J. Geng, “Three-dimensional display technologies,” Adv. Opt. Photonics 5(4), 456–535 (2013).

8. A. Stern and B. Javidi, “Three-dimensional image sensing, visualization, and processing using integral imaging,” Proc. IEEE 94(3), 591–607 (2006).

9. Á. Tolosa, R. Martinez-Cuenca, H. Navarro, G. Saavedra, M. Martínez-Corral, B. Javidi, and A. Pons, “Enhanced field-of-view integral imaging display using multi-Köhler illumination,” Opt. Express 22(26), 31853–31863 (2014).

10. C. W. Chen, M. Cho, Y. P. Huang, and B. Javidi, “Improved viewing zones for projection type integral imaging 3D display using adaptive liquid crystal prism array,” J. Disp. Technol. 10(3), 198–203 (2014).

11. J. Y. Jang, D. Shin, and E. S. Kim, “Optical three-dimensional refocusing from elemental images based on a sifting property of the periodic δ-function array in integral-imaging,” Opt. Express 22(2), 1533–1550 (2014).

12. J. S. Jang and B. Javidi, “Improved viewing resolution of three-dimensional integral imaging by use of nonstationary micro-optics,” Opt. Lett. 27(5), 324–326 (2002).

13. X. B. Dong, L. Y. Ai, and E. S. Kim, “Integral imaging-based large-scale full-color 3-D display of holographic data by using a commercial LCD panel,” Opt. Express 24(4), 3638–3651 (2016).

14. H. H. Kang, J. H. Lee, and E. S. Kim, “Enhanced compression rate of integral images by using motion-compensated residual images in three-dimensional integral-imaging,” Opt. Express 20(5), 5440–5459 (2012).

15. Y. Kim, J. Kim, J. M. Kang, J. H. Jung, H. Choi, and B. Lee, “Point light source integral imaging with improved resolution and viewing angle by the use of electrically movable pinhole array,” Opt. Express 15(26), 18253–18267 (2007).

16. A. Karimzadeh, “Integral imaging system optical design with aberration consideration,” Appl. Opt. 54(7), 1765–1769 (2015).

17. J. Zhang, X. Wang, X. Wu, C. Yang, and Y. Chen, “Wide-viewing integral imaging using fiber-coupled monocentric lens array,” Opt. Express 23(18), 23339–23347 (2015).

18. Y. Igarashi, H. Murata, and M. Ueda, “3D display system using a computer generated integral photography,” Jpn. J. Appl. Phys. 17(9), 1683–1684 (1978).

19. S. Xing, X. Sang, X. Yu, C. Duo, B. Pang, X. Gao, S. Yang, Y. Guan, B. Yan, J. Yuan, and K. Wang, “High-efficient computer-generated integral imaging based on the backward ray-tracing technique and optical reconstruction,” Opt. Express 25(1), 330–338 (2017).

20. H. Gross, W. Singer, M. Totzeck, F. Blechinger, and B. Achtner, Handbook of Optical Systems (Wiley Online Library, 2005).

21. L. Jiang, X. Zhang, F. Fang, X. Liu, and L. Zhu, “Wavefront aberration metrology based on transmitted fringe deflectometry,” Appl. Opt. 56(26), 7396–7403 (2017).

22. C. Yu, J. Yuan, F. C. Fan, C. C. Jiang, S. Choi, X. Sang, C. Lin, and D. Xu, “The modulation function and realizing method of holographic functional screen,” Opt. Express 18(26), 27820–27826 (2010).

23. X. Sang, X. Gao, X. Yu, S. Xing, Y. Li, and Y. Wu, “Interactive floating full-parallax digital three-dimensional light-field display based on wavefront recomposing,” Opt. Express 26(7), 8883–8889 (2018).

24. R. C. Gonzales and R. E. Wood, Digital Image Processing (Prentice Hall, 2002).

Supplementary Material (2)

Visualization 1: the displayed 3D image with the captured EIA (without correction)
Visualization 2: the displayed 3D image with the PEIA (with correction)
