Abstract

We propose an enhanced three-dimensional integral imaging system that uses a curved lens array. Replacing the conventional flat lens array with a curved one expands the viewing angle remarkably, and the flipped images are eliminated effectively by adopting barriers. The principle of the proposed system is explained, and experimental results are presented.

©2004 Optical Society of America

1. Introduction and motivation

Integral imaging (InIm), which uses a lens array and a planar display panel, has attracted attention as an autostereoscopic three-dimensional (3D) display method because of its many advantages over other techniques [1-6]. However, its primary disadvantage is a narrow viewing angle. The viewing angle, i.e., the angle within which observers can see the complete image reconstructed by InIm, is limited because the area on which each elemental image can be displayed is restricted. In general, each elemental lens in an InIm system has a corresponding area on the display panel. To prevent image flipping, any part of an elemental image that exceeds its corresponding area is discarded, optically in the direct pickup method [6] or electrically in the computer-generated integral imaging (CGII) method [7]. Therefore the number of elemental images is limited, and an observer outside the viewing zone cannot see the integrated image (see Fig. 1).

Fig. 1. Limitation of viewing angle: observer-1 is in viewing zone and observer-2 is out of the viewing zone.

To overcome this limitation, several methods have been studied. One uses a Fresnel lens array with a small f-number to widen the viewing angle [8]; however, the f-number cannot be reduced indefinitely, and lens aberrations arise. Another method uses lens switching to double the region of each elemental image [9], but it requires a mechanical mask that must move faster than the after-image response of the eye, which causes problems such as air resistance and noise. To implement a dynamic mask without mechanical movement, a method using orthogonal polarization switching was proposed [10]. It uses orthogonally polarized elemental images with a polarization shutter screen and an orthogonal polarization sheet attached to the lens array; its disadvantage is that the polarization sheet halves the intensity of the original image, so the integrated image becomes dim. A method based on volume holographic recording of the elemental images [11] has also been proposed, but it cannot implement dynamic color display. Another method uses moving lenslet arrays with a low fill factor [12], but it requires mechanical movement. Finally, multiple display devices can be used [13, 14]; however, the structure is bulky and has been tested only for the double-device case.

In this paper, we theoretically discuss and experimentally demonstrate a new InIm system that uses a curved lens array to expand the viewing angle. In the proposed system, the conventional flat lens array is curved with a certain radius of curvature, while a conventional flat display panel is retained. We also redefine the region of the elemental image corresponding to each elemental lens. As a result, the conventional viewing-angle analysis must be substantially modified, and the proposed system achieves a much wider viewing angle.

2. Principle of the proposed method

Figure 2(a) shows the configuration of the proposed InIm system and each elemental image region. The lens array is curved laterally with a uniform radius of curvature. According to this curvature, the region of the elemental image corresponding to each elemental lens is redefined: we draw straight lines that join the center of the curvature circle of the lens array to the edges of each elemental lens and extend them until they intersect the display panel. The area of each elemental image is then the area between the intersections, as shown in Fig. 2(a). Thus the area of an elemental image becomes larger as the corresponding elemental lens moves farther from the central axis. Figure 2(b) shows examples of computer-generated elemental images for the conventional flat lens array system and for the proposed curved lens array system. To calculate elemental images for the curved lens array CGII system, we use ray optics in a reverse manner: for each point of the object to be integrated, we follow an imaginary ray that originates from the object point, passes through the center of each lens, and meets the display panel. The meeting point on the display panel is the elemental image point corresponding to that object point. We repeat this process for all lenses and all points of the object. The square blocks in Fig. 2(b) indicate the sizes of the elemental image regions. In the curved lens array case the elemental image regions become wider for elemental lenses located horizontally farther from the central axis, whereas in the flat lens array case the regions are all the same size.
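The reverse-ray calculation described above can be sketched in a few lines of Python. This is a simplified 2-D pinhole model of our own; the function names are illustrative assumptions, and the parameter values are taken from the experiment in Sect. 3:

```python
import math

# 2-D cross-section model (names and set-up are illustrative assumptions):
# the curvature centre of the lens array sits at the origin, the elemental
# lenses lie on a circle of radius d, and the flat panel is the line
# z = -(d + g).
d = 0.10        # radius of curvature of the lens array [m] (10 cm, as in Sect. 3)
g = 0.022       # central gap, lens array to panel [m] (the 22 mm focal length)
pitch = 0.010   # elemental-lens pitch [m]

theta = math.atan(pitch / (2 * d))   # half-angle subtended by one lens

def lens_center(n):
    """Centre of the n-th elemental lens (n = 0 on the central axis)."""
    a = 2 * n * theta
    return (d * math.sin(a), -d * math.cos(a))

def elemental_image_point(obj, n):
    """Reverse ray: object point -> through the n-th lens centre -> panel."""
    cx, cz = lens_center(n)
    ox, oz = obj
    t = (-(d + g) - oz) / (cz - oz)      # ray parameter where it hits the panel
    return (ox + t * (cx - ox), -(d + g))

# Elemental image points of one object point near the curvature centre:
for n in range(-2, 3):
    x, _ = elemental_image_point((0.005, 0.02), n)
    print(f"lens {n:+d}: elemental image at x = {x * 1000:.2f} mm")
```

Repeating this loop over every lens and every object point yields the full set of elemental images, as in Fig. 2(b).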

Fig. 2. (a) Configuration of the curved lens array system; (b) computer-generated elemental images for the conventional case (top) and the curved lens array case (bottom).

The main feature of the proposed system, which is responsible for the enhancement of the viewing angle, is that every elemental lens has a corresponding elemental image in the horizontal direction, provided the object is located in the vicinity of the curvature center of the lens array. In the conventional flat lens array case, only the elemental lenses near the central axis of the object have whole elemental images centered in their corresponding elemental image regions, as shown in Fig. 2(b). As an elemental lens moves farther from the central axis, its elemental image shifts with respect to the center of its region. When a portion of an elemental image crosses into a neighboring region, that portion is removed to avoid flipping, and only the part remaining within its own region is displayed. As a result, all but a few elemental lenses near the central axis have no elemental images, as can be seen in Fig. 1 and in the top figure of Fig. 2(b). In the curved lens array case, however, owing to the curved structure of the lens array and the new definition of the elemental image region, every elemental image is centered in its region without any horizontal shift. Thus there are as many elemental images as elemental lenses in the horizontal direction, the direction along which the lens array is curved; compare the two sets of elemental images in Fig. 2(b). The number of elemental images can be increased further by using a lens array with smaller, and hence more numerous, lenses. In the conventional method, only the elemental lenses in 6 columns have elemental images, and among them only 2 columns have complete elemental images in the horizontal direction; in the proposed method, all elemental lenses have complete elemental images in the horizontal direction.

Since every elemental lens has a corresponding elemental image in the curved case, the observer can see the image up to the angle covered by the boundary elemental lenses. Therefore the horizontal viewing angle is widened remarkably.

Fig. 3. Enhanced viewing angle of the curved lens array system.

In the curved lens array system, however, the observer can see not only the integrated image but also flipped images at the same time; observer-2 in Fig. 3 illustrates this situation. Flipped images are observed when elemental images are seen through neighboring lenses, so each elemental image must be prevented from being observed through a neighboring elemental lens. We therefore set up barriers vertically between the curved lens array and the display panel, along the lines that connect the edges of each elemental lens to the borders of the corresponding elemental image region. Figure 4 shows the proposed scheme with barriers. The barriers eliminate the flipped images without affecting the viewing angle of the curved lens array system: even with the barriers, the observer can still observe the image over the whole range covered by the elemental lenses.
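The geometry of the redefined elemental image regions and of the barriers can be sketched numerically. This is our own illustrative 2-D calculation under the same thin-lens model (the curvature centre at the origin, lenses on a circle of radius d, panel at distance d + g); function names are hypothetical, and the parameter values come from Sect. 3:

```python
import math

d = 0.10        # radius of curvature of the lens array [m]
g = 0.022       # central gap between lens array and panel [m]
pitch = 0.010   # elemental-lens pitch [m]
theta = math.atan(pitch / (2 * d))   # half-angle subtended by one lens

def region_edges(n):
    """Panel x-coordinates bounding the n-th elemental image region:
    radial lines from the curvature centre through the lens edges,
    extended to the flat panel at distance d + g."""
    lo = (d + g) * math.tan((2 * n - 1) * theta)
    hi = (d + g) * math.tan((2 * n + 1) * theta)
    return lo, hi

def barrier_length(n):
    """Length of the barrier along the radial line between lenses n and
    n + 1, from the lens arc (radius d) down to the panel."""
    a = (2 * n + 1) * theta
    return (d + g) / math.cos(a) - d

for n in range(4):
    lo, hi = region_edges(n)
    print(f"region {n}: width {1000 * (hi - lo):.3f} mm, "
          f"barrier {1000 * barrier_length(n):.3f} mm")
```

Consistent with Sect. 2, the region width grows with |n|, and the barriers (and hence the lens-to-panel gap) lengthen toward the edges of the array.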

From the standpoint that the observer can see the image continuously, the proposed system would have no limitation in viewing angle if more elemental lenses were used. However, another condition limits the viewing angle in the curved lens array system. Since the lens array is curved, the gap, i.e., the distance from an elemental image to its corresponding elemental lens, increases as the elemental lens moves farther from the central axis. Therefore, beyond a certain angle, the elemental images observed through the corresponding lenses are no longer focused at the same image plane according to the lens law, and they are not integrated. We call this phenomenon gap mismatch. The viewing angle thus depends on the tolerable gap mismatch with which elemental images can still be integrated; as a result, the gap mismatch limits the viewing angle of the curved lens array system. As the ratio of the transverse image size to the radius of curvature of the lens array increases, the gap mismatch becomes severe at the boundaries and the viewing angle becomes limited. The gap mismatch may also cause some image distortion at the image boundaries when 3D images with appreciable longitudinal depth are displayed, because the deviation from the image plane (the plane in focus) becomes larger. The gap mismatch could be removed if a curved screen were used instead of the flat panel display, with the elemental images projected onto the curved screen and integrated through the curved lens array.

Fig. 4. Elimination of the flipped images with barriers in the curved lens array system.

For the nth elemental lens from the central elemental lens (we assign n = 0 to the central elemental lens), the gap mismatch G_n can be calculated as follows (see Fig. 3):

G_n = (g + d) [sec(2nθ) − 1],        (1)

where g is the distance from the central elemental lens to the display panel, d is the distance from the lens array to the object (i.e., the radius of curvature), and θ is given by

θ = arctan(φ / (2d)),        (2)

where φ is the pitch of each elemental lens. To observe the integrated image, observers should remain within the angle for which the gap mismatch is tolerable. If a reasonable maximum value of G_n is chosen, the maximum integer value n_max can be derived from Eq. (1), from which the viewing angle that accounts for the gap mismatch can be deduced. The viewing angle Ω is given by

Ω = n_max × 2 arctan(φ / (2d)).        (3)
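Equations (1)-(3) are easy to evaluate numerically. The sketch below uses the lens parameters of Sect. 3 (φ = 10 mm, d = 10 cm, g = 22 mm); the tolerable mismatch G_max is purely an illustrative assumption, not a value given in the paper:

```python
import math

g = 0.022       # gap from the central lens to the panel [m]
d = 0.10        # radius of curvature of the lens array [m]
phi = 0.010     # elemental-lens pitch [m]

theta = math.atan(phi / (2 * d))                 # Eq. (2)

def gap_mismatch(n):
    """Eq. (1): G_n = (g + d)[sec(2 n theta) - 1]."""
    return (g + d) * (1.0 / math.cos(2 * n * theta) - 1.0)

G_max = 0.006   # assumed tolerable gap mismatch: 6 mm (illustrative only)
n_max = 0
while gap_mismatch(n_max + 1) <= G_max:
    n_max += 1

omega = n_max * 2 * math.atan(phi / (2 * d))     # Eq. (3), in radians
print(f"n_max = {n_max}, one-side viewing angle = {math.degrees(omega):.1f} deg")
```

With these numbers the loop gives n_max = 3 and a one-side angle of about 17°, of the same order as the 16.5° measured in Sect. 3, which suggests that a mismatch of a few millimetres was tolerable in the experiment.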

3. Experimental results

In the experiments, a Fresnel lens array is used as the lens system. It consists of 13 × 13 square elemental Fresnel lenses, each with a width of 10 mm and a focal length of 22 mm. The radius of curvature of the curved lens array is 10 cm. Figure 5 shows the curved lens array used in the experiment. The pixel pitch of the liquid crystal display (LCD) panel is 0.31 mm in the horizontal direction and 0.32 mm in the vertical direction. As the observation angle increases, the corresponding elemental images move farther from their lenses, and the illumination of the elemental images becomes nonuniform along the horizontal direction. However, this effect is not severe because the LCD panel is bright enough to integrate the side elemental images within the viewing angle.

Fig. 5. Experimental setup without the barriers; the radius of curvature of the curved lens array is 10 cm.

The integrated image is located 8 cm in front of the lens array. The barriers are rectangular plates with a height of 14 cm; the width of each is the distance from the edge of each elemental lens to the border of the corresponding elemental image region on the display panel. Figure 6 shows the experimental results without (Fig. 6(a)) and with (Fig. 6(b)) the barriers. Both the integrated image and the flipped image appear when the barriers are absent, and the flipped image is efficiently removed by the barriers.

Fig. 6. Images integrated by the proposed method (a) without barriers and (b) with barriers. The integrated images are observed from the center.

To investigate the widening of the viewing angle, we compare the integrated image of the conventional scheme with that of the proposed scheme. Figure 7 shows integrated images observed from different viewing directions in the two schemes. The images observed from the right viewing region, not shown in Fig. 7, are similar to those from the left because the viewing region is symmetric. In the conventional scheme, the theoretical one-side viewing angle for this lens specification is 9.4° [15], and the effective one-side angle is 7° experimentally. Beyond 7°, portions of the apple image disappear; at 10° to the left the original image disappears and the flipped image appears. As the observation angle increases further, more of the flipped images appear, as shown for the conventional scheme in Fig. 7, and at around 15° only the flipped image is observed. In the proposed scheme, however, the integrated image can be seen even beyond the conventional viewing angle, as shown in Fig. 7. With increasing observation angle, the images are no longer integrated well because of the gap mismatch, as seen at 20° to the left in the proposed scheme in Fig. 7. Figure 8 shows that the effective one-side viewing angle is 16.5° experimentally. As a result, the viewing angle of the proposed system is more than twice that of the conventional system. Figure 9 presents a movie of the integrated images observed as the observer moves from one side of the viewing zone to the other in the proposed scheme. The integrated image of an apple is located 7 cm, and that of an orange 9 cm, in front of the lens array. Different perspectives of the images are seen continuously from different viewing directions. In this movie the grid of the lens array is visible because the depth of focus of the camera used to record the integrated image is larger than that of the human eye. Although the grid is a drawback of integral imaging, it is hardly noticeable behind the image when a human observes the integrated image directly. The camera movement may not look smooth; this is not due to the integrated images but to the difficulty of moving the camera uniformly in the horizontal direction while recording.

The results show a remarkable enhancement of the viewing angle in InIm. Since the lens array is curved only horizontally, only the horizontal viewing angle is enhanced in this experiment. The vertical viewing angle could be enhanced similarly with a lens array that is also curved vertically; with a spherically shaped lens array, the viewing angle would be enhanced in both directions.

Fig. 7. Integrated images from different viewpoints (a) by the conventional method and (b) by the proposed method with barriers.

Fig. 8. Integrated images by the proposed method from (a) left 16.5° (b) center (c) right 16.5°.

Fig. 9. Movie of integrated images observed from different viewing directions [1.85MB].

4. Conclusion

In conclusion, a method to enhance the viewing angle in InIm has been proposed and demonstrated experimentally. By use of a curved lens array and elemental images generated by CGII according to the curvature of the lens array, the viewing angle is expanded remarkably, and image flipping is eliminated by the barriers. Every elemental lens has a corresponding elemental image centered in its elemental image region, which is a unique characteristic compared with the conventional flat lens array scheme. If a flexible display panel that keeps the gap constant, or a lens array whose focal length is tunable, becomes available, we expect the gap mismatch to be overcome, enabling a 3D display system that is nearly free of viewing-angle limitations.

Acknowledgments

This work was supported by the Next-Generation Information Display R&D Center, one of the 21st Century Frontier R & D Programs funded by the Ministry of Science and Technology of Korea.

References and links

1. G. Lippmann, “La photographie integrale,” Comptes-Rendus Acad. Sci. 146, 446–451 (1908).

2. N. Davies, M. McCormick, and M. Brewin, “Design and analysis of an image transfer system using microlens arrays,” Opt. Eng. 33, 3624–3633 (1994).

3. M. McCormick and N. Davies, “Full natural colour 3D optical models by integral imaging,” Fourth International Conference on Holographic Systems, Components and Applications (Neuchatel, Switzerland, 1993), 237–242.

4. S. Manolache, A. Aggoun, M. McCormick, N. Davies, and S.-Y. Kung, “Analytical model of a three-dimensional integral image recording system that uses circular- and hexagonal-based spherical surface micro lenses,” J. Opt. Soc. Am. A 18, 1814–1821 (2001).

5. M. C. Forman, N. Davies, and M. McCormick, “Continuous parallax in discrete pixelated integral three-dimensional displays,” J. Opt. Soc. Am. A 20, 411–420 (2003).

6. J. Arai, F. Okano, H. Isono, and I. Yuyama, “Gradient-index lens array method based on real time integral photography for three-dimensional images,” Appl. Opt. 37, 2034–2045 (1998).

7. S.-W. Min, S. Jung, J.-H. Park, and B. Lee, “Three-dimensional display system based on computer-generated integral photography,” in Stereoscopic Displays and Virtual Reality Systems VIII, A. J. Woods, J. O. Merritt, and S. A. Benton, eds., Proc. SPIE 4297, 187–195 (2001).

8. S.-W. Min, S. Jung, J.-H. Park, and B. Lee, “Study for wide-viewing integral photography using an aspheric Fresnel-lens array,” Opt. Eng. 41, 2572–2576 (2002).

9. B. Lee, S. Jung, and J.-H. Park, “Viewing-angle-enhanced integral imaging using lens switching,” Opt. Lett. 27, 818–820 (2002).

10. S. Jung, J.-H. Park, H. Choi, and B. Lee, “Wide-viewing integral three-dimensional imaging by use of orthogonal polarization switching,” Appl. Opt. 42, 2513–2520 (2003).

11. S.-H. Shin and B. Javidi, “Viewing-angle enhancement of speckle-reduced volume holographic three-dimensional display by use of integral imaging,” Appl. Opt. 41, 5562–5567 (2002).

12. J.-S. Jang and B. Javidi, “Improvement of viewing angle in integral imaging by use of moving lenslet arrays with low fill factor,” Appl. Opt. 42, 1996–2002 (2003).

13. S.-W. Min, B. Javidi, and B. Lee, “Enhanced three-dimensional integral imaging system by use of double display devices,” Appl. Opt. 42, 4186–4195 (2003).

14. B. Javidi, S.-W. Min, and B. Lee, “Enhanced 3D color integral imaging using multiple display devices,” Proc. of IEEE LEOS Annual Meeting (San Diego, CA, 2001), 491–492.

15. J.-H. Park, S.-W. Min, S. Jung, and B. Lee, “Analysis of viewing parameters for two display methods based on integral photography,” Appl. Opt. 40, 5217–5232 (2001).
