Abstract

The interocular affine similarity of three-dimensional scenes is investigated, and a novel accelerated reconfiguration algorithm for intermediate-view polygon computer-generated holograms based on interocular affine similarity is proposed. Numerical simulations of full-color polygon computer-generated holograms demonstrate that the proposed intermediate-view reconfiguration algorithm is particularly useful for the computation of wide-viewing-angle polygon computer-generated holograms.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Three-dimensional (3D) imaging and display technologies have been in active development over the past two decades. The basic principle of 3D display technologies [1] is the utilization of binocular 3D cues in the human visual perception system, with interocular disparity being the most effective of these cues. In the classical sense, interocular disparity presumes that the parallax views of a 3D scene are completely different from each other.

In general, holographic 3D displays are considered the ultimate form of 3D display because they deliver the most natural 3D images with accommodation-vergence match [2]. This accommodation-vergence match is ascribed to the interocular disparity encoded in the CGH pattern. Computer-generated holograms (CGHs) for holographic 3D displays contain all of the information on the continuous parallax views of a 3D scene, which is recorded in a single two-dimensional continuous complex fringe pattern and produces the motion parallax effect as well as the accommodation-vergence match.

Representation theory and rapid calculation algorithms have been two of the main CGH research issues. Various CGH representation theories have been developed, such as point-cloud [3–8], ray-sampling [9–11], depth-map [12–14], and polygon [15–21] based CGH models. The polygon CGH is well known for its computational efficiency, rigor of modeling, and flexibility. The polygon CGH can be efficiently calculated using the fast Fourier transform (FFT), and the analytic theory of polygon CGHs has also steadily advanced [22,23]. The development of fast algorithms has focused on parallel implementation on parallel computing hardware and related algorithm development [23–28]. The reduction of the complexity of CGH algorithms through mathematical analysis [29–31] from an information-theoretic perspective is fundamentally important, but it is relatively rare in comparison with research on parallel computing.

From an information theory perspective on CGH, we need to introduce a concept that contrasts with interocular disparity: interocular similarity, whereby the different directional views of a 3D scene share a strong resemblance. Interocular similarity is worth analyzing in depth, since it gives new insight into the information content of a CGH, and its exploitation enables the acceleration of CGH computation. If the interocular similarity of a 3D scene with a finite viewing angle can be exploited to synthesize intermediate-view CGHs with a reduced total calculation amount, then a mathematical complexity reduction for the acceleration of CGH calculation can be expected. In this context, interocular similarity leads to the expectation that continuous parallax views share informational similarity, and that the actual informational capacity of a CGH can be smaller than the conventional space-bandwidth product [32, 33]. With this in mind, we can understand the space-bandwidth product as an upper bound on the amount of information containable in a finite-viewing-angle 3D image, because the conventional space-bandwidth product assumes that there is no relationship between adjacent views.

This fundamental information-theoretic perspective on CGH is the motivation of this paper, with the primary questions being how the interocular similarity of 3D objects can be used to develop an accelerated algorithm for CGH synthesis and how interocular similarity can be represented efficiently. This paper presents a theoretical analysis of the interocular similarity among adjacent holographic images with angular separation. The interocular similarity between adjacent views can be represented by the affine transform of corresponding points; this property is extensively investigated and extended to efficiently synthesize wide-view polygon CGHs. An application of the proposed method to 360-degree multi-view CGH content generation [34–38] is presented.

This paper is structured as follows. In Section 2, a geometric model of 3D scene perception is described. In Section 3, the affine transform analysis of the interocular similarity of a 3D scene is presented. In Section 4, an accelerated CGH algorithm based on interocular similarity is proposed, building on a wave-optic interpretation of the affine transformation for CGH calculation. Numerical experiments and the subsequent evaluation of the proposed accelerated CGH algorithm are presented with an example of 360-degree multi-view CGH content generation. Finally, concluding remarks are provided in Section 5.

2. Geometric model of three-dimensional scene perception

In this section, we present a geometric model of 3D scene perception and analyze the interocular similarity of a 3D scene. The focus of the analysis is the non-linear relationship between two different parallax views in retina space derived from the 3D scene perception model. The non-linear relationship can be linearly approximated by an affine transformation even for quite large angular separations between two views, a process referred to as the interocular affine similarity transform. The tolerance range of the interocular affine similarity is numerically analyzed using this transform. The interocular affine similarity established in this section is then applied to the accelerated CGH synthesis algorithm in Section 4.

A basic property of the visual perception system is that the monocular imaging system of the eye allows the viewer to see 3D objects by automatically adjusting its accommodation to a convergence point. In Fig. 1, two monocular imaging systems that share a convergence point are illustrated, with the global reference coordinate system and the local coordinate systems of the left and right eyes denoted as $xyz$, $u_1v_1w_1$, and $u_2v_2w_2$, respectively. When both eyes gaze at the convergence point, their foci are automatically adjusted to it. The perceived image in the eye varies with changes in eye position. Here, we develop a geometric model of this monocular imaging for an arbitrary location and rotation.

Fig. 1 Convergence and accommodation in the binocular visual perception system: (a) global and local coordinates and (b) adaptive global coordinate system.

Let us set the convergence point as $P_c=(x_c,y_c,z_c)$ and the projection center of the eye as $N=(x_0,y_0,z_0)$ in the global coordinate system, where the projection center $N$ is the center of the eye lens. In normal conditions, the unit vector $\mathbf{u}$ lies in the viewing plane, which is the plane spanned by the $\mathbf{u}$ and $\mathbf{w}$ vectors, and the unit vector $\mathbf{v}$ is normal to the viewing plane. The optic axis vector of the eye in the global coordinate system is given by $\mathbf{w}=(\cos\phi\sin\theta,\sin\phi\sin\theta,\cos\theta)$. The coordinates of the projection center $N$ then satisfy

$$N = \begin{pmatrix} x_0 \\ y_0 \\ z_0 \end{pmatrix} = \begin{pmatrix} x_c \\ y_c \\ z_c \end{pmatrix} + \begin{pmatrix} \cos\phi\sin\theta \\ \sin\phi\sin\theta \\ \cos\theta \end{pmatrix} t, \tag{1}$$
where the necessary parameters $t$, $\theta$, and $\phi$ are given by
$$t = \sqrt{(x_0-x_c)^2 + (y_0-y_c)^2 + (z_0-z_c)^2}, \tag{2}$$
$$(\cos\theta, \sin\theta) = \left( (z_0-z_c)/t,\ \sqrt{(x_0-x_c)^2+(y_0-y_c)^2}\,/t \right), \tag{3}$$
$$(\cos\phi, \sin\phi) = \left( (x_0-x_c)/(t\sin\theta),\ (y_0-y_c)/(t\sin\theta) \right). \tag{4}$$
The $v$-axis unit vector of the retina coordinate system $\mathbf{v}$ is obtained by
$$\mathbf{v} = \bigl( -\cos\phi\cos\theta\sin\tau - \sin\phi\cos\tau,\ -\sin\phi\cos\theta\sin\tau + \cos\phi\cos\tau,\ \sin\theta\sin\tau \bigr), \tag{5}$$
where $\tau$ is the tilt angle of the eye, adjusted to make $\mathbf{u}$ parallel to the $xy$ plane. Because the eye's focus remains on the convergence point, when the convergence point moves, the focal length of the eye varies in accordance with
$$f = \frac{d_e t}{d_e + t}, \tag{6}$$
where $d_e$ is the distance between the eye lens and the retina plane, and $t$ is the distance between the convergence point and the eye plane from Eq. (2). In addition, in order to describe the wave-optic imaging and CGH synthesis theory consistently within the same framework, we need to define an adaptive global coordinate system for the eye, as seen in Fig. 1(b). In the adaptive global coordinate system of the eye, $\bar{x}\bar{y}\bar{z}$, the 3D scene is rotated so that the optic axis of the eye is aligned with the global coordinate $z$-axis. The optic axis $\bar{z}$ is matched to the optic axis $w$, and the plane $\bar{x}\bar{y}$ is parallel to the plane $uv$. As a result, the adaptive global coordinates are obtained as
$$\begin{pmatrix} \bar{x} \\ \bar{y} \\ \bar{z} \end{pmatrix} = \begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{pmatrix} \begin{pmatrix} x - x_0 \\ y - y_0 \\ z - z_0 \end{pmatrix} + \begin{pmatrix} \bar{x}_0 \\ \bar{y}_0 \\ \bar{z}_0 \end{pmatrix}, \tag{7}$$
where the rotation matrix and the projection center $N$ in the adaptive global coordinate system are set as
$$\begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{pmatrix} = \begin{pmatrix} \cos\tau & \sin\tau & 0 \\ -\sin\tau & \cos\tau & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} \cos\theta & 0 & -\sin\theta \\ 0 & 1 & 0 \\ \sin\theta & 0 & \cos\theta \end{pmatrix} \begin{pmatrix} \cos\phi & \sin\phi & 0 \\ -\sin\phi & \cos\phi & 0 \\ 0 & 0 & 1 \end{pmatrix}, \tag{8}$$
$$(\bar{x}_0, \bar{y}_0, \bar{z}_0) = \left( 0,\ 0,\ \sqrt{x_0^2 + y_0^2 + z_0^2} \right). \tag{9}$$
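To make the geometry concrete, the following minimal Python/NumPy sketch evaluates Eqs. (2)-(4) and (7)-(9); the function names are ours, and the signs of the rotation matrices follow the convention reconstructed above, not the authors' code.

```python
import numpy as np

def eye_geometry(P_c, N, tau=0.0):
    """Viewing distance t, angles (theta, phi) of Eqs. (2)-(4), and the
    adaptive-frame rotation matrix R of Eq. (8)."""
    d = np.asarray(N, float) - np.asarray(P_c, float)  # eye relative to convergence point
    t = np.linalg.norm(d)                              # Eq. (2)
    cos_t, sin_t = d[2] / t, np.hypot(d[0], d[1]) / t  # Eq. (3)
    if sin_t > 1e-12:                                  # Eq. (4); phi is undefined on-axis
        cos_p, sin_p = d[0] / (t * sin_t), d[1] / (t * sin_t)
    else:
        cos_p, sin_p = 1.0, 0.0
    Rz_tau = np.array([[np.cos(tau), np.sin(tau), 0], [-np.sin(tau), np.cos(tau), 0], [0, 0, 1]])
    Ry_th  = np.array([[cos_t, 0, -sin_t], [0, 1, 0], [sin_t, 0, cos_t]])
    Rz_ph  = np.array([[cos_p, sin_p, 0], [-sin_p, cos_p, 0], [0, 0, 1]])
    return t, Rz_tau @ Ry_th @ Rz_ph                   # Eq. (8)

def to_adaptive(P, N, R):
    """Eq. (7): a scene point P in the adaptive global frame; Eq. (9)
    places the projection center on the new z-axis."""
    N = np.asarray(N, float)
    return R @ (np.asarray(P, float) - N) + np.array([0.0, 0.0, np.linalg.norm(N)])
```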
The imaging of a 3D object in the retina space of the observer’s eye is interpreted by the adaptive coordinate system, which is schematically illustrated in Fig. 2.

Fig. 2 Image transport of an eye system: (a) collinear transform and (b) rotational transform in object space and retina space.

For simplicity, from this point forward, the notation $xyz$ will be used instead of $\bar{x}\bar{y}\bar{z}$ to represent the adaptive global coordinate system. Consider the imaging of a 3D object through a single eye as illustrated in Fig. 2, where a triangular facet in object space is imaged in the retina space of the viewer's eye. The center of mass of the triangular facet is denoted by $(x_c,y_c,z_c)$, and the eye focus is set to the center of mass. A triangular facet in object space is delivered to retina space through the geometric imaging transformation [16]. The focal length of the eye lens $f$ is set by
$$\frac{1}{f} = \frac{1}{d_1 - z_c} + \frac{1}{d_2}. \tag{10}$$
Set $D_1 = d_1 - z_c$ and then $D_2 = 1/(1/f - 1/D_1)$. The corresponding imaging point in retina space $(u,v,w)$ is obtained by
$$(u, v, w) = \left( -\frac{D_2}{D_1}x,\ -\frac{D_2}{D_1}y,\ D_2 - d_2 \right). \tag{11}$$
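A short numerical sketch of the imaging transform of Eqs. (10) and (11) follows; the function name is ours, the image-inversion sign is restored from the thin-lens model, and evaluating the object distance per point (so that off-focus depths defocus) is our reading of the text.

```python
import numpy as np

def retina_point(p, z_c, d1, d2):
    """Map an object-space point p = (x, y, z) into retina space (u, v, w)
    when the eye focuses on the facet centroid depth z_c."""
    f = 1.0 / (1.0 / (d1 - z_c) + 1.0 / d2)  # Eq. (10): focus locked to the centroid
    D1 = d1 - p[2]                           # this point's object distance
    D2 = 1.0 / (1.0 / f - 1.0 / D1)          # its conjugate image distance
    return np.array([-D2 * p[0] / D1, -D2 * p[1] / D1, D2 - d2])  # Eq. (11)
```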

The geometric imaging transformation transports the triangular facet with three apex points $P_1(x_1,y_1,z_1)$, $P_2(x_2,y_2,z_2)$, and $P_3(x_3,y_3,z_3)$ to the triangular facet with apexes $P_{r1}(u_1,v_1,w_1)$, $P_{r2}(u_2,v_2,w_2)$, and $P_{r3}(u_3,v_3,w_3)$ in retina space. It is assumed here that all the points on the triangular facet $P_1P_2P_3$ in object space are mapped onto the flat triangular facet $P_{r1}P_{r2}P_{r3}$ in retina space. Here, a geometric imaging transform between the textures on $P_1P_2P_3$ and $P_{r1}P_{r2}P_{r3}$ is developed. The triangles $P_1P_2P_3$ and $P_{r1}P_{r2}P_{r3}$ specify two planes in object and retina space, respectively, as

$$\cos\phi\sin\theta\,(x - x_c) + \sin\phi\sin\theta\,(y - y_c) + \cos\theta\,(z - z_c) = 0, \tag{12}$$
$$\cos\phi_r\sin\theta_r\,(u - u_c) + \sin\phi_r\sin\theta_r\,(v - v_c) + \cos\theta_r\,(w - w_c) = 0, \tag{13}$$
where $\theta$ and $\phi$ are the longitudinal and azimuthal angles of the object-space local coordinate system, and $\theta_r$ and $\phi_r$ are the longitudinal and azimuthal angles of the retina-space local coordinate system. $(x_c,y_c,z_c)$ and $(u_c,v_c,w_c)$ are the centers of mass of the triangles $P_1P_2P_3$ and $P_{r1}P_{r2}P_{r3}$. Two corresponding points $(x,y,z)$ and $(u,v,w)$ are connected by the collinear condition through the projection center $(0,0,d_1)$, which is expressed by
$$\begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} 0 - u \\ 0 - v \\ d_1 - (d_1 + d_2 + w) \end{pmatrix} t + \begin{pmatrix} u \\ v \\ d_1 + d_2 + w \end{pmatrix}, \tag{14}$$
where the parameter $t$ is obtained by substituting Eq. (14) into Eq. (12):
$$t = \frac{(x_c - u)\cos\phi\sin\theta + (y_c - v)\sin\phi\sin\theta + (z_c - w - d_1 - d_2)\cos\theta}{-u\cos\phi\sin\theta - v\sin\phi\sin\theta + \left[ d_1 - (d_1 + d_2 + w) \right]\cos\theta}. \tag{15}$$
In a similar manner, $(u,v,w)$ is solved as
$$\begin{pmatrix} u \\ v \\ w \end{pmatrix} = \frac{1}{1 - t} \begin{pmatrix} x \\ y \\ z - d_1 \end{pmatrix} + \begin{pmatrix} 0 \\ 0 \\ -d_2 \end{pmatrix} = s \begin{pmatrix} x \\ y \\ z - d_1 \end{pmatrix} + \begin{pmatrix} 0 \\ 0 \\ -d_2 \end{pmatrix}, \tag{16}$$
where the parameter $s$ is obtained by substituting Eq. (16) into Eq. (13):
$$s = \frac{u_c\cos\phi_r\sin\theta_r + v_c\sin\phi_r\sin\theta_r + (d_2 + w_c)\cos\theta_r}{\cos\phi_r\sin\theta_r\,x + \sin\phi_r\sin\theta_r\,y + (z - d_1)\cos\theta_r}. \tag{17}$$
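For reference, Eqs. (15) and (17) reduce to simple dot products with the facet normals of Eqs. (12) and (13); the helper names below are ours.

```python
import numpy as np

def facet_normal(theta, phi):
    """Unit normal used in the plane equations (12)-(13)."""
    return np.array([np.cos(phi) * np.sin(theta), np.sin(phi) * np.sin(theta), np.cos(theta)])

def param_t(u, v, w, cen, theta, phi, d1, d2):
    """Eq. (15): line parameter t for a retina point (u, v, w); cen = (xc, yc, zc)."""
    n = facet_normal(theta, phi)
    num = n @ (np.asarray(cen, float) - np.array([u, v, d1 + d2 + w]))
    den = n @ np.array([-u, -v, d1 - (d1 + d2 + w)])
    return num / den

def param_s(x, y, z, cen_r, theta_r, phi_r, d1, d2):
    """Eq. (17): scale s for an object point (x, y, z); cen_r = (uc, vc, wc)."""
    n = facet_normal(theta_r, phi_r)
    num = n @ np.array([cen_r[0], cen_r[1], d2 + cen_r[2]])
    den = n @ np.array([x, y, z - d1])
    return num / den
```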
Each triangular facet has its own local coordinate system with the origin set to the center of mass, denoted by $x'y'z'$ and $u'v'w'$ in Fig. 2(b) [15]. The local coordinates of a point in object space are obtained from the global coordinates as
$$\begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} = \begin{pmatrix} \cos\theta\cos\phi & \cos\theta\sin\phi & -\sin\theta \\ -\sin\phi & \cos\phi & 0 \\ \sin\theta\cos\phi & \sin\theta\sin\phi & \cos\theta \end{pmatrix} \begin{pmatrix} x - x_c \\ y - y_c \\ z - z_c \end{pmatrix}. \tag{18}$$
The local coordinates of the corresponding point in retina space are given by
$$\begin{pmatrix} u' \\ v' \\ w' \end{pmatrix} = \begin{pmatrix} \cos\theta_r\cos\phi_r & \cos\theta_r\sin\phi_r & -\sin\theta_r \\ -\sin\phi_r & \cos\phi_r & 0 \\ \sin\theta_r\cos\phi_r & \sin\theta_r\sin\phi_r & \cos\theta_r \end{pmatrix} \begin{pmatrix} u - u_c \\ v - v_c \\ w - w_c \end{pmatrix}. \tag{19}$$
Here, $(x,y,z)$ and $(u,v,w)$ have the corresponding local coordinates $(x',y')$ and $(u',v')$, respectively. $(x_c,y_c,z_c)$ and $(u_c,v_c,w_c)$ are the centers of mass of the triangles $P_1P_2P_3$ and $P_{r1}P_{r2}P_{r3}$, respectively. The local coordinates of object space are functionally related to those of retina space in the form $x'=x'(u',v')$ and $y'=y'(u',v')$ or, inversely, $u'=u'(x',y')$ and $v'=v'(x',y')$. $(x',y',0)$ is solved for $(u',v',0)$ as
$$\begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} = \begin{pmatrix} \cos\theta\cos\phi & \cos\theta\sin\phi & -\sin\theta \\ -\sin\phi & \cos\phi & 0 \\ \sin\theta\cos\phi & \sin\theta\sin\phi & \cos\theta \end{pmatrix} \left[ \begin{pmatrix} 0 - u \\ 0 - v \\ d_1 - (d_1 + d_2 + w) \end{pmatrix} t + \begin{pmatrix} u - x_c \\ v - y_c \\ d_1 + d_2 + w - z_c \end{pmatrix} \right], \tag{20}$$
where $t$ is given by Eq. (15) and $(u,v,w)$ is represented by
$$\begin{pmatrix} u \\ v \\ w \end{pmatrix} = \begin{pmatrix} \cos\theta_r\cos\phi_r & -\sin\phi_r & \sin\theta_r\cos\phi_r \\ \cos\theta_r\sin\phi_r & \cos\phi_r & \sin\theta_r\sin\phi_r \\ -\sin\theta_r & 0 & \cos\theta_r \end{pmatrix} \begin{pmatrix} u' \\ v' \\ w' \end{pmatrix} + \begin{pmatrix} u_c \\ v_c \\ w_c \end{pmatrix}. \tag{21}$$
Inversely, $(u',v',0)$ is solved for $(x',y',0)$ so that
$$\begin{pmatrix} u' \\ v' \\ w' \end{pmatrix} = \begin{pmatrix} \cos\theta_r\cos\phi_r & \cos\theta_r\sin\phi_r & -\sin\theta_r \\ -\sin\phi_r & \cos\phi_r & 0 \\ \sin\theta_r\cos\phi_r & \sin\theta_r\sin\phi_r & \cos\theta_r \end{pmatrix} \begin{pmatrix} u - u_c \\ v - v_c \\ w - w_c \end{pmatrix}, \tag{22}$$
where $s$ is given by Eq. (17) and $(u,v,w)$ is represented by
$$\begin{pmatrix} u \\ v \\ w \end{pmatrix} = s \left\{ \begin{pmatrix} \cos\theta\cos\phi & \cos\theta\sin\phi & -\sin\theta \\ -\sin\phi & \cos\phi & 0 \\ \sin\theta\cos\phi & \sin\theta\sin\phi & \cos\theta \end{pmatrix}^{-1} \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} + \begin{pmatrix} x_c \\ y_c \\ z_c \end{pmatrix} \right\} + \begin{pmatrix} 0 \\ 0 \\ -d_2 - s d_1 \end{pmatrix}. \tag{23}$$
From Eqs. (20), (21), (22), and (23), we have the set of mapping functions relating the local coordinates of object space to those of retina space, $x'=x'(u',v')$ and $y'=y'(u',v')$ or, inversely, $u'=u'(x',y')$ and $v'=v'(x',y')$. As a result, a non-linear mapping is established between the local coordinate systems of object space $(x',y',0)$ and retina space $(u',v',0)$.

Figure 3(a) depicts the simulation setup used to verify the visual perception of a 3D object, consisting of a single rectangular background plane and a triangular facet positioned slightly apart from the rectangular plane. The left and right eyes observe this scene simultaneously.

Fig. 3 (a) Schematic design of the computational simulation to verify the non-linear conversion relationship between two eyes, and (b) the retina image and non-linear grid map for the left and right eyes.

By using the non-linear mapping $u'=u'(x',y')$ and $v'=v'(x',y')$, we can draw a non-linear grid on the retina plane that is mapped from the uniform grid of the object surface. Figure 3 presents the mappings of a uniform grid drawn on the rectangular facet into the retina planes of the two separated eyes, along with the observed images with different parallax for the two distinct eye positions. This perception process can be visualized as the mapping of a uniform grid in object local space to a non-uniform grid in retina local space, in which the uniform grid image is stretched asymmetrically on the retina imaging plane and its shape changes with position. The simulation in Fig. 3 illustrates two processes: (1) how the triangle looks on the image plane of each camera and (2) how a uniform grid on the local coordinates of the triangular facet floating in global object space is mapped to the local coordinates of the imaged triangular facets for both eyes. The first row of the chart in Fig. 3(b) shows that the perspectives of the two eyes differ for the same scene. The second row of the chart shows that the uniform grid on the local coordinates of object space is non-linearly mapped to that of each retina space. The two non-linear grids also exhibit different patterns because the locations and view directions of the two eyes differ.

It is important to consider the coordinate transform of a point from the local coordinate system of a facet to the local coordinate system of the adaptive global coordinate system. This relationship is described by

$$\begin{pmatrix} x'_n \\ y'_n \\ z'_n \end{pmatrix} = \mathrm{GtoL}_n \cdot \mathbf{R} \cdot \mathrm{LtoG} \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix}. \tag{24}$$

The full derivation of Eq. (24), with the definitions of $\mathrm{GtoL}_n$, $\mathbf{R}$, and $\mathrm{LtoG}$, is given in the Appendix. From Eq. (24), the redefined grid $(x'_n, y'_n, z'_n)$ on the adaptive local coordinates is solved for the uniform grid $(x', y', z')$ of the original local coordinates.
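Since all three factors in Eq. (24) are pure rotations, the composition is a one-liner; this sketch (our names, our sign convention for the global-to-local matrix) mirrors the Appendix derivation, in which the translation offsets cancel.

```python
import numpy as np

def gtol(theta, phi):
    """Global-to-local rotation of Eqs. (18), (51), and (54)."""
    ct, st, cp, sp = np.cos(theta), np.sin(theta), np.cos(phi), np.sin(phi)
    return np.array([[ct * cp, ct * sp, -st],
                     [-sp, cp, 0.0],
                     [st * cp, st * sp, ct]])

def local_to_adaptive_local(p_local, theta, phi, R, theta_n, phi_n):
    """Eq. (24): LtoG is the transpose (inverse) of the orthogonal GtoL;
    R is the adaptive-frame rotation of Eq. (8)."""
    return gtol(theta_n, phi_n) @ R @ gtol(theta, phi).T @ np.asarray(p_local, float)
```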

3. Interocular affine similarity of a three-dimensional scene

If a triangular facet has a plain texture, the observer will notice variation in its shape and shading in response to spatial changes in the observer's location and view direction. However, for a textured triangular facet, the observer can perceive not only changes in shape and shading but also deformation of the texture pattern. As depicted in Fig. 4, an observer located at position A, which is close to the normal vector of the triangular facet, sees a mostly undistorted texture pattern. However, an observer located at position C perceives a highly distorted texture pattern because location C is far from the normal axis of the triangular facet.

Fig. 4 (a) Disparity in observed texture patterns at different locations. (b) The approximately linear relationship among the adjacent observation points.

How an arbitrary texture pattern on a triangular facet floating in object space is distorted in the imaged triangular facet of retina space is fully explained by the geometric mapping model developed in Section 2. According to the visual perception model, the shape and texture pattern of the 3D object vary with changes in position. However, the observer can be thought to perceive similar scenes when the observation location or view direction does not change dramatically, meaning that interocular similarity exists between weakly separated observation points.

In the context of a holographic 3D display, the images observed at two such observation points share the same holographic information through this similarity. From this point of view, we suppose that the holographic image observed at the original point has an approximately linear conversion relationship with the image observed at a neighboring point. Using this supposed linear relationship between the two images, it is possible to approximate the holographic information at the neighboring point by reconfiguring the information of the original point [29]. We employ the affine transformation to represent the approximate linear relationship among adjacent observation points. This strategy is illustrated in Fig. 4(b). From a practical point of view, it is expected to reduce the computational complexity of the CGH algorithm so that the computation speed of polygon CGHs can be dramatically increased.

Here, let us develop the mathematical formulation of this strategy based on the geometric mapping transformations of the previous section, $(x'(u',v'), y'(u',v'), 0)$ and $(u'(x',y'), v'(x',y'), 0)$. First, we set up the referential retina space and its adjacent retina space, denoted by $(u'_{ref}, v'_{ref})$ and $(u'_{adj}, v'_{adj})$, respectively, as shown in Fig. 4(b). Although the two triangles on the local coordinates of the referential and adjacent retina spaces would appear to have different shapes, their relationship is described by the affine similarity transformation. After determining the three apexes of the triangular facets, $(u'_{ref,1}, v'_{ref,1})$, $(u'_{ref,2}, v'_{ref,2})$, and $(u'_{ref,3}, v'_{ref,3})$ in the referential retina space and the corresponding apexes $(u'_{adj,1}, v'_{adj,1})$, $(u'_{adj,2}, v'_{adj,2})$, and $(u'_{adj,3}, v'_{adj,3})$ in the adjacent retina space for a target triangular facet in object space, their relationship is written as

$$\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} u'_{ref,1} \\ v'_{ref,1} \end{pmatrix} + \begin{pmatrix} b_1 \\ b_2 \end{pmatrix} = \begin{pmatrix} u'_{adj,1} \\ v'_{adj,1} \end{pmatrix}, \tag{25}$$
$$\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} u'_{ref,2} \\ v'_{ref,2} \end{pmatrix} + \begin{pmatrix} b_1 \\ b_2 \end{pmatrix} = \begin{pmatrix} u'_{adj,2} \\ v'_{adj,2} \end{pmatrix}, \tag{26}$$
$$\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} u'_{ref,3} \\ v'_{ref,3} \end{pmatrix} + \begin{pmatrix} b_1 \\ b_2 \end{pmatrix} = \begin{pmatrix} u'_{adj,3} \\ v'_{adj,3} \end{pmatrix}. \tag{27}$$
Equations (25)-(27) are combined into a single matrix equation,
$$\begin{pmatrix} u'_{ref,1} & v'_{ref,1} & 0 & 0 & 1 & 0 \\ 0 & 0 & u'_{ref,1} & v'_{ref,1} & 0 & 1 \\ u'_{ref,2} & v'_{ref,2} & 0 & 0 & 1 & 0 \\ 0 & 0 & u'_{ref,2} & v'_{ref,2} & 0 & 1 \\ u'_{ref,3} & v'_{ref,3} & 0 & 0 & 1 & 0 \\ 0 & 0 & u'_{ref,3} & v'_{ref,3} & 0 & 1 \end{pmatrix} \begin{pmatrix} a_{11} \\ a_{12} \\ a_{21} \\ a_{22} \\ b_1 \\ b_2 \end{pmatrix} = \begin{pmatrix} u'_{adj,1} \\ v'_{adj,1} \\ u'_{adj,2} \\ v'_{adj,2} \\ u'_{adj,3} \\ v'_{adj,3} \end{pmatrix}, \tag{28}$$
and it is solved to produce the conversion relationship between the two local coordinate systems of the referential and adjacent retina spaces,
$$\begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} u'_{ref} \\ v'_{ref} \end{pmatrix} + \begin{pmatrix} b_1 \\ b_2 \end{pmatrix} = \begin{pmatrix} u'_{adj} \\ v'_{adj} \end{pmatrix}. \tag{29}$$
The previous section established how the uniform grid on the local coordinates of object space is non-linearly mapped to that of retina space. From Fig. 3(b), the mapped grids on the local coordinates of the left and right eyes have different aspects because their positions and view directions clearly differ. However, if the two eyes are located near each other or their view directions are not significantly different, we are able to define the conversion relationship between their local coordinates using Eq. (29). It should be noted that this assumption necessarily introduces some error, which is estimated below.
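Equation (28) is a 6×6 linear system that can be solved directly from the three apex correspondences; a minimal sketch, with names of our choosing and illustrative apex coordinates:

```python
import numpy as np

def affine_from_triangles(ref, adj):
    """Solve Eq. (28) for (a11, a12, a21, a22, b1, b2); ref and adj are
    3x2 arrays of corresponding apexes (u', v')."""
    M = np.zeros((6, 6))
    for i, (u, v) in enumerate(np.asarray(ref, float)):
        M[2 * i]     = [u, v, 0, 0, 1, 0]
        M[2 * i + 1] = [0, 0, u, v, 0, 1]
    a11, a12, a21, a22, b1, b2 = np.linalg.solve(M, np.asarray(adj, float).reshape(6))
    return np.array([[a11, a12], [a21, a22]]), np.array([b1, b2])

# Illustrative apexes (hypothetical numbers): the adjacent triangle is a
# slightly sheared copy of the referential one.
ref = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
adj = [(0.02, 0.01), (0.98, 0.08), (-0.05, 0.93)]
A, b = affine_from_triangles(ref, adj)
p_adj = A @ np.array([0.3, 0.3]) + b   # Eq. (29) applied to an interior point
```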

The validity of the affine interocular similarity is analyzed with a numerical simulation, in which it is assumed that the observer watches the center of a triangular facet lying in the $xy$ plane while moving in a designated observation section, as shown in Fig. 5(a). Four observation sections are set up, designated in terms of the longitudinal angle $\theta$ (0, 15, 30, and 45 degrees) and the azimuthal angle $\phi$ (−15 to 15 degrees). It is also assumed that the reference point of each observation section is located in the middle of that section.

Fig. 5 (a) Schematic diagram for analyzing interocular similarity and (b) the comparison of two grids calculated by the exact and approximate methods.

Under these circumstances, let us use an example to clarify the purpose of this simulation. When the observer is located at $\theta=15°$ and $\phi=15°$, we can determine how a uniform grid on the local coordinates of the triangular facet in object space is mapped to the non-uniform grid of retina space. There are two ways to represent the non-uniform grid in retina space. The first is the exact method using $(u'(x',y'), v'(x',y'), 0)$. The other is the approximate method using the affine transformation of Eq. (29).

The accuracy and tolerance of the approximate method are examined by comparing it with the exact method. Figure 5(b) presents two overlapping grids calculated by the exact and approximate methods, colored red and blue, respectively. It can be observed that the overall shapes of the two grids are similar. However, there is a small difference between them around the outer edge, indicated by the shaded area A in Fig. 5(b). The effective portion of the total grid is ultimately restricted to a finite interior area of the triangle in the local coordinates of retina space. In the shaded area B of Fig. 5(b), the two grids closely match around the center, where the triangle is located.

The validity of the approximate method is evaluated using the simulation analysis shown in Fig. 6. After deriving the two grids using the exact and approximate methods for a particular view point, we calculated the root mean square error (RMSE). RMSE graphs were then constructed for the following two cases: (1) the RMSE over all parts of the two grids and (2) the RMSE over the interior region of the triangle, as shown in Figs. 6(b) and 6(c), respectively. The RMSE of case (1) is expected to be larger than that of case (2). In the calculation of the RMSE, all values of cases (1) and (2) are normalized by the maximum value of case (1).

Fig. 6 Analysis results for interocular similarity: (a) two comparison grids calculated by the exact and approximate methods, (b) the RMS error graph for the total area, and (c) the RMS error for the interior area of the triangle.

The RMSE tends to increase exponentially as an observation point moves further from the reference point in the $\phi$ direction. When $\phi$ is fixed, the RMSE also increases proportionally with $\theta$. Thus, we need to consider the applicable scope, including a reference point and its adjacent points, before applying the proposed method to calculate multi-view CGHs. Nevertheless, the RMSE of case (2) is definitely smaller than that of case (1). This means that the approximation method is sufficiently reliable if the triangle is small enough to be covered by the affine transform within a reasonable tolerance. In practice, the unit triangular facets that make up a 3D object are sufficiently small, since small facets are required to represent the object accurately with triangle meshes. If a unit triangle is too large to exploit the proposed method, the triangular facet should be divided into smaller triangles.
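The RMSE evaluation described above amounts to a masked vector norm; a short sketch, under the assumption that the exact and affine-approximated grids are stored as (N, 2) point arrays (names are ours):

```python
import numpy as np

def grid_rmse(exact, approx, interior_mask=None):
    """RMSE between the exact and affine-approximated grids. Passing
    interior_mask restricts the error to case (2), the triangle interior;
    the resulting values can then be normalized by the maximum case-(1)
    RMSE, as in Fig. 6."""
    err = np.linalg.norm(np.asarray(exact) - np.asarray(approx), axis=1)
    if interior_mask is not None:
        err = err[interior_mask]
    return np.sqrt(np.mean(err ** 2))
```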

4. Affine-similarity transformation of holographic three-dimensional image light fields

In the polygon CGH synthesis theory [15, 16] that we have developed in previous papers, CGH patterns are obtained by propagating the observable holographic image in the retina plane to the CGH plane through the inverse cascaded generalized Fresnel transform [15]. Therefore, when we calculate CGH patterns, the majority of the computation time is spent obtaining the holographic image in the retina plane. A complex process is required to calculate the observed image in the retina plane because the light field distributions emitted by all of the unit triangles that make up the 3D object have to be synthesized in the retina plane. In particular, for multi-view CGH calculations, the computational complexity can be exceptionally high because it is proportional to the number of views to be recorded in the CGH pattern. However, the similarity under the affine transformation can be exploited to significantly improve multi-view CGH calculation.

In this section, an affine-reconfigured polygon CGH is formulated, and the validity of the affine approximation and its effect on improving efficiency are tested against an exactly re-computed CGH model. The approximate light field distribution of the adjacent retina space is derived by referring to that of the referential retina space. First, the angular spectrum representations of the triangular facets $P'_{ref,1}P'_{ref,2}P'_{ref,3}$ and $P'_{adj,1}P'_{adj,2}P'_{adj,3}$ are given in retina space as

$$F(u'_{ref}, v'_{ref}) = \iint A_{ref}(\alpha'_{ref}, \beta'_{ref}) \exp\left[ j2\pi(\alpha'_{ref}u'_{ref} + \beta'_{ref}v'_{ref}) \right] d\alpha'_{ref}\, d\beta'_{ref}, \tag{30}$$
and
$$G(u'_{adj}, v'_{adj}) = \iint A_{adj}(\alpha'_{adj}, \beta'_{adj}) \exp\left[ j2\pi(\alpha'_{adj}u'_{adj} + \beta'_{adj}v'_{adj}) \right] d\alpha'_{adj}\, d\beta'_{adj}. \tag{31}$$
The mathematical relationship between $F(u'_{ref}, v'_{ref})$ and $G(u'_{adj}, v'_{adj})$ is based on the affine transformation, which solves $(u'_{adj}, v'_{adj})$ for $(u'_{ref}, v'_{ref})$ as
$$\begin{pmatrix} u'_{adj} \\ v'_{adj} \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} u'_{ref} \\ v'_{ref} \end{pmatrix} + \begin{pmatrix} b_1 \\ b_2 \end{pmatrix}. \tag{32}$$
The substitution of Eq. (32) into Eq. (31) leads to
$$\begin{aligned} G(a_{11}u'_{ref} + a_{12}v'_{ref} + b_1,\ a_{21}u'_{ref} + a_{22}v'_{ref} + b_2) &= \iint A_{adj}(\alpha'_{adj}, \beta'_{adj}) \exp\bigl\{ j2\pi\bigl[ \alpha'_{adj}(a_{11}u'_{ref} + a_{12}v'_{ref} + b_1) + \beta'_{adj}(a_{21}u'_{ref} + a_{22}v'_{ref} + b_2) \bigr] \bigr\}\, d\alpha'_{adj}\, d\beta'_{adj} \\ &= \iint \bigl\{ A_{adj}(\alpha'_{adj}, \beta'_{adj}) \exp\bigl[ j2\pi(\alpha'_{adj}b_1 + \beta'_{adj}b_2) \bigr] \bigr\} \exp\bigl\{ j2\pi\bigl[ (a_{11}\alpha'_{adj} + a_{21}\beta'_{adj})u'_{ref} + (a_{12}\alpha'_{adj} + a_{22}\beta'_{adj})v'_{ref} \bigr] \bigr\}\, d\alpha'_{adj}\, d\beta'_{adj}. \end{aligned} \tag{33}$$
Let $\alpha'_{ref}$ and $\beta'_{ref}$ be
$$\begin{pmatrix} \alpha'_{ref} \\ \beta'_{ref} \end{pmatrix} = \begin{pmatrix} a_{11}\alpha'_{adj} + a_{21}\beta'_{adj} \\ a_{12}\alpha'_{adj} + a_{22}\beta'_{adj} \end{pmatrix}; \tag{34}$$
the differential area $d\alpha'_{ref}\, d\beta'_{ref}$ is then given by $d\alpha'_{ref}\, d\beta'_{ref} = (a_{11}a_{22} - a_{12}a_{21})\, d\alpha'_{adj}\, d\beta'_{adj}$. Therefore, the equality $F(u'_{ref}, v'_{ref}) = G(a_{11}u'_{ref} + a_{12}v'_{ref} + b_1,\ a_{21}u'_{ref} + a_{22}v'_{ref} + b_2)$ leads to the equality of the angular spectrum integrals:
$$\begin{aligned} F(u'_{ref}, v'_{ref}) &= \iint A_{ref}(\alpha'_{ref}, \beta'_{ref}) \exp\bigl[ j2\pi(\alpha'_{ref}u'_{ref} + \beta'_{ref}v'_{ref}) \bigr]\, d\alpha'_{ref}\, d\beta'_{ref} \\ &= \iint A_{ref}(\alpha'_{ref}, \beta'_{ref}) \exp\bigl[ j2\pi(\alpha'_{ref}u'_{ref} + \beta'_{ref}v'_{ref}) \bigr] (a_{11}a_{22} - a_{12}a_{21})\, d\alpha'_{adj}\, d\beta'_{adj} \\ &= G(a_{11}u'_{ref} + a_{12}v'_{ref} + b_1,\ a_{21}u'_{ref} + a_{22}v'_{ref} + b_2) \\ &= \iint A_{adj}(\alpha'_{adj}, \beta'_{adj}) \exp\bigl[ j2\pi(\alpha'_{adj}b_1 + \beta'_{adj}b_2) \bigr] \exp\bigl[ j2\pi(\alpha'_{ref}u'_{ref} + \beta'_{ref}v'_{ref}) \bigr]\, d\alpha'_{adj}\, d\beta'_{adj}. \end{aligned} \tag{35}$$
Equating the integrands of Eq. (35), the angular spectrum of the adjacent local field $A_{adj}(\alpha'_{adj}, \beta'_{adj})$ is solved as
$$A_{adj}(\alpha'_{adj}, \beta'_{adj}) = \exp\bigl[ -j2\pi(\alpha'_{adj}b_1 + \beta'_{adj}b_2) \bigr] (a_{11}a_{22} - a_{12}a_{21})\, A_{ref}(a_{11}\alpha'_{adj} + a_{21}\beta'_{adj},\ a_{12}\alpha'_{adj} + a_{22}\beta'_{adj}). \tag{36}$$
The result of Eq. (36) confirms that the angular spectrum of the adjacent local field $A_{adj}(\alpha'_{adj}, \beta'_{adj})$ is calculated from the geometric transformation of $A_{ref}(\alpha'_{ref}, \beta'_{ref})$. The angular spectrum representation of the adjacent local field $G(u'_{adj}, v'_{adj})$ is then
$$\begin{aligned} G(u'_{adj}, v'_{adj}) &= \iint \exp\bigl[ -j2\pi(\alpha'_{adj}b_1 + \beta'_{adj}b_2) \bigr] (a_{11}a_{22} - a_{12}a_{21})\, A_{ref}(a_{11}\alpha'_{adj} + a_{21}\beta'_{adj},\ a_{12}\alpha'_{adj} + a_{22}\beta'_{adj}) \\ &\qquad \times \exp\bigl[ j2\pi(\alpha'_{adj}u'_{adj} + \beta'_{adj}v'_{adj}) \bigr]\, d\alpha'_{adj}\, d\beta'_{adj} \\ &= \iint A_{adj}(\alpha'_{adj}, \beta'_{adj}) \exp\bigl[ j2\pi(\alpha'_{adj}u'_{adj} + \beta'_{adj}v'_{adj}) \bigr]\, d\alpha'_{adj}\, d\beta'_{adj}. \end{aligned} \tag{37}$$
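Numerically, Eq. (36) is a resampling of the referential spectrum plus a phase ramp. The sketch below assumes A_ref is available as a callable interpolator over (α′, β′) (in practice, e.g., a scipy RegularGridInterpolator built on the FFT samples); the function name is ours.

```python
import numpy as np

def reconfigure_spectrum(A_ref, alpha_adj, beta_adj, A, b):
    """Eq. (36): adjacent local spectrum from the referential one.
    A = [[a11, a12], [a21, a22]], b = (b1, b2) from Eq. (28)."""
    det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
    alpha_ref = A[0, 0] * alpha_adj + A[1, 0] * beta_adj   # Eq. (34)
    beta_ref  = A[0, 1] * alpha_adj + A[1, 1] * beta_adj
    ramp = np.exp(-1j * 2 * np.pi * (alpha_adj * b[0] + beta_adj * b[1]))
    return ramp * det * A_ref(alpha_ref, beta_ref)
```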
In addition, the spatial coordinate variables $(u'_{adj}, v'_{adj}, w'_{adj})$ in the adjacent local coordinates are connected to the spatial coordinate variables $(u_{adj}, v_{adj}, w_{adj})$ in the adjacent global coordinates by GtoL as
$$\begin{aligned} \begin{pmatrix} u'_{adj} \\ v'_{adj} \\ w'_{adj} \end{pmatrix} &= \begin{pmatrix} \cos\theta_{adj} & 0 & -\sin\theta_{adj} \\ 0 & 1 & 0 \\ \sin\theta_{adj} & 0 & \cos\theta_{adj} \end{pmatrix} \begin{pmatrix} \cos\phi_{adj} & \sin\phi_{adj} & 0 \\ -\sin\phi_{adj} & \cos\phi_{adj} & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} u_{adj} - u_{adj,c} \\ v_{adj} - v_{adj,c} \\ w_{adj} - w_{adj,c} \end{pmatrix} \\ &= \begin{pmatrix} \cos\theta_{adj}\cos\phi_{adj} & \cos\theta_{adj}\sin\phi_{adj} & -\sin\theta_{adj} \\ -\sin\phi_{adj} & \cos\phi_{adj} & 0 \\ \sin\theta_{adj}\cos\phi_{adj} & \sin\theta_{adj}\sin\phi_{adj} & \cos\theta_{adj} \end{pmatrix} \begin{pmatrix} u_{adj} - u_{adj,c} \\ v_{adj} - v_{adj,c} \\ w_{adj} - w_{adj,c} \end{pmatrix}. \end{aligned} \tag{38}$$
The point $(u_{adj,c}, v_{adj,c}, w_{adj,c})$ is the center of mass of $P_{adj,1}P_{adj,2}P_{adj,3}$ in the adjacent global coordinates, located at the origin of the adjacent local coordinates. Let us set a carrier plane wave $\Gamma = \eta_0 \exp[j2\pi(\alpha_0 u_{adj} + \beta_0 v_{adj} + \gamma_0 w_{adj})]$ incident on the unit triangle aperture in the adjacent global coordinate system; the illuminating plane wave in the adjacent local coordinate system is then obtained by the inverse transformation of Eq. (38) as $\Gamma = \eta_0 \exp\{j2\pi[\alpha'_0(u'_{adj} + u'_{adj,c}) + \beta'_0(v'_{adj} + v'_{adj,c}) + \gamma'_0(w'_{adj} + w'_{adj,c})]\}$. By multiplying this illuminating plane wave with Eq. (37), the light field distribution on the unit triangular facet in the adjacent local coordinates is obtained as
$$\begin{aligned} W(u'_{adj}, v'_{adj}, 0) &= \eta_0 \exp\bigl\{ j2\pi\bigl[ \alpha'_0(u'_{adj} + u'_{adj,c}) + \beta'_0(v'_{adj} + v'_{adj,c}) + \gamma'_0 w'_{adj,c} \bigr] \bigr\} \iint A_{adj@L}(\alpha'_{adj}, \beta'_{adj}) \exp\bigl[ j2\pi(\alpha'_{adj}u'_{adj} + \beta'_{adj}v'_{adj}) \bigr]\, d\alpha'_{adj}\, d\beta'_{adj} \\ &= \eta_0 \exp\bigl[ j2\pi(\alpha'_0 u'_{adj,c} + \beta'_0 v'_{adj,c} + \gamma'_0 w'_{adj,c}) \bigr] \iint A_{adj@L}(\alpha'_{adj} - \alpha'_0,\ \beta'_{adj} - \beta'_0) \exp\bigl[ j2\pi(\alpha'_{adj}u'_{adj} + \beta'_{adj}v'_{adj}) \bigr]\, d\alpha'_{adj}\, d\beta'_{adj}. \end{aligned} \tag{39}$$
To convey the meaning of the notation more clearly, we attach the label @L to the subscript adj in $A_{adj}(\alpha'_{adj}, \beta'_{adj})$ of Eq. (36); $A_{adj@L}(\alpha'_{adj}, \beta'_{adj})$ is the angular spectrum of the adjacent local field, and in the same way, we use @G to designate terms in the global coordinate system. The light field distribution $W(u'_{adj}, v'_{adj}, w'_{adj})$ over the entire space of the adjacent local coordinates is obtained by
$$W(u'_{adj}, v'_{adj}, w'_{adj}) = \eta_0 \exp\bigl[ j2\pi(\alpha'_0 u'_{adj,c} + \beta'_0 v'_{adj,c} + \gamma'_0 w'_{adj,c}) \bigr] \iint A_{adj@L}(\alpha'_{adj} - \alpha'_0,\ \beta'_{adj} - \beta'_0) \exp\bigl[ j2\pi(\alpha'_{adj}u'_{adj} + \beta'_{adj}v'_{adj} + \gamma'_{adj}w'_{adj}) \bigr]\, d\alpha'_{adj}\, d\beta'_{adj}. \tag{40}$$
From Eq. (38), the components of the Fourier spatial frequency vector $(\alpha'_{adj}, \beta'_{adj}, \gamma'_{adj})$ in the adjacent local coordinates are related to the Fourier spatial frequency vector $(\alpha_{adj}, \beta_{adj}, \gamma_{adj})$ of the adjacent global coordinate system by
$$\alpha'_{adj}(\alpha_{adj}, \beta_{adj}) = \cos\theta_{adj}\cos\phi_{adj}\,\alpha_{adj} + \cos\theta_{adj}\sin\phi_{adj}\,\beta_{adj} - \sin\theta_{adj}\,\gamma_{adj}, \tag{41}$$
$$\beta'_{adj}(\alpha_{adj}, \beta_{adj}) = -\sin\phi_{adj}\,\alpha_{adj} + \cos\phi_{adj}\,\beta_{adj}, \tag{42}$$
$$\gamma'_{adj}(\alpha_{adj}, \beta_{adj}) = \sin\theta_{adj}\cos\phi_{adj}\,\alpha_{adj} + \sin\theta_{adj}\sin\phi_{adj}\,\beta_{adj} + \cos\theta_{adj}\,\gamma_{adj}. \tag{43}$$
The differential area in the adjacent local coordinate system $d\alpha'_{adj}\, d\beta'_{adj}$ is thus given by
$$d\alpha'_{adj}(\alpha_{adj}, \beta_{adj})\, d\beta'_{adj}(\alpha_{adj}, \beta_{adj}) = |J|\, d\alpha_{adj}\, d\beta_{adj} = \bigl| \cos\theta_{adj} + \sin\theta_{adj}(\alpha_{adj}\cos\phi_{adj} + \beta_{adj}\sin\phi_{adj})/\gamma_{adj} \bigr|\, d\alpha_{adj}\, d\beta_{adj}. \tag{44}$$
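Before the substitution below, the frequency rotation and Jacobian of Eqs. (41)-(44) can be sketched numerically as follows; taking $\gamma = \sqrt{1/\lambda^2 - \alpha^2 - \beta^2}$ for propagating waves is our assumption (evanescent components are excluded, cf. the step function $H(\gamma'_{adj})$ in Eq. (45)).

```python
import numpy as np

def local_freqs_and_jacobian(alpha, beta, theta, phi, wavelength):
    """Eqs. (41)-(44): rotate global frequencies into the adjacent local
    frame and return the Jacobian |J| of the change of variables."""
    gamma = np.sqrt(1.0 / wavelength ** 2 - alpha ** 2 - beta ** 2)  # gamma > 0 assumed
    ct, st, cp, sp = np.cos(theta), np.sin(theta), np.cos(phi), np.sin(phi)
    a_loc = ct * cp * alpha + ct * sp * beta - st * gamma   # Eq. (41)
    b_loc = -sp * alpha + cp * beta                         # Eq. (42)
    g_loc = st * cp * alpha + st * sp * beta + ct * gamma   # Eq. (43)
    J = np.abs(ct + st * (alpha * cp + beta * sp) / gamma)  # Eq. (44)
    return a_loc, b_loc, g_loc, J
```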
By substituting Eqs. (41), (42), and (44) into Eq. (40), we can derive the diffraction field $W(u_{adj}, v_{adj}, w_{adj})$ in the adjacent global coordinate system:
$$\begin{aligned} W(u_{adj}, v_{adj}, w_{adj}) &= \eta_0 \exp\bigl[ j2\pi(\alpha_0 u_{adj,c} + \beta_0 v_{adj,c} + \gamma_0 w_{adj,c}) \bigr] \\ &\quad \times \iint A_{adj@L}\bigl( \alpha'_{adj}(\alpha_{adj}, \beta_{adj}) - \alpha'_0(\alpha_0, \beta_0),\ \beta'_{adj}(\alpha_{adj}, \beta_{adj}) - \beta'_0(\alpha_0, \beta_0) \bigr)\, H\bigl( \gamma'_{adj}(\alpha_{adj}, \beta_{adj}) \bigr) \\ &\quad \times \exp\bigl\{ j2\pi\bigl[ \alpha_{adj}(u_{adj} - u_{adj,c}) + \beta_{adj}(v_{adj} - v_{adj,c}) + \gamma_{adj}(w_{adj} - w_{adj,c}) \bigr] \bigr\} \\ &\quad \times \bigl| \cos\theta_{adj} + \sin\theta_{adj}(\alpha_{adj}\cos\phi_{adj} + \beta_{adj}\sin\phi_{adj})/\gamma_{adj} \bigr|\, d\alpha_{adj}\, d\beta_{adj}. \end{aligned} \tag{45}$$
When calculating Eq. (45), the condition $\gamma'_{adj}(\alpha_{adj}, \beta_{adj}) > 0$ has to be satisfied; angular spectrum values at any frequency that does not satisfy this condition must be zero. Accordingly, the unit step function $H(\gamma'_{adj}(\alpha_{adj}, \beta_{adj}))$ is included in Eq. (45) [12]. The angular spectrum of the adjacent global field $A_{adj@G}(\alpha_{adj}, \beta_{adj})$ is represented as
$$\begin{aligned} A_{adj@G}(\alpha_{adj}, \beta_{adj}) &= \eta_0 \exp\bigl[ j2\pi(\alpha_0 u_{adj,c} + \beta_0 v_{adj,c} + \gamma_0 w_{adj,c}) \bigr] \\ &\quad \times A_{adj@L}\bigl( \alpha'_{adj}(\alpha_{adj}, \beta_{adj}) - \alpha'_0(\alpha_0, \beta_0),\ \beta'_{adj}(\alpha_{adj}, \beta_{adj}) - \beta'_0(\alpha_0, \beta_0) \bigr) \\ &\quad \times H\bigl( \gamma'_{adj}(\alpha_{adj}, \beta_{adj}) \bigr) \exp\bigl[ -j2\pi( \alpha_{adj}u_{adj,c} + \beta_{adj}v_{adj,c} + \gamma_{adj}w_{adj,c} ) \bigr] \\ &\quad \times \bigl| \cos\theta_{adj} + \sin\theta_{adj}(\alpha_{adj}\cos\phi_{adj} + \beta_{adj}\sin\phi_{adj})/\gamma_{adj} \bigr|. \end{aligned} \tag{46}$$
From Eq. (36), $A_{adj@L}\bigl( \alpha'_{adj}(\alpha_{adj}, \beta_{adj}) - \alpha'_0(\alpha_0, \beta_0),\ \beta'_{adj}(\alpha_{adj}, \beta_{adj}) - \beta'_0(\alpha_0, \beta_0) \bigr)$ is manipulated as
$$A_{adj@L}(\alpha''_{adj}, \beta''_{adj}) = \exp\bigl[ -j2\pi(\alpha''_{adj}b_1 + \beta''_{adj}b_2) \bigr] (a_{11}a_{22} - a_{12}a_{21})\, A_{ref}(a_{11}\alpha''_{adj} + a_{21}\beta''_{adj},\ a_{12}\alpha''_{adj} + a_{22}\beta''_{adj}), \tag{47}$$
where $\alpha''_{adj}$ and $\beta''_{adj}$ are defined, respectively, by
$$\alpha''_{adj} = \alpha'_{adj}(\alpha_{adj}, \beta_{adj}) - \alpha'_0(\alpha_0, \beta_0), \tag{48}$$
$$\beta''_{adj} = \beta'_{adj}(\alpha_{adj}, \beta_{adj}) - \beta'_0(\alpha_0, \beta_0). \tag{49}$$
Finally, by substituting Eqs. (47), (48), and (49) into Eq. (46), the angular spectrum of the adjacent global field $A_{adj@G}(\alpha_{adj}, \beta_{adj})$ is solved in terms of $A_{ref}$:
$$\begin{aligned} A_{adj@G}(\alpha_{adj}, \beta_{adj}) &= \eta_0 \exp\bigl[ j2\pi(\alpha_0 u_{adj,c} + \beta_0 v_{adj,c} + \gamma_0 w_{adj,c}) \bigr]\, H\bigl( \gamma'_{adj}(\alpha_{adj}, \beta_{adj}) \bigr) \\ &\quad \times \exp\bigl[ -j2\pi( \alpha_{adj}u_{adj,c} + \beta_{adj}v_{adj,c} + \gamma_{adj}w_{adj,c} ) \bigr] \\ &\quad \times \bigl| \cos\theta_{adj} + \sin\theta_{adj}(\alpha_{adj}\cos\phi_{adj} + \beta_{adj}\sin\phi_{adj})/\gamma_{adj} \bigr|\, \exp\bigl[ -j2\pi(\alpha''_{adj}b_1 + \beta''_{adj}b_2) \bigr] \\ &\quad \times (a_{11}a_{22} - a_{12}a_{21})\, A_{ref}(a_{11}\alpha''_{adj} + a_{21}\beta''_{adj},\ a_{12}\alpha''_{adj} + a_{22}\beta''_{adj}). \end{aligned} \tag{50}$$
This means that the angular spectrum of the triangular facet in the adjacent global coordinate system is calculated from that of the referential local coordinate system. We therefore establish the CGH computation process as follows: (1) prepare the primitive angular spectrum of the referential local coordinate system in advance, (2) compute the light field distribution of the adjacent global coordinate system by reconfiguring the primitive angular spectrum data, and (3) convert the light field distribution in the retina plane to the CGH pattern using the inverse cascaded generalized Fresnel transform from the retina plane to the CGH plane [13, 15]. The intermediate-view CGH is not generated by re-computing the entire process but by reconfiguring the primitive data of the reference observation point. This process is expected to significantly reduce the computational complexity of wide-viewing-angle polygon CGHs.
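In Python-style pseudocode, the three-step process reads as follows; every helper named here is a hypothetical placeholder standing for the corresponding operation in the text (the Section 2 retina mapping, the FFT, Eqs. (28) and (36), and the inverse cascaded generalized Fresnel transform of [13, 15]), not an actual implementation.

```python
# Schematic only: all helper functions are hypothetical placeholders.
def synthesize_multiview_cgh(facet_texture, ref_view, adj_views):
    # (1) primitive data: angular spectrum in the referential local coordinates
    retina_texture = map_texture_to_retina(facet_texture, ref_view)  # Section 2 mapping
    A_ref = fft_angular_spectrum(retina_texture)                     # computed once by FFT
    cghs = []
    for view in adj_views:
        # (2) reconfigure the primitive spectrum for this adjacent view
        A, b = affine_from_triangles(ref_view.apexes, view.apexes)   # Eq. (28)
        A_adj = reconfigure_spectrum(A_ref, view.alpha_grid, view.beta_grid, A, b)  # Eq. (36)
        # (3) propagate the retina-plane field to the CGH plane
        cghs.append(inverse_cascaded_fresnel(A_adj, view))           # [13, 15]
    return cghs
```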

To assess the efficiency of the proposed method, we compared the computing times for a full-color CGH using the exact and approximate methods. In the calculation of the full-color CGH, the red (633 nm), green (532 nm), and blue (473 nm) components of the CGH were independently calculated without color dispersion [13]. Similar to Fig. 4, we assumed that a textured triangular facet was floating in object space and that an observer was looking at it from a specific location. The computation was performed in MATLAB on a workstation with a 2.27 GHz Intel Xeon E5520 CPU and 48 GB of memory. The size of the single-view CGH is 2,201×2,201. Figure 7 displays the simulation results. Using both methods under the same computational conditions, we simulated the observer looking at specific objects while moving around them. As shown in Fig. 9, the textured cube floats 5 mm above the checkerboard, and the observer looks at this scene along a diagonal direction toward the floating object.

Fig. 7 The observation image of a full-color CGH calculated using (a) the exact method and (b) the approximate method.

We assume the observer's rotational range is 0 to 360 degrees in the azimuthal direction with an interval of 1 degree. Thus, 360 light field distributions must be calculated, one for each viewpoint. To accomplish this simulation, 360 separate re-computations are required when using the exact method. As indicated in Fig. 8(a), the exact method has two steps: (1) obtaining a properly distorted texture pattern on the local coordinates in the observer's retina space and (2) numerically calculating the angular spectrum using a fast Fourier transform (FFT) algorithm and interpolation. The entire process takes 11.8513 seconds. On the other hand, the approximate method has three steps: (1) obtaining the properly distorted texture pattern on the local coordinates in the referential retina space, (2) calculating its angular spectrum with the FFT (this result is regarded as the primitive data), and (3) obtaining the angular spectrum of the adjacent view by reconfiguring the primitive data, as indicated in Fig. 8(b).

Fig. 8 The elapsed time for calculating the CGHs using (a) the exact method and (b) the approximate method.

Although the approximate method has one more step than the exact method, the computation time of the entire process is much shorter at 6.397 seconds. The efficiency of the new method is even more dramatic for multi-view CGH calculations because, once the primitive data is pre-calculated, only step (3) of the approximate method needs to be run to obtain the new angular spectrum of an adjacent observation point. Using the exact method, however, the entire process must be repeated each time. The resulting computation times for the exact and approximate methods are 11.8513 and 1.9773 seconds for one cycle, respectively. As described above, we can therefore efficiently calculate the CGH with the approximate method. In this case, we assume an applicable range of 20 degrees for the proposed algorithm. For example, one section covers 35° to 55° if the reference point is located at $\phi = 45°$. Therefore, 18 observation sections are required, and the light field distribution of each observation point is approximately calculated by reconfiguring the angular spectrum of its referential local coordinates.

Figure 9 displays the simulation results verifying the accommodation effect of the CGH computed by the affine approximate method. If the eye lens focuses on a particular object, that object is clearly resolved while the other objects are blurred. The accommodation effect observed in Fig. 9 shows that the approximate method is accurate, with no observable error.

Fig. 9 Verifying the properties of a 3D holographic image with the accommodation effect: (a) when focusing on the cube and (b) when focusing on the checker board.

5. Concluding remarks

In conclusion, we have presented a concrete theory of interocular similarity and proposed an inter-view reconfiguration algorithm for textured polygon CGHs under view-direction change using an approximate affine transform. The effectiveness and efficiency of the approximate affine transform were demonstrated with a numerical simulation, in which the reconfiguration algorithm based on the affine transform was applied to accelerate the computation of intermediate-view CGHs for multi-view polygon CGHs. This work falls under the umbrella of holographic information theory, an emerging field of optical information processing, which is a crucial component of next-generation holographic 3D display technology.

Appendix

In this Appendix, we prove the relationship between the local coordinates of a triangular facet in the original global coordinate system and in the adaptive global system, shown in Eq. (24). The local coordinates of a triangular facet in the adaptive global coordinates $(x'_n, y'_n, z'_n)$ are solved for its global coordinates $(x_n, y_n, z_n)$ as

$$\begin{pmatrix} x'_n \\ y'_n \\ z'_n \end{pmatrix} = \begin{pmatrix} \cos\theta_n\cos\phi_n & \cos\theta_n\sin\phi_n & -\sin\theta_n \\ -\sin\phi_n & \cos\phi_n & 0 \\ \sin\theta_n\cos\phi_n & \sin\theta_n\sin\phi_n & \cos\theta_n \end{pmatrix} \begin{pmatrix} x_n - x_{nc} \\ y_n - y_{nc} \\ z_n - z_{nc} \end{pmatrix} = \mathrm{GtoL}_n \begin{pmatrix} x_n - x_{nc} \\ y_n - y_{nc} \\ z_n - z_{nc} \end{pmatrix}, \tag{51}$$
where $\theta_n$ and $\phi_n$ are the longitudinal and azimuthal angles, respectively, representing the rotation of the local coordinates relative to the adaptive global coordinates. This transformation is denoted the adaptive global-to-local transformation ($\mathrm{GtoL}_n$). The origin of the adaptive local coordinates corresponds to the centroid of the triangular facet in the adaptive global coordinates, $(x_{nc}, y_{nc}, z_{nc})$. Equation (51) can be rewritten as
$$\begin{pmatrix} x'_n \\ y'_n \\ z'_n \end{pmatrix} = \mathrm{GtoL}_n \left[ \begin{pmatrix} x_n - x_{n0} \\ y_n - y_{n0} \\ z_n - z_{n0} \end{pmatrix} + \begin{pmatrix} x_{n0} - x_{nc} \\ y_{n0} - y_{nc} \\ z_{n0} - z_{nc} \end{pmatrix} \right], \tag{52}$$
where $(x_{n0}, y_{n0}, z_{n0})$ is the projection center in the adaptive global coordinates. By substituting Eq. (7) into Eq. (52), we obtain
$$\begin{pmatrix} x'_n \\ y'_n \\ z'_n \end{pmatrix} = \mathrm{GtoL}_n \left[ \mathbf{R} \begin{pmatrix} x - x_0 \\ y - y_0 \\ z - z_0 \end{pmatrix} + \begin{pmatrix} x_{n0} - x_{nc} \\ y_{n0} - y_{nc} \\ z_{n0} - z_{nc} \end{pmatrix} \right]. \tag{53}$$
The original global coordinates $(x, y, z)$ can be represented in terms of the local coordinates $(x', y', z')$ through the original local-to-global transformation ($\mathrm{LtoG}$), where $\mathrm{LtoG}$ is given by
$$\begin{pmatrix} x - x_c \\ y - y_c \\ z - z_c \end{pmatrix} = \begin{pmatrix} \cos\theta\cos\phi & -\sin\phi & \sin\theta\cos\phi \\ \cos\theta\sin\phi & \cos\phi & \sin\theta\sin\phi \\ -\sin\theta & 0 & \cos\theta \end{pmatrix} \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} = \mathrm{LtoG} \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix}. \tag{54}$$
Equation (53) can be expanded using Eq. (54) as
$$\begin{aligned} \begin{pmatrix} x'_n \\ y'_n \\ z'_n \end{pmatrix} &= \mathrm{GtoL}_n \left\{ \mathbf{R} \left[ \mathrm{LtoG} \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} + \begin{pmatrix} x_c - x_0 \\ y_c - y_0 \\ z_c - z_0 \end{pmatrix} \right] + \begin{pmatrix} x_{n0} - x_{nc} \\ y_{n0} - y_{nc} \\ z_{n0} - z_{nc} \end{pmatrix} \right\} \\ &= \mathrm{GtoL}_n \cdot \mathbf{R} \cdot \mathrm{LtoG} \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} + \mathrm{GtoL}_n \left[ \mathbf{R} \begin{pmatrix} x_c - x_0 \\ y_c - y_0 \\ z_c - z_0 \end{pmatrix} + \begin{pmatrix} x_{n0} - x_{nc} \\ y_{n0} - y_{nc} \\ z_{n0} - z_{nc} \end{pmatrix} \right] \\ &= \mathrm{GtoL}_n \cdot \mathbf{R} \cdot \mathrm{LtoG} \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix}, \end{aligned} \tag{55}$$
where $\mathbf{R}(x_c - x_0,\ y_c - y_0,\ z_c - z_0)^T$ is canceled out by $(x_{n0} - x_{nc},\ y_{n0} - y_{nc},\ z_{n0} - z_{nc})^T$, since the rotated centroid offset equals $(x_{nc} - x_{n0},\ y_{nc} - y_{n0},\ z_{nc} - z_{n0})^T$. As a result of Eq. (55), we finally obtain the relationship between the local coordinates of a triangular facet in the original and adaptive global systems.

Funding

Samsung Future Technology Fund of Samsung Electronics Inc. (SRFC-IT1301-52).

References and links

1. J. Hong, Y. Kim, H.-J. Choi, J. Hahn, J.-H. Park, H. Kim, S.-W. Min, N. Chen, and B. Lee, "Three-dimensional display technologies of recent interest: principles, status, and issues," Appl. Opt. 50(34), H87–H115 (2011).
2. J.-H. Park, "Recent progress in computer-generated holography for three-dimensional scenes," J. Inform. Displ. 18(1), 1–12 (2017).
3. S.-C. Kim, J.-M. Kim, and E.-S. Kim, "Effective memory reduction of the novel look-up table with one-dimensional sub-principle fringe patterns in computer-generated holograms," Opt. Express 20(11), 12021–12034 (2012).
4. H. Kang, E. Stoykova, and H. Yoshikawa, "Fast phase-added stereogram algorithm for generation of photorealistic 3D content," Appl. Opt. 55(3), A135–A143 (2016).
5. T. Shimobaba, H. Nakayama, N. Masuda, and T. Ito, "Rapid calculation algorithm of Fresnel computer-generated-hologram using look-up table and wavefront-recording plane methods for three-dimensional display," Opt. Express 18(19), 19504–19509 (2010).
6. T. Shimobaba and T. Ito, "Fast generation of computer-generated holograms using wavelet shrinkage," Opt. Express 25(1), 77–87 (2017).
7. S. Jiao, Z. Zhuang, and W. Zou, "Fast computer generated hologram calculation with a mini look-up table incorporated with radial symmetric interpolation," Opt. Express 25(1), 112–123 (2017).
8. A. Symeonidou, D. Blinder, and P. Schelkens, "Colour computer-generated holography for point clouds utilizing the Phong illumination model," Opt. Express 26(8), 10282–10298 (2018).
9. T. Ichikawa, K. Yamaguchi, and Y. Sakamoto, "Realistic expression for full-parallax computer-generated holograms with the ray-tracing method," Appl. Opt. 52(1), A201–A209 (2013).
10. T. Ichikawa, T. Yoneyama, and Y. Sakamoto, "CGH calculation with the ray tracing method for the Fourier transform optical system," Opt. Express 21(26), 32019–32031 (2013).
11. K. Wakunami and M. Yamaguchi, "Calculation for computer generated hologram using ray-sampling plane," Opt. Express 19(10), 9086–9101 (2011).
12. Y. Zhao, L. Cao, H. Zhang, D. Kong, and G. Jin, "Accurate calculation of computer-generated holograms using angular-spectrum layer-oriented method," Opt. Express 23(20), 25440–25449 (2015).
13. J. Roh, K. Kim, E. Moon, S. Kim, B. Yang, J. Hahn, and H. Kim, "Full-color holographic projection display system featuring an achromatic Fourier filter," Opt. Express 25(13), 14774–14782 (2017).
14. T. Senoh, Y. Ichihashi, R. Oi, H. Sasaki, and K. Yamamoto, "Study on a holographic TV system based on multi-view images and depth maps," Proc. SPIE 8644, 86440A (2013).
15. H. Kim, J. Hahn, and B. Lee, "Mathematical modeling of triangle-mesh-modeled three-dimensional surface objects for digital holography," Appl. Opt. 47(19), D117–D127 (2008).
16. D. Im, E. Moon, Y. Park, D. Lee, J. Hahn, and H. Kim, "Phase-regularized polygon computer-generated holograms," Opt. Lett. 39(12), 3642–3645 (2014).
17. S.-B. Ko and J.-H. Park, "Speckle reduction using angular spectrum interleaving for triangular mesh based computer generated hologram," Opt. Express 25(24), 29788–29797 (2017).
18. K. Matsushima, "Computer-generated holograms for electro-holography," Appl. Opt. 44, 4607–4614 (2005).
19. K. Matsushima and S. Nakahara, "Extremely high-definition full-parallax computer-generated hologram created by the polygon-based method," Appl. Opt. 48(34), H54–H63 (2009).
20. Y. Tsuchiyama and K. Matsushima, "Full-color large-scaled computer-generated holograms using RGB color filters," Opt. Express 25(3), 2016–2030 (2017).
21. K. Matsushima and N. Sonobe, "Full-color digitized holography for large-scale holographic 3D imaging of physical and nonphysical objects," Appl. Opt. 57(1), A150–A156 (2018).
22. J.-H. Park, S.-B. Kim, H.-J. Yeom, H.-J. Kim, H. Zhang, B. Li, Y.-M. Ji, S.-H. Kim, and S.-B. Ko, "Continuous shading and its fast update in fully analytic triangular-mesh-based computer generated hologram," Opt. Express 23(26), 33893–33901 (2015).
23. W. Lee, D. Im, J. Paek, J. Hahn, and H. Kim, "Semi-analytic texturing algorithm for polygon computer-generated holograms," Opt. Express 22(25), 31180–31191 (2014).
24. L. Ahrenberg, P. Benzie, M. Magnor, and J. Watson, "Computer generated holography using parallel commodity graphics hardware," Opt. Express 14(17), 7636–7641 (2006).
25. G. Li, K. Hong, J. Yeom, N. Chen, J.-H. Park, N. Kim, and B. Lee, "Acceleration method for computer-generated spherical hologram calculation of real objects using graphics processing unit," Chin. Opt. Lett. 12(6), 060016 (2014).
26. T. Shimobaba, T. Ito, N. Masuda, Y. Ichihashi, and N. Takada, "Fast calculation of computer-generated-hologram on AMD HD5000 series GPU and OpenCL," Opt. Express 18(10), 9955–9960 (2010).
27. Y.-H. Seo, H.-J. Choi, J.-S. Yoo, and D.-W. Kim, "Cell-based hardware architecture for full-parallel generation algorithm of digital holograms," Opt. Express 19(9), 8750–8761 (2011).
28. N. Takada, T. Shimobaba, H. Nakayama, A. Shiraki, N. Okada, M. Oikawa, N. Masuda, and T. Ito, "Fast high-resolution computer-generated hologram computation using multiple graphics processing unit cluster system," Appl. Opt. 51(30), 7303–7307 (2012).
29. J. Cho, J. Hahn, and H. Kim, "Fast reconfiguration algorithm of computer generated holograms for adaptive view direction change in holographic three-dimensional display," Opt. Express 20(27), 28282–28291 (2012).
30. D. Im, J. Cho, J. Hahn, B. Lee, and H. Kim, "Accelerated synthesis algorithm of polygon computer-generated holograms," Opt. Express 23(3), 2863–2871 (2015).
31. Y. Pan, Y. Wang, J. Liu, X. Li, and J. Jia, "Fast polygon-based method for calculating computer-generated holograms in three-dimensional display," Appl. Opt. 52(1), A290–A299 (2013).
32. A. W. Lohman, R. G. Dorsch, D. Mendlovic, Z. Zalevsky, and C. Ferreira, "Space-bandwidth product of optical signals and systems," J. Opt. Soc. Am. A 13(3), 470–473 (1996).
33. J. Hahn, H. Kim, Y. Lim, G. Park, and B. Lee, "Wide viewing angle dynamic holographic stereogram with a curved array of spatial light modulators," Opt. Express 16(16), 12372–12386 (2008).
34. H. Kim, J. Hahn, and B. Lee, "Image volume analysis of omnidirectional parallax regular-polyhedron three-dimensional displays," Opt. Express 17(8), 6389–6396 (2009).
35. Y. Lim, K. Hong, H. Kim, H.-E. Kim, E.-Y. Chang, S. Lee, T. Kim, J. Nam, H.-G. Choo, J. Kim, and J. Hahn, "360-degree tabletop electronic holographic display," Opt. Express 24(22), 24999–25009 (2016).
36. T. Inoue and Y. Takaki, "Table screen 360-degree holographic display using circular viewing-zone scanning," Opt. Express 23(5), 6533–6542 (2015).
37. Y. Takaki and S. Uchida, "Table screen 360-degree three-dimensional display using a small array of high-speed projectors," Opt. Express 20(8), 8848–8861 (2012).
38. T. Kakue, T. Nishitsuji, T. Kawashima, K. Suzuki, T. Shimobaba, and T. Ito, "Aerial projection of three-dimensional motion pictures by electro-holography and parabolic mirrors," Sci. Rep. 5(1), 11750 (2015).

References

  • View by:
  • |
  • |
  • |

  1. J. Hong, Y. Kim, H.-J. Choi, J. Hahn, J.-H. Park, H. Kim, S.-W. Min, N. Chen, and B. Lee, “Three-dimensional display technologies of recent interest: principles, status, and issues,” Appl. Opt. 50(34), H87–H115 (2011).
    [Crossref] [PubMed]
  2. J.-H. Park, “Recent progress in computer-generated holography for three-dimensional scenes,” J. Inform. Displ. 18(1), 1–12 (2017).
    [Crossref]
  3. S.-C. Kim, J.-M. Kim, and E.-S. Kim, “Effective memory reduction of the novel look-up table with one-dimensional sub-principle fringe patterns in computer-generated holograms,” Opt. Express 20(11), 12021–12034 (2012).
    [Crossref] [PubMed]
  4. H. Kang, E. Stoykova, and H. Yoshikawa, “Fast phase-added stereogram algorithm for generation of photorealistic 3D content,” Appl. Opt. 55(3), A135–A143 (2016).
    [Crossref] [PubMed]
  5. T. Shimobaba, H. Nakayama, N. Masuda, and T. Ito, “Rapid calculation algorithm of Fresnel computer-generated-hologram using look-up table and wavefront-recording plane methods for three-dimensional display,” Opt. Express 18(19), 19504–19509 (2010).
    [Crossref] [PubMed]
  6. T. Shimobaba and T. Ito, “Fast generation of computer-generated holograms using wavelet shrinkage,” Opt. Express 25(1), 77–87 (2017).
    [Crossref] [PubMed]
  7. S. Jiao, Z. Zhuang, and W. Zou, “Fast computer generated hologram calculation with a mini look-up table incorporated with radial symmetric interpolation,” Opt. Express 25(1), 112–123 (2017).
    [Crossref] [PubMed]
  8. A. Symeonidou, D. Blinder, and P. Schelkens, “Colour computer-generated holography for point clouds utilizing the Phong illumination model,” Opt. Express 26(8), 10282–10298 (2018).
    [Crossref] [PubMed]
  9. T. Ichikawa, K. Yamaguchi, and Y. Sakamoto, “Realistic expression for full-parallax computer-generated holograms with the ray-tracing method,” Appl. Opt. 52(1), A201–A209 (2013).
    [Crossref] [PubMed]
  10. T. Ichikawa, T. Yoneyama, and Y. Sakamoto, “CGH calculation with the ray tracing method for the Fourier transform optical system,” Opt. Express 21(26), 32019–32031 (2013).
    [Crossref] [PubMed]
  11. K. Wakunami and M. Yamaguchi, “Calculation for computer generated hologram using ray-sampling plane,” Opt. Express 19(10), 9086–9101 (2011).
    [Crossref] [PubMed]
  12. Y. Zhao, L. Cao, H. Zhang, D. Kong, and G. Jin, “Accurate calculation of computer-generated holograms using angular-spectrum layer-oriented method,” Opt. Express 23(20), 25440–25449 (2015).
    [Crossref] [PubMed]
  13. J. Roh, K. Kim, E. Moon, S. Kim, B. Yang, J. Hahn, and H. Kim, “Full-color holographic projection display system featuring an achromatic Fourier filter,” Opt. Express 25(13), 14774–14782 (2017).
    [Crossref] [PubMed]
  14. T. Senoh, Y. Ichihashi, R. Oi, H. Sasaki, and K. Yamamoto, “Study on a holographic TV system based on multi-view images and depth maps,” Proc. SPIE 8644, 86440A (2013).
  15. H. Kim, J. Hahn, and B. Lee, “Mathematical modeling of triangle-mesh-modeled three-dimensional surface objects for digital holography,” Appl. Opt. 47(19), D117–D127 (2008).
    [Crossref] [PubMed]
  16. D. Im, E. Moon, Y. Park, D. Lee, J. Hahn, and H. Kim, “Phase-regularized polygon computer-generated holograms,” Opt. Lett. 39(12), 3642–3645 (2014).
    [Crossref] [PubMed]
  17. S.-B. Ko and J.-H. Park, “Speckle reduction using angular spectrum interleaving for triangular mesh based computer generated hologram,” Opt. Express 25(24), 29788–29797 (2017).
    [Crossref] [PubMed]
  18. K. Matsushima, “Computer-generated holograms for electro-holography,” Appl. Opt. 44, 4607–4614 (2005).
    [Crossref] [PubMed]
  19. K. Matsushima and S. Nakahara, “Extremely high-definition full-parallax computer-generated hologram created by the polygon-based method,” Appl. Opt. 48(34), H54–H63 (2009).
    [Crossref] [PubMed]
  20. Y. Tsuchiyama and K. Matsushima, “Full-color large-scaled computer-generated holograms using RGB color filters,” Opt. Express 25(3), 2016–2030 (2017).
    [Crossref] [PubMed]
  21. K. Matsushima and N. Sonobe, “Full-color digitized holography for large-scale holographic 3D imaging of physical and nonphysical objects,” Appl. Opt. 57(1), A150–A156 (2018).
    [Crossref] [PubMed]
  22. J.-H. Park, S.-B. Kim, H.-J. Yeom, H.-J. Kim, H. Zhang, B. Li, Y.-M. Ji, S.-H. Kim, and S.-B. Ko, “Continuous shading and its fast update in fully analytic triangular-mesh-based computer generated hologram,” Opt. Express 23(26), 33893–33901 (2015).
    [Crossref] [PubMed]
  23. W. Lee, D. Im, J. Paek, J. Hahn, and H. Kim, “Semi-analytic texturing algorithm for polygon computer-generated holograms,” Opt. Express 22(25), 31180–31191 (2014).
    [Crossref] [PubMed]
  24. L. Ahrenberg, P. Benzie, M. Magnor, and J. Watson, “Computer generated holography using parallel commodity graphics hardware,” Opt. Express 14(17), 7636–7641 (2006).
    [Crossref] [PubMed]
  25. G. Li, K. Hong, J. Yeom, N. Chen, J.-H. Park, N. Kim, and B. Lee, “Acceleration method for computer-generated spherical hologram calculation of real objects using graphics processing unit,” Chin. Opt. Lett. 12(6), 060016 (2014).
    [Crossref]
  26. T. Shimobaba, T. Ito, N. Masuda, Y. Ichihashi, and N. Takada, “Fast calculation of computer-generated-hologram on AMD HD5000 series GPU and OpenCL,” Opt. Express 18(10), 9955–9960 (2010).
    [Crossref] [PubMed]
  27. Y.-H. Seo, H.-J. Choi, J.-S. Yoo, and D.-W. Kim, “Cell-based hardware architecture for full-parallel generation algorithm of digital holograms,” Opt. Express 19(9), 8750–8761 (2011).
    [Crossref] [PubMed]
  28. N. Takada, T. Shimobaba, H. Nakayama, A. Shiraki, N. Okada, M. Oikawa, N. Masuda, and T. Ito, “Fast high-resolution computer-generated hologram computation using multiple graphics processing unit cluster system,” Appl. Opt. 51(30), 7303–7307 (2012).
    [Crossref] [PubMed]
  29. J. Cho, J. Hahn, and H. Kim, “Fast reconfiguration algorithm of computer generated holograms for adaptive view direction change in holographic three-dimensional display,” Opt. Express 20(27), 28282–28291 (2012).
    [Crossref] [PubMed]
  30. D. Im, J. Cho, J. Hahn, B. Lee, and H. Kim, “Accelerated synthesis algorithm of polygon computer-generated holograms,” Opt. Express 23(3), 2863–2871 (2015).
    [Crossref] [PubMed]
  31. Y. Pan, Y. Wang, J. Liu, X. Li, and J. Jia, “Fast polygon-based method for calculating computer-generated holograms in three-dimensional display,” Appl. Opt. 52(1), A290–A299 (2013).
    [Crossref] [PubMed]
  32. A. W. Lohman, R. G. Dorsch, D. Mendlovic, Z. Zalevsky, and C. Ferreira, “Space-bandwidth product of optical signals and systems,” J. Opt. Soc. Am. A 13(3), 470–473 (1996).
    [Crossref]
  33. J. Hahn, H. Kim, Y. Lim, G. Park, and B. Lee, “Wide viewing angle dynamic holographic stereogram with a curved array of spatial light modulators,” Opt. Express 16(16), 12372–12386 (2008).
    [Crossref] [PubMed]
  34. H. Kim, J. Hahn, and B. Lee, “Image volume analysis of omnidirectional parallax regular-polyhedron three-dimensional displays,” Opt. Express 17(8), 6389–6396 (2009).
    [Crossref] [PubMed]
  35. Y. Lim, K. Hong, H. Kim, H.-E. Kim, E.-Y. Chang, S. Lee, T. Kim, J. Nam, H.-G. Choo, J. Kim, and J. Hahn, “360-degree tabletop electronic holographic display,” Opt. Express 24(22), 24999–25009 (2016).
    [Crossref] [PubMed]
  36. T. Inoue and Y. Takaki, “Table screen 360-degree holographic display using circular viewing-zone scanning,” Opt. Express 23(5), 6533–6542 (2015).
    [Crossref] [PubMed]
  37. Y. Takaki and S. Uchida, “Table screen 360-degree three-dimensional display using a small array of high-speed projectors,” Opt. Express 20(8), 8848–8861 (2012).
    [Crossref] [PubMed]
  38. T. Kakue, T. Nishitsuji, T. Kawashima, K. Suzuki, T. Shimobaba, and T. Ito, “Aerial projection of three-dimensional motion pictures by electro-holography and parabolic mirrors,” Sci. Rep. 5(1), 11750 (2015).
    [Crossref] [PubMed]

2018 (2)

2017 (6)

2016 (2)

2015 (5)

2014 (3)

2013 (4)

2012 (4)

2011 (3)

2010 (2)

2009 (2)

2008 (2)

2006 (1)

2005 (1)

1996 (1)



Figures (9)

Fig. 1. Convergence and accommodation in the binocular visual perception system: (a) global and local coordinates and (b) the adaptive global coordinate system.
Fig. 2. Image transport of an eye system: (a) collinear transform and (b) rotational transform in object space and retina space.
Fig. 3. (a) Schematic design of the computational simulation verifying the non-linear conversion relationship between the two eyes and (b) the retina image and non-linear grid map for the left and right eyes.
Fig. 4. (a) Disparity in observed texture patterns at different locations. (b) The approximately linear relationship among adjacent observation points.
Fig. 5. (a) Schematic diagram for analyzing interocular similarity and (b) comparison of the two grids calculated by the exact and approximate methods.
Fig. 6. Analysis results for interocular similarity: (a) the two comparison grids calculated by the exact and approximate methods, (b) the RMS error over the total area, and (c) the RMS error over the interior area of the triangle.
Fig. 7. The observed image of a full-color CGH calculated using (a) the exact method and (b) the approximate method.
Fig. 8. The elapsed time for calculating the CGHs using (a) the exact method and (b) the approximate method.
Fig. 9. Verification of the accommodation effect of the 3D holographic image: (a) focusing on the cube and (b) focusing on the checkerboard.

Equations (55)


\[ \begin{pmatrix} x_0 \\ y_0 \\ z_0 \end{pmatrix} = \begin{pmatrix} x_c \\ y_c \\ z_c \end{pmatrix} + \begin{pmatrix} \cos\phi\sin\theta \\ \sin\phi\sin\theta \\ \cos\theta \end{pmatrix} t, \tag{1} \]

\[ t = \sqrt{(x_0 - x_c)^2 + (y_0 - y_c)^2 + (z_0 - z_c)^2}, \tag{2} \]

\[ (\cos\theta, \sin\theta) = \left( \frac{z_0 - z_c}{t},\; \frac{\sqrt{(x_0 - x_c)^2 + (y_0 - y_c)^2}}{t} \right), \tag{3} \]

\[ (\cos\phi, \sin\phi) = \left( \frac{x_0 - x_c}{t\sin\theta},\; \frac{y_0 - y_c}{t\sin\theta} \right). \tag{4} \]

\[ \mathbf{v} = \left( \cos\phi\cos\theta\sin\tau - \sin\phi\cos\tau,\; \sin\phi\cos\theta\sin\tau + \cos\phi\cos\tau,\; -\sin\theta\sin\tau \right), \tag{5} \]

\[ f = \frac{d_e t}{d_e + t}, \tag{6} \]
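For concreteness, Eqs. (2)-(4) can be evaluated directly from an eye-rotation center and a fixation point. A minimal numpy sketch follows; the helper name gaze_parameters is ours, not from the paper:

import numpy as np

def gaze_parameters(p0, pc):
    # Eq. (2): gaze distance from the rotation center pc to the fixated point p0.
    d = np.asarray(p0, float) - np.asarray(pc, float)
    t = np.linalg.norm(d)
    # Eq. (3): polar angle from (cos theta, sin theta).
    theta = np.arctan2(np.hypot(d[0], d[1]), d[2])
    # Eq. (4): azimuth, since (cos phi, sin phi) is proportional to (dx, dy).
    phi = np.arctan2(d[1], d[0])
    return t, theta, phi

print(gaze_parameters((1.0, 0.0, 1.0), (0.0, 0.0, 0.0)))  # t = sqrt(2), theta = pi/4, phi = 0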
\[ \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} = \begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{pmatrix} \begin{pmatrix} x - x_0 \\ y - y_0 \\ z - z_0 \end{pmatrix} + \begin{pmatrix} x_0 \\ y_0 \\ z_0 \end{pmatrix}, \tag{7} \]

\[ \begin{pmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{pmatrix} = \begin{pmatrix} \cos\tau & \sin\tau & 0 \\ -\sin\tau & \cos\tau & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} \cos\theta & 0 & -\sin\theta \\ 0 & 1 & 0 \\ \sin\theta & 0 & \cos\theta \end{pmatrix} \begin{pmatrix} \cos\phi & \sin\phi & 0 \\ -\sin\phi & \cos\phi & 0 \\ 0 & 0 & 1 \end{pmatrix}, \tag{8} \]

\[ (x_0', y_0', z_0') = \left( 0,\; 0,\; \sqrt{x_0^2 + y_0^2 + z_0^2} \right). \tag{9} \]
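The three-factor rotation of Eq. (8) composes a torsion tau about the gaze axis with the (theta, phi) gaze rotation. A sketch, assuming the standard right-handed elementary rotations read off above:

import numpy as np

def Rz(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

def Ry(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, -s], [0.0, 1.0, 0.0], [s, 0.0, c]])

# Composite rotation of Eq. (8); the sign conventions are our assumption.
tau, theta, phi = 0.1, 0.4, 0.7
R = Rz(tau) @ Ry(theta) @ Rz(phi)
assert np.allclose(R @ R.T, np.eye(3))  # R is orthonormal, as a rotation must be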
\[ \frac{1}{f} = \frac{1}{d_1 - z_c} + \frac{1}{d_2}. \tag{10} \]

\[ (u, v, w) = \left( -\frac{D_2 x}{D_1},\; -\frac{D_2 y}{D_1},\; D_2 - d_2 \right). \tag{11} \]
\[ \cos\phi\sin\theta\,(x - x_c) + \sin\phi\sin\theta\,(y - y_c) + \cos\theta\,(z - z_c) = 0, \tag{12} \]

\[ \cos\phi_r\sin\theta_r\,(u - u_c) + \sin\phi_r\sin\theta_r\,(v - v_c) + \cos\theta_r\,(w - w_c) = 0, \tag{13} \]

\[ \begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} 0 - u \\ 0 - v \\ -d_1 - (d_1 + d_2 + w) \end{pmatrix} t + \begin{pmatrix} u \\ v \\ d_1 + d_2 + w \end{pmatrix}, \tag{14} \]

\[ t = \frac{ (x_c - u)\cos\phi\sin\theta + (y_c - v)\sin\phi\sin\theta + (z_c - w - d_1 - d_2)\cos\theta }{ -x_r\cos\phi\sin\theta - y_r\sin\phi\sin\theta + \left[ -d_1 - (d_1 + d_2 + z_r) \right]\cos\theta }. \tag{15} \]

\[ \begin{pmatrix} u \\ v \\ w \end{pmatrix} = \frac{1}{1 - t} \begin{pmatrix} x \\ y \\ z - d_1 \end{pmatrix} + \begin{pmatrix} 0 \\ 0 \\ d_2 \end{pmatrix} = s \begin{pmatrix} x \\ y \\ z - d_1 \end{pmatrix} + \begin{pmatrix} 0 \\ 0 \\ d_2 \end{pmatrix}, \tag{16} \]

\[ s = \frac{ u_c\cos\phi_r\sin\theta_r + v_c\sin\phi_r\sin\theta_r + (d_2 + w_c)\cos\theta_r }{ \cos\phi_r\sin\theta_r\, x + \sin\phi_r\sin\theta_r\, y + (z - d_1)\cos\theta_r }. \tag{17} \]
\[ \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} = \begin{pmatrix} \cos\theta\cos\phi & \cos\theta\sin\phi & -\sin\theta \\ -\sin\phi & \cos\phi & 0 \\ \sin\theta\cos\phi & \sin\theta\sin\phi & \cos\theta \end{pmatrix} \begin{pmatrix} x - x_c \\ y - y_c \\ z - z_c \end{pmatrix}. \tag{18} \]

\[ \begin{pmatrix} u' \\ v' \\ w' \end{pmatrix} = \begin{pmatrix} \cos\theta_r\cos\phi_r & \cos\theta_r\sin\phi_r & -\sin\theta_r \\ -\sin\phi_r & \cos\phi_r & 0 \\ \sin\theta_r\cos\phi_r & \sin\theta_r\sin\phi_r & \cos\theta_r \end{pmatrix} \begin{pmatrix} u - u_c \\ v - v_c \\ w - w_c \end{pmatrix}. \tag{19} \]

\[ \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} = \begin{pmatrix} \cos\theta\cos\phi & \cos\theta\sin\phi & -\sin\theta \\ -\sin\phi & \cos\phi & 0 \\ \sin\theta\cos\phi & \sin\theta\sin\phi & \cos\theta \end{pmatrix} \left[ \begin{pmatrix} 0 - u \\ 0 - v \\ -d_1 - (d_1 + d_2 + w) \end{pmatrix} t + \begin{pmatrix} u - x_c \\ v - y_c \\ d_1 + d_2 + w - z_c \end{pmatrix} \right], \tag{20} \]

\[ \begin{pmatrix} u \\ v \\ w \end{pmatrix} = \begin{pmatrix} \cos\theta_r\cos\phi_r & -\sin\phi_r & \sin\theta_r\cos\phi_r \\ \cos\theta_r\sin\phi_r & \cos\phi_r & \sin\theta_r\sin\phi_r \\ -\sin\theta_r & 0 & \cos\theta_r \end{pmatrix} \begin{pmatrix} u' \\ v' \\ w' \end{pmatrix} + \begin{pmatrix} u_c \\ v_c \\ w_c \end{pmatrix}. \tag{21} \]

\[ \begin{pmatrix} u' \\ v' \\ w' \end{pmatrix} = \begin{pmatrix} \cos\theta_r\cos\phi_r & \cos\theta_r\sin\phi_r & -\sin\theta_r \\ -\sin\phi_r & \cos\phi_r & 0 \\ \sin\theta_r\cos\phi_r & \sin\theta_r\sin\phi_r & \cos\theta_r \end{pmatrix} \begin{pmatrix} u - u_c \\ v - v_c \\ w - w_c \end{pmatrix}, \tag{22} \]

\[ \begin{pmatrix} u \\ v \\ w \end{pmatrix} = s \left\{ \begin{pmatrix} \cos\theta_r\cos\phi_r & \cos\theta_r\sin\phi_r & -\sin\theta_r \\ -\sin\phi_r & \cos\phi_r & 0 \\ \sin\theta_r\cos\phi_r & \sin\theta_r\sin\phi_r & \cos\theta_r \end{pmatrix}^{-1} \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} + \begin{pmatrix} x_c \\ y_c \\ z_c \end{pmatrix} \right\} + \begin{pmatrix} 0 \\ 0 \\ d_2 - s d_1 \end{pmatrix}. \tag{23} \]

\[ \begin{pmatrix} x_n' \\ y_n' \\ z_n' \end{pmatrix} = \mathrm{GtoL}_n\, R\, \mathrm{LtoG} \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix}. \tag{24} \]
\[ \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} u'_{\mathrm{ref},1} \\ v'_{\mathrm{ref},1} \end{pmatrix} + \begin{pmatrix} b_1 \\ b_2 \end{pmatrix} = \begin{pmatrix} u'_{\mathrm{adj},1} \\ v'_{\mathrm{adj},1} \end{pmatrix}, \tag{25} \]

\[ \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} u'_{\mathrm{ref},2} \\ v'_{\mathrm{ref},2} \end{pmatrix} + \begin{pmatrix} b_1 \\ b_2 \end{pmatrix} = \begin{pmatrix} u'_{\mathrm{adj},2} \\ v'_{\mathrm{adj},2} \end{pmatrix}, \tag{26} \]

\[ \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} u'_{\mathrm{ref},3} \\ v'_{\mathrm{ref},3} \end{pmatrix} + \begin{pmatrix} b_1 \\ b_2 \end{pmatrix} = \begin{pmatrix} u'_{\mathrm{adj},3} \\ v'_{\mathrm{adj},3} \end{pmatrix}. \tag{27} \]

\[ \begin{pmatrix} u'_{\mathrm{ref},1} & v'_{\mathrm{ref},1} & 0 & 0 & 1 & 0 \\ 0 & 0 & u'_{\mathrm{ref},1} & v'_{\mathrm{ref},1} & 0 & 1 \\ u'_{\mathrm{ref},2} & v'_{\mathrm{ref},2} & 0 & 0 & 1 & 0 \\ 0 & 0 & u'_{\mathrm{ref},2} & v'_{\mathrm{ref},2} & 0 & 1 \\ u'_{\mathrm{ref},3} & v'_{\mathrm{ref},3} & 0 & 0 & 1 & 0 \\ 0 & 0 & u'_{\mathrm{ref},3} & v'_{\mathrm{ref},3} & 0 & 1 \end{pmatrix} \begin{pmatrix} a_{11} \\ a_{12} \\ a_{21} \\ a_{22} \\ b_1 \\ b_2 \end{pmatrix} = \begin{pmatrix} u'_{\mathrm{adj},1} \\ v'_{\mathrm{adj},1} \\ u'_{\mathrm{adj},2} \\ v'_{\mathrm{adj},2} \\ u'_{\mathrm{adj},3} \\ v'_{\mathrm{adj},3} \end{pmatrix}, \tag{28} \]

\[ \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} u'_{\mathrm{ref}} \\ v'_{\mathrm{ref}} \end{pmatrix} + \begin{pmatrix} b_1 \\ b_2 \end{pmatrix} = \begin{pmatrix} u'_{\mathrm{adj}} \\ v'_{\mathrm{adj}} \end{pmatrix}. \tag{29} \]
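Equation (28) is an ordinary 6x6 linear system, so the affine parameters of one triangle follow from its three vertex correspondences in a single solve. A sketch; fit_affine is a hypothetical helper, not the paper's code:

import numpy as np

def fit_affine(ref, adj):
    # Assemble the 6x6 system of Eq. (28) from three corresponding
    # vertices [(u, v), ...] of one triangle in the two views.
    M = np.zeros((6, 6))
    rhs = np.zeros(6)
    for k, ((ur, vr), (ua, va)) in enumerate(zip(ref, adj)):
        M[2 * k] = [ur, vr, 0.0, 0.0, 1.0, 0.0]      # u-equation of Eq. (25)-(27)
        M[2 * k + 1] = [0.0, 0.0, ur, vr, 0.0, 1.0]  # v-equation
        rhs[2 * k], rhs[2 * k + 1] = ua, va
    a11, a12, a21, a22, b1, b2 = np.linalg.solve(M, rhs)
    return np.array([[a11, a12], [a21, a22]]), np.array([b1, b2])

# The fitted map reproduces each vertex: A @ ref_k + b == adj_k.
A, b = fit_affine([(0, 0), (1, 0), (0, 1)], [(0.1, 0.2), (1.05, 0.25), (0.15, 1.3)])
assert np.allclose(A @ np.array([1.0, 0.0]) + b, [1.05, 0.25])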
\[ F(u'_{\mathrm{ref}}, v'_{\mathrm{ref}}) = \iint A_{\mathrm{ref}}(\alpha'_{\mathrm{ref}}, \beta'_{\mathrm{ref}}) \exp\!\left[ j2\pi\!\left( \alpha'_{\mathrm{ref}} u'_{\mathrm{ref}} + \beta'_{\mathrm{ref}} v'_{\mathrm{ref}} \right) \right] d\alpha'_{\mathrm{ref}}\, d\beta'_{\mathrm{ref}}, \tag{30} \]

\[ G(u'_{\mathrm{adj}}, v'_{\mathrm{adj}}) = \iint A_{\mathrm{adj}}(\alpha'_{\mathrm{adj}}, \beta'_{\mathrm{adj}}) \exp\!\left[ j2\pi\!\left( \alpha'_{\mathrm{adj}} u'_{\mathrm{adj}} + \beta'_{\mathrm{adj}} v'_{\mathrm{adj}} \right) \right] d\alpha'_{\mathrm{adj}}\, d\beta'_{\mathrm{adj}}. \tag{31} \]

\[ \begin{pmatrix} u'_{\mathrm{adj}} \\ v'_{\mathrm{adj}} \end{pmatrix} = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} u'_{\mathrm{ref}} \\ v'_{\mathrm{ref}} \end{pmatrix} + \begin{pmatrix} b_1 \\ b_2 \end{pmatrix}. \tag{32} \]

\[ \begin{aligned} G(a_{11} u'_{\mathrm{ref}} + a_{12} v'_{\mathrm{ref}} + b_1,\; a_{21} u'_{\mathrm{ref}} + a_{22} v'_{\mathrm{ref}} + b_2) &= \iint A_{\mathrm{adj}}(\alpha'_{\mathrm{adj}}, \beta'_{\mathrm{adj}}) \exp\{ j2\pi[ \alpha'_{\mathrm{adj}}(a_{11} u'_{\mathrm{ref}} + a_{12} v'_{\mathrm{ref}} + b_1) + \beta'_{\mathrm{adj}}(a_{21} u'_{\mathrm{ref}} + a_{22} v'_{\mathrm{ref}} + b_2) ] \}\, d\alpha'_{\mathrm{adj}}\, d\beta'_{\mathrm{adj}} \\ &= \iint \left\{ A_{\mathrm{adj}}(\alpha'_{\mathrm{adj}}, \beta'_{\mathrm{adj}}) \exp[ j2\pi( \alpha'_{\mathrm{adj}} b_1 + \beta'_{\mathrm{adj}} b_2 ) ] \right\} \exp\{ j2\pi[ (a_{11}\alpha'_{\mathrm{adj}} + a_{21}\beta'_{\mathrm{adj}}) u'_{\mathrm{ref}} + (a_{12}\alpha'_{\mathrm{adj}} + a_{22}\beta'_{\mathrm{adj}}) v'_{\mathrm{ref}} ] \}\, d\alpha'_{\mathrm{adj}}\, d\beta'_{\mathrm{adj}}. \end{aligned} \tag{33} \]

\[ \begin{pmatrix} \alpha'_{\mathrm{ref}} \\ \beta'_{\mathrm{ref}} \end{pmatrix} = \begin{pmatrix} a_{11}\alpha'_{\mathrm{adj}} + a_{21}\beta'_{\mathrm{adj}} \\ a_{12}\alpha'_{\mathrm{adj}} + a_{22}\beta'_{\mathrm{adj}} \end{pmatrix}, \tag{34} \]

\[ \begin{aligned} F(u'_{\mathrm{ref}}, v'_{\mathrm{ref}}) &= \iint A_{\mathrm{ref}}(\alpha'_{\mathrm{ref}}, \beta'_{\mathrm{ref}}) \exp[ j2\pi( \alpha'_{\mathrm{ref}} u'_{\mathrm{ref}} + \beta'_{\mathrm{ref}} v'_{\mathrm{ref}} ) ]\, d\alpha'_{\mathrm{ref}}\, d\beta'_{\mathrm{ref}} \\ &= \iint A_{\mathrm{ref}}(\alpha'_{\mathrm{ref}}, \beta'_{\mathrm{ref}}) \exp[ j2\pi( \alpha'_{\mathrm{ref}} u'_{\mathrm{ref}} + \beta'_{\mathrm{ref}} v'_{\mathrm{ref}} ) ]\, (a_{11} a_{22} - a_{12} a_{21})\, d\alpha'_{\mathrm{adj}}\, d\beta'_{\mathrm{adj}} \\ &= G(a_{11} u'_{\mathrm{ref}} + a_{12} v'_{\mathrm{ref}} + b_1,\; a_{21} u'_{\mathrm{ref}} + a_{22} v'_{\mathrm{ref}} + b_2) \\ &= \iint A_{\mathrm{adj}}(\alpha'_{\mathrm{adj}}, \beta'_{\mathrm{adj}}) \exp[ j2\pi( \alpha'_{\mathrm{adj}} b_1 + \beta'_{\mathrm{adj}} b_2 ) ] \exp[ j2\pi( \alpha'_{\mathrm{ref}} u'_{\mathrm{ref}} + \beta'_{\mathrm{ref}} v'_{\mathrm{ref}} ) ]\, d\alpha'_{\mathrm{adj}}\, d\beta'_{\mathrm{adj}}. \end{aligned} \tag{35} \]

\[ \begin{aligned} A_{\mathrm{adj}}(\alpha'_{\mathrm{adj}}, \beta'_{\mathrm{adj}}) &= \exp[ -j2\pi( \alpha'_{\mathrm{adj}} b_1 + \beta'_{\mathrm{adj}} b_2 ) ]\, (a_{11} a_{22} - a_{12} a_{21})\, A_{\mathrm{ref}}(\alpha'_{\mathrm{ref}}, \beta'_{\mathrm{ref}}) \\ &= \exp[ -j2\pi( \alpha'_{\mathrm{adj}} b_1 + \beta'_{\mathrm{adj}} b_2 ) ]\, (a_{11} a_{22} - a_{12} a_{21})\, A_{\mathrm{ref}}( a_{11}\alpha'_{\mathrm{adj}} + a_{21}\beta'_{\mathrm{adj}},\; a_{12}\alpha'_{\mathrm{adj}} + a_{22}\beta'_{\mathrm{adj}} ). \end{aligned} \tag{36} \]

\[ \begin{aligned} G(u'_{\mathrm{adj}}, v'_{\mathrm{adj}}) &= \iint \exp[ -j2\pi( \alpha'_{\mathrm{adj}} b_1 + \beta'_{\mathrm{adj}} b_2 ) ]\, (a_{11} a_{22} - a_{12} a_{21})\, A_{\mathrm{ref}}( a_{11}\alpha'_{\mathrm{adj}} + a_{21}\beta'_{\mathrm{adj}},\; a_{12}\alpha'_{\mathrm{adj}} + a_{22}\beta'_{\mathrm{adj}} ) \exp[ j2\pi( \alpha'_{\mathrm{adj}} u'_{\mathrm{adj}} + \beta'_{\mathrm{adj}} v'_{\mathrm{adj}} ) ]\, d\alpha'_{\mathrm{adj}}\, d\beta'_{\mathrm{adj}} \\ &= \iint A_{\mathrm{adj}}(\alpha'_{\mathrm{adj}}, \beta'_{\mathrm{adj}}) \exp[ j2\pi( \alpha'_{\mathrm{adj}} u'_{\mathrm{adj}} + \beta'_{\mathrm{adj}} v'_{\mathrm{adj}} ) ]\, d\alpha'_{\mathrm{adj}}\, d\beta'_{\mathrm{adj}}. \end{aligned} \tag{37} \]
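Equations (34)-(36) say that the adjacent-view spectrum is the reference spectrum resampled on affinely sheared frequencies, scaled by the Jacobian determinant and multiplied by a linear phase from the shift (b1, b2). A rough numpy sketch under that reading; nearest-neighbour resampling stands in for proper interpolation, and the helper name is ours:

import numpy as np

def affine_spectrum(A_ref, alphas, betas, A, b):
    # Eq. (36): resample the reference spectrum at the sheared frequencies
    # of Eq. (34), scale by det(A), and apply the shift-induced phase.
    det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]
    aa, bb = np.meshgrid(alphas, betas, indexing="ij")  # adjacent-view grid
    a_ref = A[0, 0] * aa + A[1, 0] * bb  # Eq. (34): note the transposed coefficients
    b_ref = A[0, 1] * aa + A[1, 1] * bb
    ia = np.clip(np.searchsorted(alphas, a_ref), 0, alphas.size - 1)
    ib = np.clip(np.searchsorted(betas, b_ref), 0, betas.size - 1)
    phase = np.exp(-2j * np.pi * (aa * b[0] + bb * b[1]))
    return phase * det * A_ref[ia, ib]

# Example on a small grid with a near-identity affine map.
alphas = betas = np.linspace(-1.0, 1.0, 64)
A_ref = np.exp(-alphas[:, None]**2 - betas[None, :]**2).astype(complex)
A_adj = affine_spectrum(A_ref, alphas, betas,
                        np.array([[1.02, 0.01], [0.00, 0.98]]), (0.1, -0.05))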
\[ \begin{aligned} \begin{pmatrix} u'_{\mathrm{adj}} \\ v'_{\mathrm{adj}} \\ w'_{\mathrm{adj}} \end{pmatrix} &= \begin{pmatrix} \cos\theta_{\mathrm{adj}} & 0 & -\sin\theta_{\mathrm{adj}} \\ 0 & 1 & 0 \\ \sin\theta_{\mathrm{adj}} & 0 & \cos\theta_{\mathrm{adj}} \end{pmatrix} \begin{pmatrix} \cos\phi_{\mathrm{adj}} & \sin\phi_{\mathrm{adj}} & 0 \\ -\sin\phi_{\mathrm{adj}} & \cos\phi_{\mathrm{adj}} & 0 \\ 0 & 0 & 1 \end{pmatrix} \begin{pmatrix} u_{\mathrm{adj}} - u_{\mathrm{adj},c} \\ v_{\mathrm{adj}} - v_{\mathrm{adj},c} \\ w_{\mathrm{adj}} - w_{\mathrm{adj},c} \end{pmatrix} \\ &= \begin{pmatrix} \cos\theta_{\mathrm{adj}}\cos\phi_{\mathrm{adj}} & \cos\theta_{\mathrm{adj}}\sin\phi_{\mathrm{adj}} & -\sin\theta_{\mathrm{adj}} \\ -\sin\phi_{\mathrm{adj}} & \cos\phi_{\mathrm{adj}} & 0 \\ \sin\theta_{\mathrm{adj}}\cos\phi_{\mathrm{adj}} & \sin\theta_{\mathrm{adj}}\sin\phi_{\mathrm{adj}} & \cos\theta_{\mathrm{adj}} \end{pmatrix} \begin{pmatrix} u_{\mathrm{adj}} - u_{\mathrm{adj},c} \\ v_{\mathrm{adj}} - v_{\mathrm{adj},c} \\ w_{\mathrm{adj}} - w_{\mathrm{adj},c} \end{pmatrix}. \end{aligned} \tag{38} \]

\[ \begin{aligned} W(u_{\mathrm{adj}}, v_{\mathrm{adj}}, 0) &= \eta_0 \exp\{ j2\pi[ \alpha_0 (u_{\mathrm{adj}} + u_{\mathrm{adj},c}) + \beta_0 (v_{\mathrm{adj}} + v_{\mathrm{adj},c}) + \gamma'_0 w_{\mathrm{adj},c} ] \} \iint A_{\mathrm{adj@L}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) \exp[ j2\pi( \alpha_{\mathrm{adj}} u_{\mathrm{adj}} + \beta_{\mathrm{adj}} v_{\mathrm{adj}} ) ]\, d\alpha_{\mathrm{adj}}\, d\beta_{\mathrm{adj}} \\ &= \eta_0 \exp[ j2\pi( \alpha_0 u_{\mathrm{adj},c} + \beta_0 v_{\mathrm{adj},c} + \gamma'_0 w_{\mathrm{adj},c} ) ] \iint A_{\mathrm{adj@L}}(\alpha_{\mathrm{adj}} - \alpha_0, \beta_{\mathrm{adj}} - \beta_0) \exp[ j2\pi( \alpha_{\mathrm{adj}} u_{\mathrm{adj}} + \beta_{\mathrm{adj}} v_{\mathrm{adj}} ) ]\, d\alpha_{\mathrm{adj}}\, d\beta_{\mathrm{adj}}. \end{aligned} \tag{39} \]

\[ \begin{aligned} W(u_{\mathrm{adj}}, v_{\mathrm{adj}}, w_{\mathrm{adj}}) &= \eta_0 \exp[ j2\pi( \alpha_0 u_{\mathrm{adj},c} + \beta_0 v_{\mathrm{adj},c} + \gamma_0 w_{\mathrm{adj},c} ) ] \\ &\quad \times \iint A_{\mathrm{adj@L}}(\alpha_{\mathrm{adj}} - \alpha_0, \beta_{\mathrm{adj}} - \beta_0) \exp[ j2\pi( \alpha_{\mathrm{adj}} u_{\mathrm{adj}} + \beta_{\mathrm{adj}} v_{\mathrm{adj}} + \gamma_{\mathrm{adj}} w_{\mathrm{adj}} ) ]\, d\alpha_{\mathrm{adj}}\, d\beta_{\mathrm{adj}}. \end{aligned} \tag{40} \]
\[ \alpha'_{\mathrm{adj}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) = \cos\theta_{\mathrm{adj}}\cos\phi_{\mathrm{adj}}\, \alpha_{\mathrm{adj}} + \cos\theta_{\mathrm{adj}}\sin\phi_{\mathrm{adj}}\, \beta_{\mathrm{adj}} - \sin\theta_{\mathrm{adj}}\, \gamma_{\mathrm{adj}}, \tag{41} \]

\[ \beta'_{\mathrm{adj}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) = -\sin\phi_{\mathrm{adj}}\, \alpha_{\mathrm{adj}} + \cos\phi_{\mathrm{adj}}\, \beta_{\mathrm{adj}}, \tag{42} \]

\[ \gamma'_{\mathrm{adj}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) = \sin\theta_{\mathrm{adj}}\cos\phi_{\mathrm{adj}}\, \alpha_{\mathrm{adj}} + \sin\theta_{\mathrm{adj}}\sin\phi_{\mathrm{adj}}\, \beta_{\mathrm{adj}} + \cos\theta_{\mathrm{adj}}\, \gamma_{\mathrm{adj}}. \tag{43} \]

\[ d\alpha'_{\mathrm{adj}}\, d\beta'_{\mathrm{adj}} = |J|\, d\alpha_{\mathrm{adj}}\, d\beta_{\mathrm{adj}} = \left| \cos\theta_{\mathrm{adj}} + \sin\theta_{\mathrm{adj}} \frac{ \alpha_{\mathrm{adj}}\cos\phi_{\mathrm{adj}} + \beta_{\mathrm{adj}}\sin\phi_{\mathrm{adj}} }{ \gamma_{\mathrm{adj}} } \right| d\alpha_{\mathrm{adj}}\, d\beta_{\mathrm{adj}}. \tag{44} \]
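Equations (41)-(44) amount to rotating each propagating frequency vector into the local frame and weighting it by the Jacobian of the (alpha, beta) to (alpha', beta') map. A sketch, assuming gamma = sqrt(1/lambda^2 - alpha^2 - beta^2) for propagating components; the paper's H(gamma') cutoff is approximated here simply by masking evanescent samples:

import numpy as np

def rotated_frequencies(alpha, beta, theta, phi, wavelength):
    # Longitudinal frequency of the angular spectrum (our assumption).
    gamma2 = 1.0 / wavelength**2 - alpha**2 - beta**2
    ok = gamma2 > 0.0                      # propagating components only
    gamma = np.sqrt(np.where(ok, gamma2, np.nan))
    # Eqs. (41)-(42): transverse frequencies in the rotated local frame.
    a_loc = (np.cos(theta) * np.cos(phi) * alpha
             + np.cos(theta) * np.sin(phi) * beta
             - np.sin(theta) * gamma)
    b_loc = -np.sin(phi) * alpha + np.cos(phi) * beta
    # Eq. (44): Jacobian weight of the frequency remapping.
    jac = np.abs(np.cos(theta)
                 + np.sin(theta) * (alpha * np.cos(phi) + beta * np.sin(phi)) / gamma)
    return a_loc, b_loc, jac, ok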
\[ \begin{aligned} W(u_{\mathrm{adj}}, v_{\mathrm{adj}}, w_{\mathrm{adj}}) &= \eta_0 \exp[ j2\pi( \alpha_0 u_{\mathrm{adj},c} + \beta_0 v_{\mathrm{adj},c} + \gamma_0 w_{\mathrm{adj},c} ) ] \\ &\quad \times \iint A_{\mathrm{adj@L}}\big( \alpha'_{\mathrm{adj}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) - \alpha'_0(\alpha_0, \beta_0),\; \beta'_{\mathrm{adj}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) - \beta'_0(\alpha_0, \beta_0) \big)\, H\big( \gamma'_{\mathrm{adj}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) \big) \\ &\quad \times \exp\{ j2\pi[ \alpha_{\mathrm{adj}}(u_{\mathrm{adj}} - u_{\mathrm{adj},c}) + \beta_{\mathrm{adj}}(v_{\mathrm{adj}} - v_{\mathrm{adj},c}) + \gamma_{\mathrm{adj}}(w_{\mathrm{adj}} - w_{\mathrm{adj},c}) ] \} \\ &\quad \times \left| \cos\theta_{\mathrm{adj}} + \sin\theta_{\mathrm{adj}} \frac{ \alpha_{\mathrm{adj}}\cos\phi_{\mathrm{adj}} + \beta_{\mathrm{adj}}\sin\phi_{\mathrm{adj}} }{ \gamma_{\mathrm{adj}} } \right| d\alpha_{\mathrm{adj}}\, d\beta_{\mathrm{adj}}. \end{aligned} \tag{45} \]

\[ \begin{aligned} A_{\mathrm{adj@G}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) &= \eta_0 \exp[ j2\pi( \alpha_0 u_{\mathrm{adj},c} + \beta_0 v_{\mathrm{adj},c} + \gamma_0 w_{\mathrm{adj},c} ) ] \\ &\quad \times A_{\mathrm{adj@L}}\big( \alpha'_{\mathrm{adj}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) - \alpha'_0(\alpha_0, \beta_0),\; \beta'_{\mathrm{adj}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) - \beta'_0(\alpha_0, \beta_0) \big) \\ &\quad \times H\big( \gamma'_{\mathrm{adj}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) \big) \exp\{ -j2\pi[ \alpha_{\mathrm{adj}} u_{\mathrm{adj},c} + \beta_{\mathrm{adj}} v_{\mathrm{adj},c} + \gamma_{\mathrm{adj}} w_{\mathrm{adj},c} ] \} \\ &\quad \times \left| \cos\theta_{\mathrm{adj}} + \sin\theta_{\mathrm{adj}} \frac{ \alpha_{\mathrm{adj}}\cos\phi_{\mathrm{adj}} + \beta_{\mathrm{adj}}\sin\phi_{\mathrm{adj}} }{ \gamma_{\mathrm{adj}} } \right|. \end{aligned} \tag{46} \]

\[ \begin{aligned} A_{\mathrm{adj@L}}\big( \alpha'_{\mathrm{adj}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) - \alpha'_0(\alpha_0, \beta_0),\; \beta'_{\mathrm{adj}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) - \beta'_0(\alpha_0, \beta_0) \big) &= A_{\mathrm{adj@L}}(\alpha''_{\mathrm{adj}}, \beta''_{\mathrm{adj}}) \\ &= \exp[ -j2\pi( \alpha''_{\mathrm{adj}} b_1 + \beta''_{\mathrm{adj}} b_2 ) ]\, (a_{11} a_{22} - a_{12} a_{21}) \\ &\quad \times A_{\mathrm{ref}}( a_{11}\alpha''_{\mathrm{adj}} + a_{21}\beta''_{\mathrm{adj}},\; a_{12}\alpha''_{\mathrm{adj}} + a_{22}\beta''_{\mathrm{adj}} ), \end{aligned} \tag{47} \]

\[ \alpha''_{\mathrm{adj}} = \alpha'_{\mathrm{adj}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) - \alpha'_0(\alpha_0, \beta_0), \tag{48} \]

\[ \beta''_{\mathrm{adj}} = \beta'_{\mathrm{adj}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) - \beta'_0(\alpha_0, \beta_0). \tag{49} \]

\[ \begin{aligned} A_{\mathrm{adj@G}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) &= \eta_0 \exp[ j2\pi( \alpha_0 u_{\mathrm{adj},c} + \beta_0 v_{\mathrm{adj},c} + \gamma_0 w_{\mathrm{adj},c} ) ]\, H\big( \gamma'_{\mathrm{adj}}(\alpha_{\mathrm{adj}}, \beta_{\mathrm{adj}}) \big) \\ &\quad \times \exp\{ -j2\pi[ \alpha_{\mathrm{adj}} u_{\mathrm{adj},c} + \beta_{\mathrm{adj}} v_{\mathrm{adj},c} + \gamma_{\mathrm{adj}} w_{\mathrm{adj},c} ] \} \\ &\quad \times \left| \cos\theta_r + \sin\theta_r \frac{ \alpha_r\cos\phi_r + \beta_r\sin\phi_r }{ \gamma_r } \right| \exp[ -j2\pi( \alpha''_{\mathrm{adj}} b_1 + \beta''_{\mathrm{adj}} b_2 ) ] \\ &\quad \times (a_{11} a_{22} - a_{12} a_{21})\, A_{\mathrm{ref}}( a_{11}\alpha''_{\mathrm{adj}} + a_{21}\beta''_{\mathrm{adj}},\; a_{12}\alpha''_{\mathrm{adj}} + a_{22}\beta''_{\mathrm{adj}} ). \end{aligned} \tag{50} \]
\[ \begin{pmatrix} x_n' \\ y_n' \\ z_n' \end{pmatrix} = \begin{pmatrix} \cos\theta_n\cos\phi_n & \cos\theta_n\sin\phi_n & -\sin\theta_n \\ -\sin\phi_n & \cos\phi_n & 0 \\ \sin\theta_n\cos\phi_n & \sin\theta_n\sin\phi_n & \cos\theta_n \end{pmatrix} \begin{pmatrix} x_n - x_{nc} \\ y_n - y_{nc} \\ z_n - z_{nc} \end{pmatrix} = \mathrm{GtoL}_n \begin{pmatrix} x_n - x_{nc} \\ y_n - y_{nc} \\ z_n - z_{nc} \end{pmatrix}, \tag{51} \]

\[ \begin{pmatrix} x_n' \\ y_n' \\ z_n' \end{pmatrix} = \mathrm{GtoL}_n \left[ \begin{pmatrix} x_n - x_{n0} \\ y_n - y_{n0} \\ z_n - z_{n0} \end{pmatrix} + \begin{pmatrix} x_{n0} - x_{nc} \\ y_{n0} - y_{nc} \\ z_{n0} - z_{nc} \end{pmatrix} \right], \tag{52} \]

\[ \begin{pmatrix} x_n' \\ y_n' \\ z_n' \end{pmatrix} = \mathrm{GtoL}_n \left[ R \begin{pmatrix} x - x_0 \\ y - y_0 \\ z - z_0 \end{pmatrix} + \begin{pmatrix} x_{n0} - x_{nc} \\ y_{n0} - y_{nc} \\ z_{n0} - z_{nc} \end{pmatrix} \right]. \tag{53} \]

\[ \begin{pmatrix} x - x_c \\ y - y_c \\ z - z_c \end{pmatrix} = \begin{pmatrix} \cos\theta\cos\phi & -\sin\phi & \cos\phi\sin\theta \\ \sin\phi\cos\theta & \cos\phi & \sin\phi\sin\theta \\ -\sin\theta & 0 & \cos\theta \end{pmatrix} \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} = \mathrm{LtoG} \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix}. \tag{54} \]

\[ \begin{aligned} \begin{pmatrix} x_n' \\ y_n' \\ z_n' \end{pmatrix} &= \mathrm{GtoL}_n \left\{ R \left[ \mathrm{LtoG} \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} + \begin{pmatrix} x_c - x_0 \\ y_c - y_0 \\ z_c - z_0 \end{pmatrix} \right] + \begin{pmatrix} x_{n0} - x_{nc} \\ y_{n0} - y_{nc} \\ z_{n0} - z_{nc} \end{pmatrix} \right\} \\ &= \mathrm{GtoL}_n\, R\, \mathrm{LtoG} \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix} + \mathrm{GtoL}_n \left[ R \begin{pmatrix} x_c - x_0 \\ y_c - y_0 \\ z_c - z_0 \end{pmatrix} + \begin{pmatrix} x_{n0} - x_{nc} \\ y_{n0} - y_{nc} \\ z_{n0} - z_{nc} \end{pmatrix} \right] \\ &= \mathrm{GtoL}_n\, R\, \mathrm{LtoG} \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix}, \end{aligned} \tag{55} \]
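Since GtoL_n and LtoG in Eqs. (51)-(55) are mutually transposed orthonormal matrices, the chain of Eq. (55) collapses to a single 3x3 rotation applied per polygon vertex, which is where the reconfiguration speed-up comes from. A sketch of the composition; the angle values and the identity placeholder for R are illustrative only:

import numpy as np

def GtoL(theta, phi):
    # Global-to-local rotation of the form appearing in Eqs. (18) and (51).
    ct, st, cp, sp = np.cos(theta), np.sin(theta), np.cos(phi), np.sin(phi)
    return np.array([[ct * cp, ct * sp, -st],
                     [-sp, cp, 0.0],
                     [st * cp, st * sp, ct]])

# LtoG of Eq. (54) is the transpose (= inverse) of the matrix in Eq. (18),
# so the chain GtoL_n R LtoG of Eq. (55) is one orthonormal 3x3 matrix.
theta, phi, theta_n, phi_n = 0.3, 0.8, 0.5, -0.4
R = np.eye(3)  # placeholder for the interocular rotation of Eqs. (7)-(8)
chain = GtoL(theta_n, phi_n) @ R @ GtoL(theta, phi).T
assert np.allclose(chain @ chain.T, np.eye(3))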
