
Experimental three-dimensional optical image reconstruction of heterogeneous turbid media from continuous-wave data

Open Access

Abstract

This paper demonstrates experimental three-dimensional (3D) image reconstruction of optically heterogeneous turbid media from near-infrared continuous-wave measurements. Successful reconstruction is achieved with a full 3D, finite-element-based, Newton-type reconstruction algorithm. Our experimental evidence shows that both the absorption and scattering images of a 10×15 mm cylindrical object embedded in a 50×50 mm cylindrical background can be reconstructed using this algorithm.

©2000 Optical Society of America

1. Introduction

Reconstruction-based optical diffusion tomography offers clinical potential for applications such as breast cancer detection and brain functional imaging, primarily because it can form an image of the spatial distribution of the absorption and scattering properties of tissue, providing both structural and functional information [1]. In this type of indirect imaging method, an effective reconstruction algorithm is crucial. While various linear algorithms have been developed [2–4], attention is increasingly turning to iterative, nonlinear reconstruction methods [5–8], which can overcome the systematic errors caused by the linear approximation [9].

To date, nonlinear methods have been experimentally limited to two-dimensional (2D) problems [5–8]. While quantitative 2D images have been achieved using tissue phantoms in the laboratory [5,6], "infinitely" long cylinders have been used as the objects in these experiments. Since these 2D schemes and experiments ignore the essential 3D nature of light scattering in tissue, some distortion of the object shape and size and degradation of the recovered absorption and scattering coefficients would be anticipated when the object has a finite volume, such as a tumor in a human breast. This necessitates considering optical image reconstruction in a 3D framework. 3D optical image reconstructions have been primarily limited to simulations [10–12]. 3D experimental work from time-resolved data has begun to appear very recently [13]; however, the data collection relied on an expensive time-domain system. In this paper, we report for the first time an experimental confirmation of full 3D reconstruction using continuous-wave or dc data. Compared with time-domain based methods, the continuous-wave method uses the simplest, most economical steady-state optical components. In such a system, only the diffused light intensity needs to be measured for image reconstruction, without any expensive detection techniques or complicated data acquisition software. Because of this simplicity and the high signal-to-noise ratio available from time-independent light intensity measurements, imaging using intensity data only may prove considerably easier to implement clinically, and more robust with respect to the reliability of the data collected in the clinical environment, than its time-domain counterpart. Both absorption and scattering images are obtained using absolute dc data without invoking a calibration measurement on a homogeneous phantom.

2. Reconstruction algorithm

Our 3D nonlinear reconstruction algorithm is based on finite element methods. The algorithm uses a regularized Newton method to iteratively update an initial (guess) optical property distribution in order to minimize an objective function composed of a weighted sum of the squared differences between computed and measured data. While the details of the algorithm can be found in Ref. 10, here we present a brief overview for context.
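For context, this objective function has the generic weighted least-squares form shown below; the weights $w_i$ are placeholders, since the specific weighting is not spelled out in this overview:

$$F(D,\mu_a) = \sum_{i=1}^{M} w_i \left(\Phi_i^c - \Phi_i^o\right)^2,$$

where $\Phi_i^c$ and $\Phi_i^o$ are the computed and measured photon densities at the $M$ boundary measurement sites (defined formally below).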

Given a time-independent incident beam, light propagation in tissue can be described by the well-known diffusion equation [1]:

$$\nabla \cdot D(x,y,z)\,\nabla\Phi(x,y,z) - \mu_a(x,y,z)\,\Phi(x,y,z) = S(x,y,z) \qquad (1)$$

where Φ is the photon density, D is the diffusion coefficient, µa is the absorption coefficient, and S is the source term. The diffusion coefficient can be written as D = 1/[3(µs′ + µa)], where µs′ is the reduced scattering coefficient (since µs′ >> µa in turbid media, we work directly with D and µa during reconstruction). In this study, a point source, S = S0δ(x−x0, y−y0, z−z0), and Type III boundary conditions (BCs), −D∇Φ·n̂ = αΦ, are used, where S0 expresses the strength of the source, n̂ is the outward unit normal at the boundary, and α is related to the internal reflection at the boundary.
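As a quick numerical illustration of this relation (using the background phantom values quoted in Section 3; the paper itself does not print D explicitly), a minimal Python sketch:

```python
# D = 1 / [3 (mu_s' + mu_a)] for the background phantom of Section 3.
mu_a = 0.005       # absorption coefficient, 1/mm
mu_s_prime = 1.0   # reduced scattering coefficient, 1/mm

D = 1.0 / (3.0 * (mu_s_prime + mu_a))
print(f"D = {D:.3f} mm")   # ~0.332 mm
```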

Making use of a finite element discretization, we can obtain a discrete matrix equation for Eq. (1) and, through differentiation, derive further matrix relations that lead to a set of equations suitable for solving the inverse problem [5,6,14]:

$$[A]\{\Phi\} = \{b\} \qquad (2)$$

$$[A]\left\{\frac{\partial \Phi}{\partial \chi}\right\} = \left\{\frac{\partial b}{\partial \chi}\right\} - \left[\frac{\partial A}{\partial \chi}\right]\{\Phi\} \qquad (3)$$

$$\left(\Im^{T}\Im + \lambda I\right)\Delta\chi = \Im^{T}\left(\Phi^{o} - \Phi^{c}\right) \qquad (4)$$

where the elements of matrix [A] are $a_{ij} = \langle -D\nabla\phi_j \cdot \nabla\phi_i - \mu_a \phi_j \phi_i \rangle$, with ⟨ ⟩ symbolizing integration over the 3D problem domain; ϕi and ϕj are locally spatially-varying Lagrangian basis functions; χ expresses D or µa; ℑ is the Jacobian matrix, formed from ∂Φ/∂χ at the boundary measurement sites; Δχ = (ΔD1, ΔD2, …, ΔDN, Δµa,1, Δµa,2, …, Δµa,N)ᵀ is the update vector for the optical property profiles, where N is the total number of nodes in the finite element mesh; and Φᵒ = (Φ1ᵒ, Φ2ᵒ, …, ΦMᵒ)ᵀ and Φᶜ = (Φ1ᶜ, Φ2ᶜ, …, ΦMᶜ)ᵀ, where Φiᵒ and Φiᶜ are the observed and calculated data at the i = 1, 2, …, M boundary locations. Note that in order to estimate D and µa spatially, we expand these quantities, in a similar fashion to Φ, as a finite sum of unknown coefficients multiplied by the locally defined Lagrangian basis functions. In optical imaging, D and µa are unknown, and Φ is also generally not known except possibly at a finite number of measurement sites. The basic idea for determining D and µa is to measure the diffused light around the perimeter of the problem domain for a set of known optical excitation positions. The image formation task is then to estimate the D and µa distributions (updated and improved through solution of Eq. (4)) that are required to sustain the measured boundary quantities. Since the matrix ℑᵀℑ is known to be ill-conditioned [5–8], a way to regularize or stabilize its decomposition is needed. In this study we have used a hybrid technique that synthesizes standard Marquardt and Tikhonov regularization schemes.
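To make the update step concrete, the following sketch solves Eq. (4) for Δχ with a plain NumPy dense solve; the function names, the outer-loop structure, and the use of a single scalar λ are illustrative assumptions rather than the authors' actual implementation, and the finite element forward solver and Jacobian assembly are left as placeholders:

```python
import numpy as np

def newton_update(J, phi_obs, phi_calc, lam):
    """One regularized Newton step, Eq. (4):
    (J^T J + lam I) dchi = J^T (phi_o - phi_c).

    J        : (M, 2N) Jacobian of boundary data w.r.t. (D_1..D_N, mu_a_1..mu_a_N)
    phi_obs  : (M,)  measured boundary photon densities
    phi_calc : (M,)  boundary photon densities computed from the current estimate
    lam      : regularization parameter (a Marquardt/Tikhonov hybrid in the paper)
    """
    JtJ = J.T @ J
    rhs = J.T @ (phi_obs - phi_calc)
    return np.linalg.solve(JtJ + lam * np.eye(JtJ.shape[0]), rhs)

# Illustrative outer loop; forward_solve and build_jacobian are hypothetical
# stand-ins for the finite element solution of Eqs. (2) and (3):
#
#   chi = initial_homogeneous_guess()
#   for it in range(5):                        # 5 iterations were used in this study
#       phi_calc = forward_solve(chi)          # Eq. (2)
#       J = build_jacobian(chi)                # Eq. (3), at the measurement sites
#       chi += newton_update(J, phi_obs, phi_calc, lam)
```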

Fig. 1. Experimental geometry under study, with an off-centered object close to the x axis. The source/detector locations are also shown (right). All dimensions are in millimeters.

3. Reconstruction of 3D absorption and scattering images

Our experimental setup was an automated multi-channel frequency-domain system described in detail elsewhere [15]. Briefly, intensity-modulated light from a 785-nm, 50 mW diode laser was delivered sequentially to the phantom through 16 fiber optic bundles, each 3 mm in diameter. For each source position, the diffused light was received at 4×16 detector positions along the surface of the cylindrical phantom (see Fig. 1 for the source/detector arrangement). The dc, ac and phase-shift signals were obtained using a standard heterodyne technique controlled by FFT LabVIEW routines. For each experimental configuration, a total of 3×1024 measurements (1024 each of dc, ac and phase) were made, which currently takes about 32 minutes. Only the measured dc data were used to reconstruct the absorption and scattering images in this study. We used a 50×50 mm cylindrical solid phantom (Intralipid + India ink + agar) as the background medium, with an absorption coefficient µa = 0.005/mm and a reduced scattering coefficient µs′ = 1.0/mm. An off-centered 10×15 mm cylindrical solid object was embedded in the background medium (see Fig. 1). Two cases with different optical contrast between the object and background were tested. The object in the first case was a pure absorber with µa = 0.025/mm and µs′ = 1.0/mm; its center was located at (13, 0, 0). In the second case, both µa and µs′ of the object differed from the background: µa = 0.025/mm and µs′ = 3.0/mm; the center of this object was located at (13, -4, 0). The 3D finite element mesh used for both forward and inverse solutions had 3341 nodes and 16128 tetrahedral elements. The initial estimate of the optical properties used for the reconstruction was a homogeneous distribution 20% off the exact background values. The images reported are the results of 5 iterations, with about 3 hours per iteration when reconstructing µa only (µs′ of the object was assumed known in the reconstruction for the pure-absorber case) and 10 hours per iteration when reconstructing both µa and µs′, on a 600 MHz Pentium III PC.
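As a back-of-the-envelope check on why regularization is essential for this configuration (the counts below follow from the setup just described; organizing the detectors as 4 rings of 16 is our reading of Fig. 1):

```python
# dc measurement count vs. unknown count for the simultaneous mu_a / mu_s' case.
n_sources = 16
n_detectors_per_source = 4 * 16                       # 4 x 16 detector positions
n_measurements = n_sources * n_detectors_per_source   # 1024 dc readings

n_nodes = 3341                                        # finite element mesh nodes
n_unknowns = 2 * n_nodes                              # D and mu_a at every node

print(n_measurements, n_unknowns)                     # 1024 vs 6682: far fewer
                                                      # equations than unknowns,
                                                      # hence the Marquardt/Tikhonov
                                                      # regularization in Eq. (4)
```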

Fig. 2. Reconstructed µa images at different cut-planes for the first test case. Refer to Fig. 1 for the definition of the cut-planes.

Fig. 3. Reconstructed µa images at different cut-planes for the second test case. Refer to Fig. 1 for the definition of the cut-planes.

Fig. 4. Reconstructed µs′ images at different cut-planes for the second test case. Refer to Fig. 1 for the definition of the cut-planes. Note that a small boundary region at the bottom was cut off from the images shown in (c) and (d) in order to highlight the recovered object region.

Fig. 2 shows the reconstructed images for the pure absorbing object at different cut-planes (see Fig. 1 for the reference coordinate system). The pink-colored region (at 3 o'clock in Fig. 2(a,b), and at the middle right and middle of Fig. 2(c) and (d), respectively) clearly indicates the location and size of the object. Table 1 below provides quantitative information about the center and the full width at half maximum (FWHM) of the object, as well as the recovered average optical properties of the object, for the images shown in Fig. 2 and Figs. 3–4. While the recovered image is quantitatively approximate in terms of the location, size and absorption coefficient of the object, some artifacts clearly appear in the background region. These artifacts generally appear near the sources and detectors, where the measurement sensitivity is highest. We believe that measurement noise, magnified by the inherent ill-conditioned nature of the inverse problem, was the primary cause of these artifacts. They could be minimized by controlling the measurement noise and by using improved regularization schemes such as spatially variant regularization [6]. Simultaneously reconstructed µa and µs′ images for the second case at multiple cut-planes are shown in Figs. 3 and 4, respectively. Again, these recovered images are quantitatively approximate not only with respect to the location and size of the object, but also in terms of its optical properties (see Table 1). It is interesting to note that there is almost no artifact in the µa image for this case, while some artifacts are clearly visible in the µs′ image (even stronger artifacts, not shown in Fig. 4(c) and (d), appeared at the bottom boundary of the cut-planes x=4 and y=15 mm). It is further noted that the recovered µs′ value of the object is far lower than the exact value (about 1.4/mm vs. 3.0/mm). It appears that the quality of the µa image is improved at the expense of the µs′ image quality. Although the µs′ image quality is not as good as that of the µa image, we have provided experimental evidence that both µa and µs′ images can be reconstructed simultaneously using dc data only.

Table 1. Recovered Geometric Information and Optical Properties of the Object

X, Y and Z stand for the x, y and z coordinates of the object center, respectively. Lx, Ly and Lz are the FWHM of the object along the x, y and z directions, respectively.

4. Conclusions

In summary, we have experimentally demonstrated that quantitative reconstruction of 3D absorption and scattering images can be achieved using dc data. It is important to note that the 3D images reconstructed in this paper have been realized from absolute measured data without invoking any calibration/reference measurement.

Acknowledgements

This research was supported in part by grants from the National Institutes of Health (NIH) (CA 78334), the Department of Defense (DOD) (BC 980050), and the Greenville Hospital System/Clemson University Biomedical Cooperative.

References and links

1. A. Yodh and B. Chance, "Spectroscopy and imaging with diffusing light," Phys. Today 48, 34–40 (1995).

2. M. O'Leary, D. Boas, B. Chance, and A. Yodh, "Experimental images of heterogeneous turbid media by frequency-domain diffusing-photon tomography," Opt. Lett. 20, 426–428 (1995).

3. W. Cai, S. Gayen, M. Xu, M. Zevallos, M. Alrubaiee, M. Lax, and R. Alfano, "Optical tomographic image reconstruction from ultrafast time-sliced transmission measurements," Appl. Opt. 38, 4237–4246 (1999).

4. S. Colak, D. Papaioannou, G. 't Hooft, M. van der Mark, H. Schomberg, J. Paasschens, J. Melissen, and N. van Asten, "Tomographic image reconstruction from optical projections in light diffusing media," Appl. Opt. 36, 180–213 (1997).

5. H. Jiang, K. Paulsen, U. Osterberg, B. Pogue, and M. Patterson, "Simultaneous reconstruction of absorption and scattering maps in turbid media from near-infrared frequency-domain data," Opt. Lett. 20, 2128–2130 (1995).

6. B. Pogue, T. McBride, J. Prewitt, U. Osterberg, and K. Paulsen, "Spatially variant regularization improves diffuse optical tomography," Appl. Opt. 38, 2950–2961 (1999).

7. J. Hebden, F. Schmidt, M. Fry, M. Schweiger, E. Hillman, D. Delpy, and S. Arridge, "Simultaneous reconstruction of absorption and scattering images by multichannel measurement of purely temporal data," Opt. Lett. 24, 534–536 (1999).

8. H. Graber, J. Chang, J. Lubowsky, R. Aronson, and R. Barbour, "Near infrared absorption imaging of dense scattering media by steady-state diffusion tomography," Proc. SPIE 1888, 372–386 (1993).

9. D. Boas, "A fundamental limitation of linearized algorithms for diffuse optical tomography," Opt. Express 1, 404–413 (1997).

10. H. Jiang, "Three-dimensional optical image reconstruction: finite element approach," Proc. of Advances in Optical Imaging and Photon Migration, Optical Society of America, 156–158 (1998).

11. M. Schweiger and S. Arridge, "Comparison of two- and three-dimensional reconstruction methods in optical tomography," Appl. Opt. 37, 7419–7428 (1998).

12. M. Eppstein, D. Dougherty, D. Hawrysz, and E. Sevick-Muraca, "Three-dimensional optical tomography," Proc. SPIE 3597, 97–105 (1999).

13. S. Arridge, "A method for three-dimensional time-resolved optical tomography," Int. J. Imaging Syst. Technol. 11, 2–11 (2000).

14. H. Jiang, K. Paulsen, U. Osterberg, and M. Patterson, "Improved continuous light diffusion imaging in single- and multi-target tissue-like phantoms," Phys. Med. Biol. 43, 675–693 (1998).

15. N. Iftimia and H. Jiang, Appl. Opt. (in press).
