Abstract

Near infrared optical tomography (NIROT) is an emerging modality that enables imaging the oxygenation of tissue, a biomarker of tremendous clinical relevance. Measuring in reflectance is usually required when NIROT is applied in clinical scenarios. Single-photon avalanche diode (SPAD) array technology provides a compact solution for time domain (TD) NIROT to acquire rich temporal and spatial information. This makes it possible to image complex structures in tissue. The main aim of this paper is to validate the wavelength normalization method for our new TD NIROT experimentally by exposing it to a particularly difficult challenge: the recovery of two inclusions at different depths. The proposed reconstruction algorithm aims to tackle systematic errors and other artifacts with a known wavelength-dependent relation. We validated the device and reconstruction method experimentally on a silicone phantom with two inclusions, one at a depth of 10 mm and the other at 15 mm. Despite this tough challenge for reflectance NIROT, the system localized both inclusions accurately.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Imaging the body with harmless near infrared optical tomography (NIROT, also called diffuse optical tomography) is appealing to medical professionals and patients. Due to its intrinsic sensitivity to tissue oxygen saturation, it has potential applications for the diagnosis of ischemic strokes, hypoxic tumors and the early detection of neonatal brain injury [1–3]. A typical NIROT device consists of sources emitting light and detectors capturing the backscattered photons at some distances. Temporal and spatial information of the light is later utilized for image reconstruction [4]. In general, achieving both high resolution and accuracy in image reconstruction is challenging. Researchers have been investigating methods to improve image quality by increasing the variety of measurements, for example with a denser arrangement of sources and detectors [5]. However, such systems have so far usually been bulky and require many optical fibers, which limits their usability in a clinical environment. One way to enrich the information is by employing the temporal information of light [6]. Such systems are called time domain (TD) NIROT. It has been proven that TD data with an adequate signal-to-noise ratio improves the depth sensitivity and resolution [7,8]. The recently developed TD NIROT Pioneer system utilizes a 32$\times$32 array of single-photon avalanche diodes (SPADs) measuring the time of flight (ToF) over 12.5 ns with a time resolution of 116 ps [9]. It is equipped with 11 sources of picosecond super-continuum laser pulses. Pioneer provides $2.88 \times 10^6$ measurement points for a single wavelength [10]. Such an approach yields an unprecedented amount of data, which leads to more accurate images [11]. Despite the advantages of TD NIROT, the calibration process is non-trivial for a SPAD camera based imaging system. NIROT devices are usually calibrated on a homogeneous phantom [12]. However, measurement artifacts occur in TD data even for a system whose instrumental response function (IRF) is well calibrated on phantoms [13]. For example, when the imaging device is applied to surfaces of different curvature, the varying distances cause different time offsets in the TD data. These time offsets alter the previously calibrated IRFs, and the use of inaccurate IRFs deteriorates the reconstructed image quality. Such artifacts must be eliminated in real-case scenarios, and therefore a robust calibration algorithm has to be developed. Previously, relative images with different source positions were used as the calibrated data for a fiber- and charge-coupled device (CCD) camera based system to eliminate artifacts from imperfect calibration [14]. Nevertheless, for a SPAD-based system this requires accurate knowledge of the relative power of each fiber-optic source and of the relative time offsets caused by, e.g., different fiber lengths.

We have previously proposed an auto-calibration method called wavelength normalization (WN) for region-based image reconstruction [13]. It utilizes ratios of Fourier-transformed TD data at different wavelengths without tedious calibration in advance. This type of calibration has also been studied for the recovery of optical properties in near infrared spectroscopy (NIRS) [15]. This way of calibration naturally compresses the signal from the original data volume of two wavelengths to one. Fortunately, the data volume for solving the inverse problem for both region-based NIROT and NIRS is not demanding, because there are only two unknowns (the absorption $\mu _a$ and the scattering property $\mu _s$) for NIRS and a few values, depending on the number of segments, for region-based NIROT. While reconstruction methods have been established for TD NIRS [15] and structure-guided TD NIROT [13], 3D image reconstruction for NIROT without any prior has not been reported. This is a non-trivial problem because the number of unknowns equals the number of discretization elements, which can be in the thousands, and taking the ratios leads to a more ill-posed problem. In this paper, we designed an image reconstruction method based on WN for the TD NIROT Pioneer system. The aim was to validate the method experimentally by exposing it to a particularly difficult challenge: the recovery of two inclusions at different depths from measured reflectance [16].

2. Materials and methods

2.1 System description

We have recently developed a TD NIROT system named Pioneer; a schematic of the system is displayed in Fig. 1(a). In the Pioneer system, picosecond light pulses of a broad spectral range are generated by a super-continuum laser SuperK Extreme EXR-15 (NKT, Denmark) at a repetition rate of 80 MHz. Wavelengths in the near infrared range are selected with an acousto-optic tunable filter (AOTF), which enables multispectral measurements. A splitter guides 1% of the output light to a power meter for laser stability control, and the remaining 99% enters an optical switch to enable multiple source locations. To make the system suitable for clinical applications, we designed a tube-like probe with an inner diameter of $\sim$2.5 cm (Fig. 1(b)). The interface of the tube to the tissue is a ring-shaped structure (schematic illustrated in Fig. 1(a)). 11 metallic ferrules holding 11 multi-mode fibers with a 62.5 $\mu$m core (Thorlabs, US) were fixed in the rigid ring, evenly spaced at a distance of 1 cm around the tube. To ensure comfortable contact, the rigid structure was coated with soft black biocompatible silicone. Transparent silicone windows were inserted at each fiber location to direct the light to the tissue. The detailed manufacturing process of the ring is described in [17]. A 32$\times$32 SPAD camera is placed on the other side of the tube. It features a low dark count rate (DCR) of 113 counts/s, a wide spectral range, and a high photon detection probability (PDP) of 12$\%$ at 800 nm. The sensor includes integrated timing electronics in the form of 128 time-to-digital converters (TDCs) with 50 ps resolution [9], such that the arrival time of each photon at any pixel is measured with high time resolution. The maximum photon throughput is 465 million counts per second for the whole sensor and the dead time between events is 1.2 ns [18]. An IR-corrected lens SY110M 1.68 mm F1.8 (Theia Technologies, US) projects the circular field of view (FOV), defined by the inner boundary of the tube, onto the SPAD array. In total, the Pioneer system provides >10k source-detector pairs for each measurement. We developed software based on Qt and C++ to control the hardware and the data acquisition. The data processing was performed in MATLAB. More technical details of the system are described in [10].

 

Fig. 1. (a) Schematics of Pioneer system; and (b) picture of the Pioneer probe placed on a cylindrical silicone phantom.


2.2 Image reconstruction

The TD data, i.e., the enormous set of histograms obtained with the Pioneer device for all source-detector pairs, are utilized for the image reconstruction, i.e., the inverse problem. It relies on an accurate description of light propagation in tissue, which is called the forward problem. The solution to the forward problem provides the surviving photon density and its timing at each detector. The photon density distribution $\phi (r,t)$ at position $r$ and time $t$ is described by the time-dependent diffusion equation (DE) [19]:

$$[-\nabla\cdot \kappa(r)\nabla+\mu_a(r)+\frac{1}{c_0(r)}\frac{\partial}{\partial t}]\phi(r,t)=q(r,t),$$
with a boundary condition of:
$$\phi(r,t) + 2\zeta(c_0)\kappa(r)\frac{\partial\phi(r,t)}{\partial\nu}=0, \quad r \in \partial \Omega,$$
where $q$ is a source term, which in TD NIROT is a pulsed laser source. The model parameters include the absorption coefficient $\mu _a$, the diffusion coefficient $\kappa (r) = [3(\mu _a+\mu _s')]^{-1}$ with the reduced scattering coefficient $\mu _s'$, and the speed of light in the medium $c_0(r)$. The boundary condition contains the refractive index mismatch $\zeta$ and the derivative along the outward boundary normal $\partial \nu$. For the complex geometries of most clinical applications, the DE is solved numerically with the finite element method (FEM), which divides a complex tissue volume, for example brain or breast, into small elements. The FEM-based simulator NIRFAST is therefore utilized to model the light transport [20]. The TD data in the form of histograms were transformed into the frequency domain (FD) with MATLAB$'$s fast Fourier transform (fft) function and sampled at a number of selected frequencies for image reconstruction. Frequencies up to the Nyquist frequency $f_s/2$ are theoretically suitable for reconstruction; however, the very high frequencies tend to be noisy and are therefore excluded. The forward problem in FD is described as follows [20],
$$[-\nabla\cdot\kappa(r)\nabla+\mu_a(r)+\frac{i\omega}{c_0(r)}]\Phi(r,\omega)=Q(r,\omega),$$
where $\Phi$ is the fluence and $Q$ is the source term in FD.
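To illustrate the transformation of a ToF histogram into FD data at selected frequencies, the following MATLAB sketch uses a synthetic histogram with the 48.8 ps bin width of our sensor; the variable names and the nearest-bin frequency selection are illustrative assumptions, not the exact implementation used in this work.

dt = 48.8e-12;                       % time bin width [s]
t  = (0:255)'*dt;                    % 256 bins, ~12.5 ns time window
tofHist = 5 + 1e4*exp(-((t - 3e-9)/0.8e-9).^2);   % synthetic ToF histogram (counts)

spec  = fft(tofHist);                % complex spectrum of the histogram
faxis = (0:numel(spec)-1)'/(numel(spec)*dt);      % frequency axis [Hz]

fsel = [100e6 200e6 500e6];          % selected modulation frequencies
Phi  = zeros(numel(fsel),1);         % complex FD data Phi(omega)
for k = 1:numel(fsel)
    [~, idx] = min(abs(faxis - fsel(k)));         % closest FFT bin to the target frequency
    Phi(k)   = spec(idx);
end

logAmp = log(abs(Phi));              % log amplitude, used later in the Jacobian
phase  = angle(Phi);                 % phase, used later in the Jacobian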

The image reconstruction, or so-called inverse problem, can be written as follows; for simplicity we omit the arguments $r$ and $\omega$ of the forward result $\Phi (r,\omega )$.

$$\mu^*= arg \min_{\mu} \{ \left\lVert{\Phi^S(\mu) - \Phi^M} \right\rVert ^2_2+\Gamma(\mu) \},$$
where $\mu$ is the three-dimensional distribution of optical properties and the goal of the image reconstruction is to find the optimal $\mu ^*$ for which the calculated forward result $\Phi ^S(\mu ^*)$ matches the measured data $\Phi ^M$. $\Gamma (\mu )$ is the regularization term; we use Tikhonov regularization in this work. The optimization algorithm iteratively updates $\mu$ to minimize the difference between $\Phi ^S(\mu )$ and $\Phi ^M$ and terminates once an update of $\mu$ yields a larger deviation than at the previous iteration. To resolve small details of tissue, a high signal-to-noise ratio (SNR) of the measured TD data is necessary. The image quality is impaired by systematic errors, e.g. the instrumental response function, inaccurate calibration of source and detector positions, and superficial obstructions such as skin features (moles, birthmarks, hair). The contaminated signal derived from the true signal $f(t)$ can be expressed as $\tilde {f}(t) = \alpha f(t) * g(t)$, where $\alpha$ is a damping factor and $g(t)$ is the instrumental response function. It is non-trivial to calibrate the system such that the true signal is extracted from the contaminated measured data encountered in clinical scenarios. We have previously studied an auto-calibration method called wavelength normalization (WN) for NIROT reconstruction with a region-based prior [13]. Here, we apply the method to a complete 3D reconstruction problem without the need for calibration in advance and without any shape prior as in [13] or depth prior as in [21]. We use wavelength-normalized (WN) data, i.e., the ratio of forward results at different wavelengths $R_i=\frac {\Phi ^{\lambda _i}}{\Phi ^{\lambda _0}}$, for the inverse problem, where $\Phi ^{\lambda _i}$ is the reflectance for all source-detector pairs at wavelength $\lambda _i$, $i=1,2,\ldots ,n$, in the multi-spectral measurement. When we treat the TD data in the Fourier domain,
$$\tilde{F}(\omega) = \mathcal{F}{ [\alpha f(t) * g(t)]} = \alpha F(\omega) G(\omega),$$
the convolution becomes simple multiplication. The WN data is therefore as follows,
$$\tilde{R_i}(\omega) = \frac{ \tilde{F}(\omega)^{\lambda_i}}{ \tilde{F}(\omega)^{\lambda_0}} = \frac{ \alpha F(\omega)^{\lambda_i} G(\omega)^{\lambda_i}}{ \alpha F(\omega)^{\lambda_0} G(\omega)^{\lambda_0}} = \frac{ F(\omega)^{\lambda_i}}{ F(\omega)^{\lambda_0}} = {R_i}(\omega).$$
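The cancellation above can be verified numerically. The MATLAB sketch below uses toy spectra (all values hypothetical): it builds contaminated FD data $\tilde{F}=\alpha F G$ at two wavelengths with a shared, wavelength-invariant IRF $G$ and damping $\alpha$, and confirms that their ratio equals the ratio of the true signals.

omega = 2*pi*[100e6; 200e6; 500e6];          % selected angular frequencies [rad/s]
F0    = (1.0 + 0.3i)*exp(-1i*omega*1.0e-9);  % toy "true" FD signal at lambda_0
F1    = (0.8 + 0.5i)*exp(-1i*omega*1.2e-9);  % toy "true" FD signal at lambda_1
G     = 0.6*exp(-1i*omega*0.4e-9);           % wavelength-invariant IRF (toy)
alpha = 0.05;                                % wavelength-invariant damping factor

Ftil0 = alpha*F0.*G;                         % contaminated measurement at lambda_0
Ftil1 = alpha*F1.*G;                         % contaminated measurement at lambda_1

R1_meas = Ftil1./Ftil0;                      % WN data computed from contaminated signals
R1_true = F1./F0;                            % ratio of the true signals

max(abs(R1_meas - R1_true))                  % ~1e-16: IRF and damping cancel out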
For the influence of the IRF $g(t)$, or $G(\omega )$ in the frequency domain, to be eliminated, the time response of the system must be wavelength invariant. Our Pioneer system features a wavelength-invariant IRF, as illustrated in Fig. 2, where the laser-SPAD responses are almost identical at different wavelengths. The IRFs are therefore cancelled out simply by taking the ratio. The damping factor $\alpha$ is usually wavelength invariant and is eliminated simultaneously. Consequently, we obtain the true signal $R_i$ from $\tilde {R_i}$ automatically. Thereafter, we reformulate the inverse problem as an optimization that finds the optimal distribution of optical properties for both wavelengths, $\mu ^{\lambda _i,\lambda _0 *}$, by minimizing the difference between the measured ratio data $R_i^M$ and the ratio of the forward results $R_i^S(\mu ^{\lambda _i,\lambda _0})$ computed with the estimated variables $\mu ^{\lambda _i,\lambda _0}$, i.e., the updated optical properties at both wavelengths $\lambda _i$ and $\lambda _0$.
$$\mu^{\lambda_i,\lambda_0 *}= arg \min_{\mu^{\lambda_i,\lambda_0}} \{ \left\lVert{R_i^S(\mu^{\lambda_i,\lambda_0}) - R_i^M} \right\rVert ^2_2+\Gamma(\mu^{\lambda_i,\lambda_0}) \}.$$
For simplicity, we will assume $i=1$, but the solution can easily be extended to multi-wavelength measurements. The target function for the FEM-based image reconstruction algorithm can therefore be formulated as follows.
$$\chi^2=\sum_{j=1}^{N_m}{((R_1^S)_j-(R_1^M)_j)^2}+\beta\sum_{k=1}^{N_n}{((\mu^{\lambda_0,\lambda_1})_k-(\mu^{\lambda_0,\lambda_1})_0)^2}.$$
Tikhonov regularization is applied with parameter $\beta$. The goal of the inverse problem is to find the optimal set of optical properties $\mu$ at all $N_n$ nodes for both wavelengths $\lambda _1$ and $\lambda _0$ such that the simulated ratio data $R_1^S$ match the measured ratio $R_1^M$ for the $N_m$ measurements, i.e. source-detector pairs. $(\mu ^{\lambda _0,\lambda _1})_0$ is the initial guess of the optical properties for both wavelengths. To solve the above problem, a Newton-like iterative process updates the optical properties by increments determined by
$$\delta \mu^{\lambda_0,\lambda_1}=(J_R^T J_R+{\beta} I)^{-1}J_R^T \delta R_1,$$
where $I$ is the identity matrix and $J_{R}$ is the Jacobian of the ratio data with respect to the optical properties. The Jacobian $J_{R}$ consists of derivatives of the logarithmic amplitude $\ln \left |R_1 \right |$ and the phase $\theta (R_1)$ with respect to the optical properties $\mu ^{\lambda _0}$, $\mu ^{\lambda _1}$ at the two wavelengths; $\ln$ denotes the natural (base $e$) logarithm. In this paper, the image reconstruction was performed under the assumption of a uniform scattering coefficient throughout the volume, and $\mu$ therefore reduces to $\mu _a$. We omit the explicit frequency dependence in the notation for convenience. These quantities can be calculated from the Jacobians for both wavelengths. We write the derivatives that form the Jacobian matrix with respect to $\mu _a$ for the ratio data of log amplitude ($\ln I^{\lambda _0}$, $\ln I^{\lambda _1}$) and phase ($\theta ^{\lambda _0}$, $\theta ^{\lambda _1}$) as follows,
$$\frac{\partial (\ln \left|R_1 \right|)}{\partial \mu_a^{\lambda_0}}=\frac{\partial (\ln I^{\lambda_1}- \ln I^{\lambda_0}) }{\partial \mu_a^{\lambda_0}}=-\frac{\partial (\ln I^{\lambda_0}) }{\partial \mu_a^{\lambda_0}} ,$$
$$\frac{\partial (\ln \left|R_1 \right|)}{\partial \mu_a^{\lambda_1}}=\frac{\partial (\ln I^{\lambda_1}- \ln I^{\lambda_0}) }{\partial \mu_a^{\lambda_1}}=\frac{\partial (\ln I^{\lambda_1}) }{\partial \mu_a^{\lambda_1}},$$
$$\frac{\partial \theta( R_1 )}{\partial \mu_a^{\lambda_0}}=\frac{\partial (\theta^{\lambda_1}-\theta^{\lambda_0}) }{\partial \mu_a^{\lambda_0}}=-\frac{\partial (\theta^{\lambda_0}) }{\partial \mu_a^{\lambda_0}},$$
$$\frac{\partial \theta ( R_1)}{\partial \mu_a^{\lambda_1}}=\frac{\partial (\theta^{\lambda_1}-\theta^{\lambda_0}) }{\partial \mu_a^{\lambda_1}}=\frac{\partial (\theta^{\lambda_1}) }{\partial \mu_a^{\lambda_1}}.$$
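As a sketch of how the ratio-data Jacobian and the update step fit together, assume the per-wavelength Jacobians of log amplitude and phase with respect to $\mu_a$ are available from the forward solver as J0 and J1; the stacking below follows the sign pattern of the four derivatives above. The sizes, the misfit layout and the variable names are assumptions for illustration.

Nm = 50;  Nn = 942;                     % toy problem sizes (942 nodes as in Section 3.1)
J0 = randn(2*Nm, Nn);                   % Jacobian of [ln I; theta] w.r.t. mu_a at lambda_0 (toy)
J1 = randn(2*Nm, Nn);                   % Jacobian of [ln I; theta] w.r.t. mu_a at lambda_1 (toy)
dR1  = randn(2*Nm, 1);                  % misfit [ln|R1^M|-ln|R1^S|; theta(R1^M)-theta(R1^S)] (toy)
beta = 10;                              % Tikhonov regularization parameter

% Ratio data depend on mu_a at both wavelengths: columns for lambda_0 carry a
% minus sign and columns for lambda_1 a plus sign, as in the derivatives above.
JR = [-J0, J1];                         % [2*Nm x 2*Nn] Jacobian of the ratio data

% Newton-like update for the stacked unknowns [mu_a^{lambda_0}; mu_a^{lambda_1}].
dmu  = (JR.'*JR + beta*eye(2*Nn)) \ (JR.'*dR1);
dmu0 = dmu(1:Nn);                       % update of mu_a at lambda_0
dmu1 = dmu(Nn+1:end);                   % update of mu_a at lambda_1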

The previously developed reconstruction algorithm with region priors includes only $\sim$10 unknowns [13]. The proposed new method for a 3D volume without any shape prior needs to solve for thousands of unknowns, which is considerably more ill-posed. In addition, the TD data volume is compressed by taking the ratios of the measured data at two wavelengths to obtain clean signals, which makes the inverse problem even more difficult to solve. However, the high number of channels and the excellent sensitivity of the imager in the Pioneer system provide sufficiently rich information, and therefore accurate 3D reconstruction with the auto-calibrated method can be achieved.

 

Fig. 2. Laser-SPAD response functions at four sample wavelengths measured in reflection mode, where the laser light was pointed at a white paper and the reflected light was captured by the SPAD camera. Note that the wavelength-dependent dispersion in time and the additional time delays caused by different fiber lengths were eliminated by numerically applying pre-measured fixed time shifts to the TD data. The functions are so similar that they are easily calibrated out by taking the wavelength ratios.


2.3 Phantom experiment

To validate the NIROT system, reflected light was measured on a phantom with two small inclusions at two different depths (Fig. 3(a)). This poses a particular challenge, because the sensitivity of NIROT in reflection mode decreases with increasing depth [16].

 

Fig. 3. (a) The cylindrical silicone phantom used in the experiment has a diameter of 114 mm and a height of 55 mm; both small spheres have a radius of 5 mm. (b) The two inclusions were embedded at depths of 10 mm and 15 mm, respectively, and the lateral distance between them was 10 mm. (c) Mesh created for the image reconstruction in the simulation. Note that it was made smaller than the actual volume in (a): a cylinder of 60 mm diameter and 35 mm height. 221 detectors were selected within a circle of $\sim$17.5 mm diameter and all 11 sources were utilized.


As illustrated in Fig. 3(b), the silicone phantom contains two spheres of radius 5 mm embedded at depths of 10 mm and 15 mm, respectively. To produce the phantom, we first 3D-printed molds for the two spherical inclusions and the cylindrical bulk. We then prepared two silicone mixtures with different absorption coefficients and similar scattering properties. The absorption of the material was tuned by adding black ink and RAL 6004 Blue Green (Wacker, Germany) to the silicone mixture, and the scattering was tuned by adding RAL 9010 White (Wacker, Germany). The silicone mixture of higher absorption (mixture S2) was cast into the spherical mold to produce the two inclusions shown on the right of Fig. 3(a). These two spheres were then fixed to the cylindrical mold with fishing lines at the desired locations. Finally, the large cylindrical phantom (Fig. 3(a), left) was produced by casting the less absorbing silicone (mixture S1) into the cylindrical mold with the two embedded inclusions. To verify the target optical properties, a commercial device, Imagent (ISS Inc., Champaign, IL, USA), was utilized. Since this device measures the optical properties of a large homogeneous phantom, we cast two homogeneous phantoms from mixtures S1 and S2 in the shape of the cylindrical phantom (Fig. 3(a), left). Two wavelengths, 689 nm and 725 nm, were chosen for the measurement to validate the WN; the optical properties measured with the Imagent are listed in Table 1.

In the phantom experiment, the detection probe was placed on the surface of the phantom as depicted in Fig. 1(b) and data acquisition was performed at 689 nm and 725 nm. The SPAD camera collected $2\times 10^8$ photons for each source, which took $\sim$1 minute. During the process, the power meter showed a stable laser output: the variation of the average intensity did not exceed 2%. The variation of the time delay was below $\sim$100 ps, i.e., lower than the temporal resolution of the sensor, and did not affect the experimental results. In total, 2431 histograms with a good signal-to-noise ratio from 221 detectors within the circular FOV and 11 sources were selected. Dark photon counts were removed from each measured histogram by subtracting its median value. The histograms, which still contained errors from surface features such as dust and were distorted by the IRF, were then transformed into the frequency domain for the image reconstruction. In this work, we used the frequencies 100 MHz, 200 MHz and 500 MHz. The initial regularization parameter was 10.
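A minimal MATLAB sketch of this preprocessing, applied to a synthetic histogram: the per-histogram median is subtracted to remove dark counts, and a simple total-count criterion decides whether the histogram enters the reconstruction. The threshold value is an assumption; the paper does not specify the exact selection criterion.

dt = 48.8e-12;                                 % time bin width [s]
t  = (0:255)'*dt;                              % ~12.5 ns time window
H  = 50 + 2e4*exp(-((t - 4e-9)/0.8e-9).^2);    % synthetic ToF histogram with a dark level of 50

Hc = H - median(H);                            % dark-count removal by median subtraction

minCounts     = 1e4;                           % assumed SNR criterion (total photon counts)
keepHistogram = sum(max(Hc, 0)) > minCounts;   % true -> histogram is used for reconstruction

% The retained histogram Hc is then Fourier transformed and sampled at
% 100, 200 and 500 MHz, as in the sketch of Section 2.2.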


Table 1. Optical properties of the silicone phantom

We also simulated the phantom experiment in order to validate the feasibility of the auto-calibrated image reconstruction with the same provided data volume. The WN-based reconstruction aims to recover the 3D distribution of optical properties at two wavelengths while employing a data volume equivalent to a single-wavelength reconstruction. This makes the reconstruction harder to achieve, because the doubled number of unknowns increases the dimension of the optimization problem. To help the optimization algorithm approach the correct solution more easily, the absolute forward results at both wavelengths $\Phi _i$, with lower weights $\eta _i$ ($0< \eta _i < 1$, $i = 0, 1$), can be combined with the ratio data to enrich the target data as $F_1 = [R_1 \ \eta _1\Phi _1 \ \eta _0\Phi _0]$ for the image reconstruction.

Therefore, the final merit function to minimize is

$$\mu^{\lambda_i,\lambda_0*}= arg \min_{\mu^{\lambda_i,\lambda_0}} \{ \left\lVert{F_1^S(\mu^{\lambda_i,\lambda_0}) - F_1^M} \right\rVert ^2_2+\Gamma(\mu) \}.$$
The values of the weights depend on the signal quality of $\Phi _i$. The higher $\eta _i$ is, the more the inverse problem leans toward the original one described by Eq. (4). In this work, $\eta _i = 0.1$ was used.
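A sketch of how the combined target vector could be assembled from the WN ratio and the weighted absolute FD data, with $\eta_i=0.1$ as used in this work; the array shapes and stand-in values are assumptions.

Nm   = 2431;                               % number of selected source-detector pairs
Phi0 = randn(Nm,1) + 1i*randn(Nm,1);       % toy stand-in for the FD data at lambda_0
Phi1 = randn(Nm,1) + 1i*randn(Nm,1);       % toy stand-in for the FD data at lambda_1

eta0 = 0.1;  eta1 = 0.1;                   % weights used in this work

R1 = Phi1./Phi0;                           % wavelength-normalized ratio data
F1 = [R1; eta1*Phi1; eta0*Phi0];           % combined target data for the final merit function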

2.4 Image evaluation

The quality of the reconstructed images was assessed with two widely applied metrics: the mean squared error (MSE) and the contrast-to-noise ratio (CNR). The metrics were applied to the reconstructed 3D map $\hat {\mu }_a$ and the ground truth ${\mu _a}$, and are defined as follows:

$${MSE}=\frac{1}{n}\sum_{i=1}^n({\mu_a}_i -\hat{\mu}_{ai} )^2$$
$${CNR} =10\times\log_{10}\left\{\frac{\sum_{i=1}^n(\hat{\mu}_{ai})^2}{\sum_{i=1}^n({\mu_a}_i-\hat{\mu}_{ai})^2}\right\}$$

MSE measures the difference between the ground-truth values and the values estimated by the reconstruction. CNR indicates the ability to distinguish different tissues or inclusions from the background. Therefore, the image quality is in general better with a lower MSE and a higher CNR.
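Both metrics follow directly from the reconstructed and ground-truth $\mu_a$ maps; a minimal MATLAB sketch with toy values and assumed variable names:

n        = 942;                                        % number of reconstruction nodes
mua_true = 0.005*ones(n,1);  mua_true(1:50) = 0.02;    % toy ground-truth mu_a [1/mm]
mua_hat  = mua_true + 0.001*randn(n,1);                % toy reconstructed mu_a [1/mm]

MSE = mean((mua_true - mua_hat).^2);                              % mean squared error
CNR = 10*log10( sum(mua_hat.^2) / sum((mua_true - mua_hat).^2) ); % contrast-to-noise ratio [dB]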

3. Results

3.1 Numerical validation

The phantom measurement was simulated in NIRFAST to validate the proposed imaging method. A cylindrical mesh of Ø60 mm $\times$ 35 mm was generated to represent the phantom, and two spherical regions at two distinct depths of 10 mm and 15 mm with a lateral separation of 10 mm were defined. Both spheres had the same radius of 5 mm. Note that the simulated cylinder was smaller than the actual silicone phantom in the experiment, to facilitate the simulation. The optical properties of the bulk and of the two spheres were assigned according to Table 1. As in the experiment, 11 light sources and 221 selected detectors within a circle of $\sim$17.5 mm diameter were placed on the top surface of the mesh (Fig. 3(c)). A fine mesh of 100210 nodes was used to simulate the forward process. To decrease the ill-posedness of the inverse problem, a second, coarser mesh of 942 nodes was generated and employed for the reconstruction. The coarser mesh is also computationally easier to handle in terms of memory usage and speed.
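One common way to couple the two meshes is to interpolate the coarse-mesh optical properties onto the fine forward mesh at each iteration. The MATLAB sketch below uses nearest-neighbour interpolation over toy node coordinates; this particular mapping is an assumption for illustration and not necessarily the basis mapping used inside NIRFAST.

coarseNodes = rand(942, 3)    .* [60 60 35];   % toy coordinates of the 942 coarse-mesh nodes [mm]
fineNodes   = rand(100210, 3) .* [60 60 35];   % toy coordinates of the 100210 fine-mesh nodes [mm]
mua_coarse  = 0.005 + 0.001*rand(942, 1);      % mu_a on the coarse reconstruction mesh [1/mm]

Fmap = scatteredInterpolant(coarseNodes(:,1), coarseNodes(:,2), coarseNodes(:,3), ...
                            mua_coarse, 'nearest', 'nearest');
mua_fine = Fmap(fineNodes(:,1), fineNodes(:,2), fineNodes(:,3));   % mu_a on the fine forward mesh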

The image reconstruction was started from an initial guess of a homogeneous volume with the same $\mu _s'$ and $\mu _a$ as the bulk. The whole process underwent 20 iterations. For later comparison with the experimental results, we denote the reconstruction results from simulated data as "Sim". Cross-sectional distributions of the reconstructed values are plotted in Fig. 4(b)(f)(j)(n) for the two wavelengths. The shapes and positions of the inclusions were correctly recovered compared to the ground truth (GT) (Fig. 4(a)(e)(i)(m)). The successful 3D reconstruction from the simulated data demonstrated the theoretical validity of the proposed imaging method.

 

Fig. 4. Reconstruction of the phantom with two inclusions at 689 nm and 725 nm. Cross-sections of (a)(e)(i)(m) the ground truth (GT), (b)(f)(j)(n) the reconstructed results for the simulation (Sim), (c)(g)(k)(o) the non-calibrated reconstruction (Meas) and (d)(h)(l)(p) the auto-calibrated reconstruction (MeasC) for the experimental TD data at the two depths of 10 mm and 15 mm. Note that the ground-truth images were generated with NIRFAST.


3.2 Experimental results

In the experiment, ToFs were acquired for 11 sources at two wavelengths. We show an example of ToFs from 3 detectors (pixels) located at different distances from the active light source (Fig. 5). Two types of reconstruction were performed: non-calibrated and auto-calibrated. For the non-calibrated case, the TD data measured at 689 nm and 725 nm were employed directly, without decoupling the IRFs or removing artifacts such as abnormally weak signals in pixels caused by dirt or dust on the phantom surface. The non-calibrated histograms were transformed into the frequency domain and used as the target data. Images were reconstructed separately for the measurements at 689 nm and 725 nm. The initial guess was the homogeneous case, where the same $\mu _a^{B}$ and $\mu _s'^{B}$ were assigned to all elements of the mesh. The same number of iterations (20) and regularization parameter (10) were used for both the non-calibrated and the auto-calibrated data. For the auto-calibrated case, the ratio of the Fourier data at the two wavelengths, combined with the weighted forward results, was used as the target data. The inverse problem solver was the same as for the non-calibrated and the simulated cases. For simplicity, we denote results of the non-calibrated case as "Meas" and of the auto-calibrated case as "MeasC".

 

Fig. 5. ToF histograms from three pixels (locations shown in the inset) for the phantom measurement at 689 nm. The time bin size is 48.8 ps.


We plotted the cross-sections of the reconstructed $\mu _a$ at depths of 10 mm and 15 mm for both wavelengths in Fig. 4. The inclusions were clearly visible in both Sim (Fig. 4(b)(f)(j)(n)) and MeasC (Fig. 4(d)(h)(l)(p)) at both depths. However, the non-calibrated reconstruction failed to recover an accurate shape of the inclusion at 10 mm depth (Fig. 4(c)(k)) and completely failed to recover the inclusion at the depth of 15 mm (Fig. 4(g)(o)). The one-dimensional distributions of $\mu _a$ are depicted in Fig. 6. The values estimated from the auto-calibrated ratio data were closer to the ground truth than the non-calibrated ones at both depths and both wavelengths. The MSE and CNR of the reconstructed results are displayed in Fig. 7. MeasC outperformed Meas in both metrics and is close to Sim.

 

Fig. 6. One-dimensional distributions of the reconstructed $\mu _a$ (a) at a depth of 10 mm and (b) 15 mm for 689 nm, and (c) 10 mm and (d) 15 mm for 725 nm, for the ground truth (GT), simulation (Sim), non-calibrated measurement (Meas) and auto-calibrated measurement (MeasC).


 

Fig. 7. Mean squared error (MSE) and contrast-to-noise ratio (CNR) for the reconstructed results at 689 nm and 725 nm for the simulation (Sim), non-calibrated measurement (Meas) and auto-calibrated measurement (MeasC).


4. Discussion

This study presented a WN-based image reconstruction method for the novel SPAD-camera-based TD NIROT system Pioneer. We validated the system in reflection-mode measurements on a silicone phantom with two small absorbers embedded at depths of 10 mm and 15 mm. The sensitivity, i.e., the change in the measured signals caused by a change in the optical properties, is lower in deeper tissue [16]. This makes the optimization algorithm in the image reconstruction prone to approaching a solution that recovers only the shallower inclusion and omits the deeper one. However, in our results both inclusions were evidently visible and their positions were accurately recovered. The spherical shape of the inclusion at 10 mm was perfectly reconstructed, whereas the deeper one suffered some deformation, again caused by the sensitivity difference. One can further improve the depth sensitivity by applying different optimization strategies. For example, the regularization method in this work was the common Tikhonov regularization. The use of WN for image reconstruction with alternative regularization methods, e.g. spatially variant regularization [22] or region-based methods [23], may yield higher image accuracy.

Benefiting from WN, Pioneer can be used without tedious prior calibration, which makes it more robust in a clinical environment. WN eliminates the IRFs automatically by taking the ratios of the Fourier-transformed TD data obtained at multiple wavelengths. As our results show, the 3D image reconstruction based on WN clearly outperforms the non-calibrated reconstruction based on a single wavelength. Although in reality the IRF is never perfectly spectrally or temporally invariant, the experimental results showed that the image reconstruction was much improved by WN without compensating for this variation; it might be improved even further if this effect were compensated. Even though the inverse problem is more ill-posed due to the reduction of the data volume by taking the ratio, the rich information provided by TD NIROT enables an accurate solution of the inverse problem.

5. Conclusion

We propose a robust reconstruction method for the novel SPAD-camera-based TD NIROT system Pioneer. The large volume of data obtained with the SPAD camera enables the recovery of depth information in reflectance measurements, and reflectance is usually more accessible in clinical applications. The image reconstruction based on the wavelength normalization approach avoids a tedious calibration in advance. It handles different sources of artifacts such as instrumental response functions and, we speculate, other linear artifacts, e.g. surface features. We validated the method on a phantom with two inclusions embedded at different depths and obtained images of high quality. This brings the Pioneer system closer to real clinical applications.

Funding

Krebsforschung Schweiz (KFS-3732-08-2015); Schweizerischer Nationalfonds zur Förderung der Wissenschaftlichen Forschung (159490); National Competence Center in Biomedical imaging (NCCBI); Consejo Nacional de Ciencia y Tecnología CONACyT (CVU-627802).

Disclosures

The authors declare the following conflicts of interest: MW is president of the board and co-founder of OxyPrem AG.

References

1. M. A. Franceschini, V. Toronov, M. E. Filiaci, E. Gratton, and S. Fantini, “On-line optical imaging of the human brain with 160-ms temporal resolution,” Opt. Express 6(3), 49–57 (2000).

2. V. Ntziachristos and B. Chance, “Probing physiology and molecular function using optical imaging: applications to breast cancer,” Breast Cancer Res. 3(1), 41–46 (2000).

3. D. A. Boas, D. H. Brooks, E. L. Miller, C. A. DiMarzio, M. Kilmer, R. J. Gaudette, and Z. Quan, “Imaging the body with diffuse optical tomography,” IEEE Signal Process. Mag. 18(6), 57–75 (2001).

4. S. R. Arridge, “Optical tomography in medical imaging,” Inverse Probl. 15(2), R41–R93 (1999).

5. A. T. Eggebrecht, S. L. Ferradal, A. Robichaux-Viehoever, M. S. Hassanpour, H. Dehghani, A. Z. Snyder, T. Hershey, and J. P. Culver, “Mapping distributed brain function and networks with diffuse optical tomography,” Nat. Photonics 8(6), 448–454 (2014).

6. E. Ferocino, A. Pifferi, S. Arridge, F. Martelli, P. Taroni, and A. Farina, “Multi simulation platform for time domain diffuse optical tomography: An application to a compact hand-held reflectance probe,” Appl. Sci. 9(14), 2849 (2019).

7. A. D. Mora, D. Contini, S. Arridge, F. Martelli, A. Tosi, G. Boso, A. Farina, T. Durduran, E. Martinenghi, A. Torricelli, and A. Pifferi, “Towards next-generation time-domain diffuse optics for extreme depth penetration and sensitivity,” Biomed. Opt. Express 6(5), 1749–1760 (2015).

8. E. M. Hillman, J. C. Hebden, M. Schweiger, H. Dehghani, F. E. Schmidt, D. T. Delpy, and S. R. Arridge, “Time resolved optical tomography of the human forearm,” Phys. Med. Biol. 46(4), 1117–1130 (2001).

9. S. Lindner, C. Zhang, I. M. Antolovic, A. Kalyanov, J. Jiang, L. Ahnen, A. di Costanzo, J. M. Pavia, S. S. Majos, E. Charbon, and M. Wolf, “A novel 32 × 32, 224 mevents/s time resolved spad image sensor for near-infrared optical tomography,” in Biophotonics Congress: Biomedical Optics Congress 2018 (Microscopy/Translational/Brain/OTS), (Optical Society of America, 2018), p. JTh5A.6.

10. A. Kalyanov, J. Jiang, S. Lindner, L. Ahnen, A. Di Costanzo, J. Mata Pavia, S. Sanchez Majos, and M. Wolf, “Time domain near-infrared optical tomography with time-of-flight spad camera: The new generation,” Biophotonics Congress: Biomedical Optics Congress 2018 (2018).

11. J. Jiang, A. di Costanzo, S. Lindner, M. Wolf, and A. Kalyanov, “Tracking objects in a diffusive medium with time domain near infrared optical tomography,” in Biophotonics Congress: Biomedical Optics 2020 (Translational, Microscopy, OCT, OTS, BRAIN), (Optical Society of America, 2020), p. JTu3A.18.

12. K. M. S. Uddin and Q. Zhu, “Reducing image artifact in diffuse optical tomography by iterative perturbation correction based on multiwavelength measurements,” J. Biomed. Opt. 24(5), 056005 (2019).

13. J. Jiang, M. Wolf, and S. S. Majos, “Fast reconstruction of optical properties for complex segmentations in near infrared imaging,” J. Mod. Opt. 64(7), 732–742 (2017).

14. M. A. Naser, M. S. Patterson, and J. W. Wong, “Self-calibrated algorithms for diffuse optical tomography and bioluminescence tomography using relative transmission images,” Biomed. Opt. Express 3(11), 2794–2808 (2012).

15. S. Wojtkiewicz, A. Gerega, M. Zanoletti, A. Sudakou, D. Contini, A. Liebert, T. Durduran, and H. Dehghani, “Self-calibrating time-resolved near infrared spectroscopy,” Biomed. Opt. Express 10(5), 2657–2669 (2019).

16. D. A. Boas and A. M. Dale, “Simulation study of magnetic resonance imaging-guided cortically constrained diffuse optical tomography of human brain function,” Appl. Opt. 44(10), 1957–1968 (2005).

17. L. Ahnen, H. Stachel, S. Kleiser, C. Hagmann, J. Jiang, A. Kalyanov, S. Lindner, M. Wolf, and S. Sanchez, Development and Validation of a Sensor Prototype for Near-Infrared Imaging of the Newborn Brain (Springer International Publishing, 2017), pp. 163–168.

18. C. Zhang, S. Lindner, I. M. Antolovic, M. Wolf, and E. Charbon, “A cmos spad imager with collision detection and 128 dynamically reallocating tdcs for single-photon counting and 3d time-of-flight imaging,” Sensors 18(11), 4016 (2018).

19. S. R. Arridge and M. Schweiger, A General Framework for Iterative Reconstruction Algorithms in Optical Tomography, Using a Finite Element Method (Springer New York, 1999), pp. 45–70.

20. H. Dehghani, M. E. Eames, P. K. Yalavarthy, S. C. Davis, S. Srinivasan, C. M. Carpenter, B. W. Pogue, and K. D. Paulsen, “Near infrared optical tomography using nirfast: Algorithm for numerical model and image reconstruction,” Commun. Numer. Meth. Engng. 25(6), 711–732 (2009).

21. Q. Zhao, L. Ji, and T. Jiang, “Improving depth resolution of diffuse optical tomography with a layer-based sigmoid adjustment method,” Opt. Express 15(7), 4018–4029 (2007).

22. B. W. Pogue, T. O. McBride, J. Prewitt, U. L. Osterberg, and K. D. Paulsen, “Spatially variant regularization improves diffuse optical tomography,” Appl. Opt. 38(13), 2950–2961 (1999).

23. J. Jiang, L. Ahnen, A. Kalyanov, S. Lindner, M. Wolf, and S. S. Majos, “A new method based on graphics processing units for fast near-infrared optical tomography,” Adv. Exp. Med. Biol. 977, 191–197 (2017).


Liebert, A.

Lindner, S.

C. Zhang, S. Lindner, I. M. Antolovic, M. Wolf, and E. Charbon, “A cmos spad imager with collision detection and 128 dynamically reallocating tdcs for single-photon counting and 3d time-of-flight imaging,” Sensors 18(11), 4016 (2018).
[Crossref]

J. Jiang, L. Ahnen, A. Kalyanov, S. Lindner, M. Wolf, and S. S. Majos, “A new method based on graphics processing units for fast near-infrared optical tomography,” Adv. Exp. Med. Biol. 977, 191–197 (2017).

L. Ahnen, H. Stachel, S. Kleiser, C. Hagmann, J. Jiang, A. Kalyanov, S. Lindner, M. Wolf, and S. Sanchez, Development and Validation of a Sensor Prototype for Near-Infrared Imaging of the Newborn Brain (Springer International Publishing, 2017), pp. 163–168.

A. Kalyanov, J. Jiang, S. Lindner, L. Ahnen, A. Di Costanzo, J. Mata Pavia, S. Sanchez Majos, and M. Wolf, “Time domain near-infrared optical tomography with time-of-flight spad camera: The new generation,” Biophotonics Congress: Biomedical Optics Congress 2018 (2018).

J. Jiang, A. di Costanzo, S. Lindner, M. Wolf, and A. Kalyanov, “Tracking objects in a diffusive medium with time domain near infrared optical tomography,” in Biophotonics Congress: Biomedical Optics 2020 (Translational, Microscopy, OCT, OTS, BRAIN), (Optical Society of America, 2020), p. JTu3A.18.

S. Lindner, C. Zhang, I. M. Antolovic, A. Kalyanov, J. Jiang, L. Ahnen, A. di Costanzo, J. M. Pavia, S. S. Majos, E. Charbon, and M. Wolf, “A novel 32 × 32, 224 mevents/s time resolved spad image sensor for near-infrared optical tomography,” in Biophotonics Congress: Biomedical Optics Congress 2018 (Microscopy/Translational/Brain/OTS), (Optical Society of America, 2018), p. JTh5A.6.

Majos, S. S.

J. Jiang, M. Wolf, and S. S. Majos, “Fast reconstruction of optical properties for complex segmentations in near infrared imaging,” J. Mod. Opt. 64(7), 732–742 (2017).
[Crossref]

J. Jiang, L. Ahnen, A. Kalyanov, S. Lindner, M. Wolf, and S. S. Majos, “A new method based on graphics processing units for fast near-infrared optical tomography,” Adv. Exp. Med. Biol. 977, 191–197 (2017).

S. Lindner, C. Zhang, I. M. Antolovic, A. Kalyanov, J. Jiang, L. Ahnen, A. di Costanzo, J. M. Pavia, S. S. Majos, E. Charbon, and M. Wolf, “A novel 32 × 32, 224 mevents/s time resolved spad image sensor for near-infrared optical tomography,” in Biophotonics Congress: Biomedical Optics Congress 2018 (Microscopy/Translational/Brain/OTS), (Optical Society of America, 2018), p. JTh5A.6.

Martelli, F.

E. Ferocino, A. Pifferi, S. Arridge, F. Martelli, P. Taroni, and A. Farina, “Multi simulation platform for time domain diffuse optical tomography: An application to a compact hand-held reflectance probe,” Appl. Sci. 9(14), 2849 (2019).
[Crossref]

A. D. Mora, D. Contini, S. Arridge, F. Martelli, A. Tosi, G. Boso, A. Farina, T. Durduran, E. Martinenghi, A. Torricelli, and A. Pifferi, “Towards next-generation time-domain diffuse optics for extreme depth penetration and sensitivity,” Biomed. Opt. Express 6(5), 1749–1760 (2015).
[Crossref]

Martinenghi, E.

Mata Pavia, J.

A. Kalyanov, J. Jiang, S. Lindner, L. Ahnen, A. Di Costanzo, J. Mata Pavia, S. Sanchez Majos, and M. Wolf, “Time domain near-infrared optical tomography with time-of-flight spad camera: The new generation,” Biophotonics Congress: Biomedical Optics Congress 2018 (2018).

McBride, T. O.

Miller, E. L.

D. A. Boas, D. H. Brooks, E. L. Miller, C. A. DiMarzio, M. Kilmer, R. J. Gaudette, and Z. Quan, “Imaging the body with diffuse optical tomography,” IEEE Signal Process. Mag. 18(6), 57–75 (2001).
[Crossref]

Mora, A. D.

Naser, M. A.

Ntziachristos, V.

V. Ntziachristos and B. Chance, “Probing physiology and molecular function using optical imaging: applications to breast cancer,” Breast Cancer Res. 3(1), 41–46 (2000).
[Crossref]

Osterberg, U. L.

Patterson, M. S.

Paulsen, K. D.

H. Dehghani, M. E. Eames, P. K. Yalavarthy, S. C. Davis, S. Srinivasan, C. M. Carpenter, B. W. Pogue, and K. D. Paulsen, “Near infrared optical tomography using nirfast: Algorithm for numerical model and image reconstruction,” Commun. Numer. Meth. Engng. 25(6), 711–732 (2009).
[Crossref]

B. W. Pogue, T. O. McBride, J. Prewitt, U. L. Osterberg, and K. D. Paulsen, “Spatially variant regularization improves diffuse optical tomography,” Appl. Opt. 38(13), 2950–2961 (1999).
[Crossref]

Pavia, J. M.

S. Lindner, C. Zhang, I. M. Antolovic, A. Kalyanov, J. Jiang, L. Ahnen, A. di Costanzo, J. M. Pavia, S. S. Majos, E. Charbon, and M. Wolf, “A novel 32 × 32, 224 mevents/s time resolved spad image sensor for near-infrared optical tomography,” in Biophotonics Congress: Biomedical Optics Congress 2018 (Microscopy/Translational/Brain/OTS), (Optical Society of America, 2018), p. JTh5A.6.

Pifferi, A.

E. Ferocino, A. Pifferi, S. Arridge, F. Martelli, P. Taroni, and A. Farina, “Multi simulation platform for time domain diffuse optical tomography: An application to a compact hand-held reflectance probe,” Appl. Sci. 9(14), 2849 (2019).
[Crossref]

A. D. Mora, D. Contini, S. Arridge, F. Martelli, A. Tosi, G. Boso, A. Farina, T. Durduran, E. Martinenghi, A. Torricelli, and A. Pifferi, “Towards next-generation time-domain diffuse optics for extreme depth penetration and sensitivity,” Biomed. Opt. Express 6(5), 1749–1760 (2015).
[Crossref]

Pogue, B. W.

H. Dehghani, M. E. Eames, P. K. Yalavarthy, S. C. Davis, S. Srinivasan, C. M. Carpenter, B. W. Pogue, and K. D. Paulsen, “Near infrared optical tomography using nirfast: Algorithm for numerical model and image reconstruction,” Commun. Numer. Meth. Engng. 25(6), 711–732 (2009).
[Crossref]

B. W. Pogue, T. O. McBride, J. Prewitt, U. L. Osterberg, and K. D. Paulsen, “Spatially variant regularization improves diffuse optical tomography,” Appl. Opt. 38(13), 2950–2961 (1999).
[Crossref]

Prewitt, J.

Quan, Z.

D. A. Boas, D. H. Brooks, E. L. Miller, C. A. DiMarzio, M. Kilmer, R. J. Gaudette, and Z. Quan, “Imaging the body with diffuse optical tomography,” IEEE Signal Process. Mag. 18(6), 57–75 (2001).
[Crossref]

Robichaux-Viehoever, A.

A. T. Eggebrecht, S. L. Ferradal, A. Robichaux-Viehoever, M. S. Hassanpour, H. Dehghani, A. Z. Snyder, T. Hershey, and J. P. Culver, “Mapping distributed brain function and networks with diffuse optical tomography,” Nat. Photonics 8(6), 448–454 (2014).
[Crossref]

Sanchez, S.

L. Ahnen, H. Stachel, S. Kleiser, C. Hagmann, J. Jiang, A. Kalyanov, S. Lindner, M. Wolf, and S. Sanchez, Development and Validation of a Sensor Prototype for Near-Infrared Imaging of the Newborn Brain (Springer International Publishing, 2017), pp. 163–168.

Sanchez Majos, S.

A. Kalyanov, J. Jiang, S. Lindner, L. Ahnen, A. Di Costanzo, J. Mata Pavia, S. Sanchez Majos, and M. Wolf, “Time domain near-infrared optical tomography with time-of-flight spad camera: The new generation,” Biophotonics Congress: Biomedical Optics Congress 2018 (2018).

Schmidt, F. E.

E. M. Hillman, J. C. Hebden, M. Schweiger, H. Dehghani, F. E. Schmidt, D. T. Delpy, and S. R. Arridge, “Time resolved optical tomography of the human forearm,” Phys. Med. Biol. 46(4), 1117–1130 (2001).
[Crossref]

Schweiger, M.

E. M. Hillman, J. C. Hebden, M. Schweiger, H. Dehghani, F. E. Schmidt, D. T. Delpy, and S. R. Arridge, “Time resolved optical tomography of the human forearm,” Phys. Med. Biol. 46(4), 1117–1130 (2001).
[Crossref]

S. R. Arridge and M. Schweiger, A General Framework for Iterative Reconstruction Algorithms in Optical Tomography, Using a Finite Element Method (Springer New York, 1999), pp. 45–70.

Snyder, A. Z.

A. T. Eggebrecht, S. L. Ferradal, A. Robichaux-Viehoever, M. S. Hassanpour, H. Dehghani, A. Z. Snyder, T. Hershey, and J. P. Culver, “Mapping distributed brain function and networks with diffuse optical tomography,” Nat. Photonics 8(6), 448–454 (2014).
[Crossref]

Srinivasan, S.

H. Dehghani, M. E. Eames, P. K. Yalavarthy, S. C. Davis, S. Srinivasan, C. M. Carpenter, B. W. Pogue, and K. D. Paulsen, “Near infrared optical tomography using nirfast: Algorithm for numerical model and image reconstruction,” Commun. Numer. Meth. Engng. 25(6), 711–732 (2009).
[Crossref]

Stachel, H.

L. Ahnen, H. Stachel, S. Kleiser, C. Hagmann, J. Jiang, A. Kalyanov, S. Lindner, M. Wolf, and S. Sanchez, Development and Validation of a Sensor Prototype for Near-Infrared Imaging of the Newborn Brain (Springer International Publishing, 2017), pp. 163–168.

Sudakou, A.

Taroni, P.

E. Ferocino, A. Pifferi, S. Arridge, F. Martelli, P. Taroni, and A. Farina, “Multi simulation platform for time domain diffuse optical tomography: An application to a compact hand-held reflectance probe,” Appl. Sci. 9(14), 2849 (2019).
[Crossref]

Toronov, V.

Torricelli, A.

Tosi, A.

Uddin, K. M. S.

K. M. S. Uddin and Q. Zhu, “Reducing image artifact in diffuse optical tomography by iterative perturbation correction based on multiwavelength measurements,” J. Biomed. Opt. 24(5), 056005 (2019).
[Crossref]

Wojtkiewicz, S.

Wolf, M.

C. Zhang, S. Lindner, I. M. Antolovic, M. Wolf, and E. Charbon, “A cmos spad imager with collision detection and 128 dynamically reallocating tdcs for single-photon counting and 3d time-of-flight imaging,” Sensors 18(11), 4016 (2018).
[Crossref]

J. Jiang, L. Ahnen, A. Kalyanov, S. Lindner, M. Wolf, and S. S. Majos, “A new method based on graphics processing units for fast near-infrared optical tomography,” Adv. Exp. Med. Biol. 977, 191–197 (2017).

J. Jiang, M. Wolf, and S. S. Majos, “Fast reconstruction of optical properties for complex segmentations in near infrared imaging,” J. Mod. Opt. 64(7), 732–742 (2017).
[Crossref]

J. Jiang, A. di Costanzo, S. Lindner, M. Wolf, and A. Kalyanov, “Tracking objects in a diffusive medium with time domain near infrared optical tomography,” in Biophotonics Congress: Biomedical Optics 2020 (Translational, Microscopy, OCT, OTS, BRAIN), (Optical Society of America, 2020), p. JTu3A.18.

A. Kalyanov, J. Jiang, S. Lindner, L. Ahnen, A. Di Costanzo, J. Mata Pavia, S. Sanchez Majos, and M. Wolf, “Time domain near-infrared optical tomography with time-of-flight spad camera: The new generation,” Biophotonics Congress: Biomedical Optics Congress 2018 (2018).

S. Lindner, C. Zhang, I. M. Antolovic, A. Kalyanov, J. Jiang, L. Ahnen, A. di Costanzo, J. M. Pavia, S. S. Majos, E. Charbon, and M. Wolf, “A novel 32 × 32, 224 mevents/s time resolved spad image sensor for near-infrared optical tomography,” in Biophotonics Congress: Biomedical Optics Congress 2018 (Microscopy/Translational/Brain/OTS), (Optical Society of America, 2018), p. JTh5A.6.

L. Ahnen, H. Stachel, S. Kleiser, C. Hagmann, J. Jiang, A. Kalyanov, S. Lindner, M. Wolf, and S. Sanchez, Development and Validation of a Sensor Prototype for Near-Infrared Imaging of the Newborn Brain (Springer International Publishing, 2017), pp. 163–168.

Wong, J. W.

Yalavarthy, P. K.

H. Dehghani, M. E. Eames, P. K. Yalavarthy, S. C. Davis, S. Srinivasan, C. M. Carpenter, B. W. Pogue, and K. D. Paulsen, “Near infrared optical tomography using nirfast: Algorithm for numerical model and image reconstruction,” Commun. Numer. Meth. Engng. 25(6), 711–732 (2009).
[Crossref]

Zanoletti, M.

Zhang, C.

C. Zhang, S. Lindner, I. M. Antolovic, M. Wolf, and E. Charbon, “A cmos spad imager with collision detection and 128 dynamically reallocating tdcs for single-photon counting and 3d time-of-flight imaging,” Sensors 18(11), 4016 (2018).
[Crossref]

S. Lindner, C. Zhang, I. M. Antolovic, A. Kalyanov, J. Jiang, L. Ahnen, A. di Costanzo, J. M. Pavia, S. S. Majos, E. Charbon, and M. Wolf, “A novel 32 × 32, 224 mevents/s time resolved spad image sensor for near-infrared optical tomography,” in Biophotonics Congress: Biomedical Optics Congress 2018 (Microscopy/Translational/Brain/OTS), (Optical Society of America, 2018), p. JTh5A.6.

Zhao, Q.

Zhu, Q.

K. M. S. Uddin and Q. Zhu, “Reducing image artifact in diffuse optical tomography by iterative perturbation correction based on multiwavelength measurements,” J. Biomed. Opt. 24(5), 056005 (2019).
[Crossref]

Adv. Exp. Med. Biol. (1)

J. Jiang, L. Ahnen, A. Kalyanov, S. Lindner, M. Wolf, and S. S. Majos, “A new method based on graphics processing units for fast near-infrared optical tomography,” Adv. Exp. Med. Biol. 977, 191–197 (2017).

Appl. Opt. (2)

Appl. Sci. (1)

E. Ferocino, A. Pifferi, S. Arridge, F. Martelli, P. Taroni, and A. Farina, “Multi simulation platform for time domain diffuse optical tomography: An application to a compact hand-held reflectance probe,” Appl. Sci. 9(14), 2849 (2019).
[Crossref]

Biomed. Opt. Express (3)

Breast Cancer Res. (1)

V. Ntziachristos and B. Chance, “Probing physiology and molecular function using optical imaging: applications to breast cancer,” Breast Cancer Res. 3(1), 41–46 (2000).
[Crossref]

Commun. Numer. Meth. Engng. (1)

H. Dehghani, M. E. Eames, P. K. Yalavarthy, S. C. Davis, S. Srinivasan, C. M. Carpenter, B. W. Pogue, and K. D. Paulsen, “Near infrared optical tomography using nirfast: Algorithm for numerical model and image reconstruction,” Commun. Numer. Meth. Engng. 25(6), 711–732 (2009).
[Crossref]

IEEE Signal Process. Mag. (1)

D. A. Boas, D. H. Brooks, E. L. Miller, C. A. DiMarzio, M. Kilmer, R. J. Gaudette, and Z. Quan, “Imaging the body with diffuse optical tomography,” IEEE Signal Process. Mag. 18(6), 57–75 (2001).
[Crossref]

Inverse Probl. (1)

S. R. Arridge, “Optical tomography in medical imaging,” Inverse Probl. 15(2), R41–R93 (1999).
[Crossref]

J. Biomed. Opt. (1)

K. M. S. Uddin and Q. Zhu, “Reducing image artifact in diffuse optical tomography by iterative perturbation correction based on multiwavelength measurements,” J. Biomed. Opt. 24(5), 056005 (2019).
[Crossref]

J. Mod. Opt. (1)

J. Jiang, M. Wolf, and S. S. Majos, “Fast reconstruction of optical properties for complex segmentations in near infrared imaging,” J. Mod. Opt. 64(7), 732–742 (2017).
[Crossref]

Nat. Photonics (1)

A. T. Eggebrecht, S. L. Ferradal, A. Robichaux-Viehoever, M. S. Hassanpour, H. Dehghani, A. Z. Snyder, T. Hershey, and J. P. Culver, “Mapping distributed brain function and networks with diffuse optical tomography,” Nat. Photonics 8(6), 448–454 (2014).
[Crossref]

Opt. Express (2)

Phys. Med. Biol. (1)

E. M. Hillman, J. C. Hebden, M. Schweiger, H. Dehghani, F. E. Schmidt, D. T. Delpy, and S. R. Arridge, “Time resolved optical tomography of the human forearm,” Phys. Med. Biol. 46(4), 1117–1130 (2001).
[Crossref]

Sensors (1)

C. Zhang, S. Lindner, I. M. Antolovic, M. Wolf, and E. Charbon, “A cmos spad imager with collision detection and 128 dynamically reallocating tdcs for single-photon counting and 3d time-of-flight imaging,” Sensors 18(11), 4016 (2018).
[Crossref]

Other (5)

S. R. Arridge and M. Schweiger, A General Framework for Iterative Reconstruction Algorithms in Optical Tomography, Using a Finite Element Method (Springer New York, 1999), pp. 45–70.

L. Ahnen, H. Stachel, S. Kleiser, C. Hagmann, J. Jiang, A. Kalyanov, S. Lindner, M. Wolf, and S. Sanchez, Development and Validation of a Sensor Prototype for Near-Infrared Imaging of the Newborn Brain (Springer International Publishing, 2017), pp. 163–168.

S. Lindner, C. Zhang, I. M. Antolovic, A. Kalyanov, J. Jiang, L. Ahnen, A. di Costanzo, J. M. Pavia, S. S. Majos, E. Charbon, and M. Wolf, “A novel 32 × 32, 224 mevents/s time resolved spad image sensor for near-infrared optical tomography,” in Biophotonics Congress: Biomedical Optics Congress 2018 (Microscopy/Translational/Brain/OTS), (Optical Society of America, 2018), p. JTh5A.6.

A. Kalyanov, J. Jiang, S. Lindner, L. Ahnen, A. Di Costanzo, J. Mata Pavia, S. Sanchez Majos, and M. Wolf, “Time domain near-infrared optical tomography with time-of-flight spad camera: The new generation,” Biophotonics Congress: Biomedical Optics Congress 2018 (2018).

J. Jiang, A. di Costanzo, S. Lindner, M. Wolf, and A. Kalyanov, “Tracking objects in a diffusive medium with time domain near infrared optical tomography,” in Biophotonics Congress: Biomedical Optics 2020 (Translational, Microscopy, OCT, OTS, BRAIN), (Optical Society of America, 2020), p. JTu3A.18.

Cited By

OSA participates in Crossref's Cited-By Linking service. Citing articles from OSA journals and other participating publishers are listed here.

Alert me when this article is cited.


Figures (7)

Fig. 1. (a) Schematic of the Pioneer system and (b) picture of the Pioneer probe placed on a cylindrical silicone phantom.
Fig. 2. Laser-SPAD response functions at four sample wavelengths, measured in reflection mode by pointing the laser at a white paper and capturing the reflected light with the SPAD camera. Note that the wavelength-dependent dispersion in time and the additional time delays caused by different fiber lengths were eliminated by numerically applying pre-measured, fixed time shifts to the TD data. The response functions are so similar across wavelengths that the wavelength-ratio approach calibrates them out easily.
Fig. 3. (a) The cylindrical silicone phantom used in the experiment has a diameter of 114 mm and a height of 55 mm; both small spheres have a radius of 5 mm. (b) The two inclusions were embedded at depths of 10 mm and 15 mm, respectively, with a lateral distance of 10 mm between them. (c) Mesh created for image reconstruction. Note that it is smaller than the actual phantom volume in (a): a cylinder of 60 mm diameter and 35 mm height. 221 detectors were selected within a circle of approximately 17.5 mm diameter, and all 11 sources were utilized.
Fig. 4. Reconstruction of the phantom with two inclusions at 689 nm and 725 nm. Cross-sections of (a)(e)(i)(m) the ground truth (GT), (b)(f)(j)(n) the reconstructed results for the simulation (Sim), (c)(g)(k)(o) the non-calibrated reconstruction (Meas) and (d)(h)(l)(p) the auto-calibrated reconstruction (MeasC) of the experimental TD data at the two depths of 10 mm and 15 mm. Note that the ground-truth images were generated with NIRFAST.
Fig. 5. TOF histograms from three pixels (locations shown in the inset) for the phantom measurement at 689 nm. The time bin size is 48.8 ps.
Fig. 6. Distributions of the reconstructed $\mu _a$ along one direction, (a) at a depth of 10 mm and (b) at 15 mm for 689 nm, and (c) at 10 mm and (d) at 15 mm for 725 nm, for the ground truth (GT), simulation (Sim), non-calibrated measurement (Meas) and auto-calibrated measurement (MeasC).
Fig. 7. Mean squared error (MSE) and contrast-to-noise ratio (CNR) for the reconstructed results at 689 nm and 725 nm for simulation (Sim), non-calibrated measurement (Meas) and auto-calibrated measurement (MeasC).

Tables (1)

Table 1. Optical properties of the silicone phantom

Equations (16)

$\left[-\nabla\cdot\kappa(r)\nabla+\mu_a(r)+\frac{1}{c_0(r)}\frac{\partial}{\partial t}\right]\phi(r,t)=q(r,t),$
$\phi(r,t)+2\zeta(c_0)\,\kappa(r)\,\frac{\partial\phi(r,t)}{\partial\nu}=0,\quad r\in\partial\Omega,$
$\left[-\nabla\cdot\kappa(r)\nabla+\mu_a(r)+\frac{i\omega}{c_0(r)}\right]\Phi(r,\omega)=Q(r,\omega),$
$\mu=\underset{\mu}{\arg\min}\left\{\left\lVert\Phi^{S}(\mu)-\Phi^{M}\right\rVert_2^2+\Gamma(\mu)\right\},$
$\tilde{F}(\omega)=\mathcal{F}\left[\alpha f(t)\ast g(t)\right]=\alpha F(\omega)G(\omega),$
$\tilde{R_i}(\omega)=\frac{\tilde{F}(\omega)_{\lambda_i}}{\tilde{F}(\omega)_{\lambda_0}}=\frac{\alpha F(\omega)_{\lambda_i}G(\omega)_{\lambda_i}}{\alpha F(\omega)_{\lambda_0}G(\omega)_{\lambda_0}}=\frac{F(\omega)_{\lambda_i}}{F(\omega)_{\lambda_0}}=R_i(\omega).$
$\mu_{\lambda_i,\lambda_0}=\underset{\mu_{\lambda_i,\lambda_0}}{\arg\min}\left\{\left\lVert R_i^{S}(\mu_{\lambda_i,\lambda_0})-R_i^{M}\right\rVert_2^2+\Gamma(\mu_{\lambda_i,\lambda_0})\right\}.$
$\chi^2=\sum_{j=1}^{N_m}\left((R_1^{S})_j-(R_1^{M})_j\right)^2+\beta\sum_{k=1}^{N_n}\left((\mu_{\lambda_0,\lambda_1})_k-(\mu_{\lambda_0,\lambda_1})_0\right)^2.$
$\delta\mu_{\lambda_0,\lambda_1}=\left(J_R^{T}J_R+\beta I\right)^{-1}J_R^{T}\,\delta R_1,$
$\frac{\partial\ln\lvert R_1\rvert}{\partial\mu_a^{\lambda_0}}=\frac{\partial\left(\ln I_{\lambda_1}-\ln I_{\lambda_0}\right)}{\partial\mu_a^{\lambda_0}}=-\frac{\partial\ln I_{\lambda_0}}{\partial\mu_a^{\lambda_0}},$
$\frac{\partial\ln\lvert R_1\rvert}{\partial\mu_a^{\lambda_1}}=\frac{\partial\left(\ln I_{\lambda_1}-\ln I_{\lambda_0}\right)}{\partial\mu_a^{\lambda_1}}=\frac{\partial\ln I_{\lambda_1}}{\partial\mu_a^{\lambda_1}},$
$\frac{\partial\theta(R_1)}{\partial\mu_a^{\lambda_0}}=\frac{\partial\left(\theta_{\lambda_1}-\theta_{\lambda_0}\right)}{\partial\mu_a^{\lambda_0}}=-\frac{\partial\theta_{\lambda_0}}{\partial\mu_a^{\lambda_0}},$
$\frac{\partial\theta(R_1)}{\partial\mu_a^{\lambda_1}}=\frac{\partial\left(\theta_{\lambda_1}-\theta_{\lambda_0}\right)}{\partial\mu_a^{\lambda_1}}=\frac{\partial\theta_{\lambda_1}}{\partial\mu_a^{\lambda_1}}.$
$\mu_{\lambda_i,\lambda_0}=\underset{\mu_{\lambda_i,\lambda_0}}{\arg\min}\left\{\left\lVert F_1^{S}(\mu_{\lambda_i,\lambda_0})-F_1^{M}\right\rVert_2^2+\Gamma(\mu)\right\}.$
$\mathrm{MSE}=\frac{1}{n}\sum_{i=1}^{n}\left(\mu_a^{i}-\hat{\mu}_a^{i}\right)^2$
$\mathrm{CNR}=10\times\log_{10}\left\{\frac{\sum_{i=1}^{n}\left(\hat{\mu}_a^{i}\right)^2}{\sum_{i=1}^{n}\left(\mu_a^{i}-\hat{\mu}_a^{i}\right)^2}\right\}$
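To make the wavelength-normalized data concrete, the following minimal NumPy sketch (our illustration, not the Pioneer reconstruction code; the histogram arrays and the modulation frequency `f_mod_hz` are hypothetical inputs) forms the frequency-domain ratio of two TOF histograms recorded at wavelengths $\lambda_i$ and $\lambda_0$ for the same source-detector pair. Because both histograms share the same instrument response $g(t)$ and gain $\alpha$, these factors cancel in the ratio, which is the quantity $R_i(\omega)$ appearing in the ratio-based objective above.

```python
import numpy as np

def wavelength_ratio(tof_hist_li, tof_hist_l0, bin_width_ps=48.8, f_mod_hz=100e6):
    """Frequency-domain ratio of two TOF histograms of the same source-detector
    pair, recorded at wavelengths lambda_i and lambda_0.

    Both histograms share the same instrument response G(w) and gain alpha, so
    these factors cancel in the ratio, leaving R_i(w) = F_{li}(w) / F_{l0}(w).
    Note: f_mod_hz is an illustrative modulation frequency, not a value taken
    from the paper.
    """
    dt = bin_width_ps * 1e-12                       # TDC bin width in seconds
    freqs = np.fft.rfftfreq(len(tof_hist_l0), d=dt) # frequency axis of the FFT
    spec_li = np.fft.rfft(tof_hist_li)              # ~ alpha * F_{li}(w) * G(w)
    spec_l0 = np.fft.rfft(tof_hist_l0)              # ~ alpha * F_{l0}(w) * G(w)
    k = np.argmin(np.abs(freqs - f_mod_hz))         # FFT bin nearest to f_mod
    ratio = spec_li[k] / spec_l0[k]                 # IRF and gain cancel here
    return np.log(np.abs(ratio)), np.angle(ratio)   # log-amplitude and phase
```

The log-amplitude and phase returned here are the data whose derivatives with respect to $\mu_a^{\lambda_0}$ and $\mu_a^{\lambda_1}$ appear in the four Jacobian relations listed above.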
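Similarly, the two image-quality metrics defined by the last two equations can be evaluated directly from the true and reconstructed absorption maps. The sketch below assumes both maps are given as NumPy arrays of the same shape (array names are illustrative).

```python
import numpy as np

def mse(mu_a_true, mu_a_rec):
    """Mean squared error between the true and reconstructed mu_a maps."""
    return np.mean((mu_a_true - mu_a_rec) ** 2)

def cnr_db(mu_a_true, mu_a_rec):
    """Contrast-to-noise ratio in dB, as defined by the last equation above."""
    signal = np.sum(mu_a_rec ** 2)
    noise = np.sum((mu_a_true - mu_a_rec) ** 2)
    return 10.0 * np.log10(signal / noise)
```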
