## Abstract

Recently introduced techniques based on the angular memory effect enable non-invasive imaging of objects hidden behind thin scattering layers. However, both speckle-correlation and bispectrum analysis rely on the statistical average over large numbers of speckle grains, which means they can hardly access the important information contained in the point-spread function (PSF) of a highly scattering imaging system. Here, inspired by notions used in astronomy, we present a phase-diversity speckle imaging scheme based on recording a sequence of intensity speckle patterns at various imaging planes, and experimentally demonstrate that, in addition to retrieving the image of hidden objects, we can simultaneously estimate the pupil function and the PSF of a highly scattering imaging system without any guide-star or reference.

© 2017 Optical Society of America

## 1. Introduction

The interaction between light and complex samples with an inhomogeneous refractive index induces light scattering in many imaging scenarios. Scattering is usually seen as an obstacle for imaging objects hidden inside or behind such samples: it makes direct observation impossible and instead generates a complex speckle pattern [1]. In recent years, wavefront shaping techniques have emerged as a powerful tool for imaging hidden objects or focusing through highly scattering media by controlling the incident light [2–12]. However, these techniques are complex and lengthy, since they require a detector or an optical/acoustical probe in the plane of interest. A recent breakthrough reported by Bertolotti et al. [13] avoided the use of guide-stars and enabled non-invasive imaging through thin scattering layers by exploiting the inherent angular correlations, known as the “memory effect”, in the scattered speckle patterns [14, 15]. When an object is placed within the range determined by the angular memory effect, the angular signal is the convolution of the object with the system’s point-spread function (PSF), which is a highly complex speckle pattern, as generated by any point source on the object. Since the autocorrelation of the PSF is close to a $\delta $ function, the Fourier amplitude of the object can be retrieved from a large speckle pattern (i.e. one containing sufficient speckle grains) by calculating its autocorrelation, and the lost phase information is recovered via an iterative phase-retrieval algorithm [16]. Katz et al. [17] put forward a single-shot version of this concept, inspired by astronomical techniques: they regarded the scattering imaging system as an incoherent imaging system and directly obtained the signal on a camera.
Bispectrum analysis [18] also allows imaging from a single image in the same scenario, by exploiting the fact that the Fourier phase of an object can be deterministically extracted from a large speckle pattern, relying on the property that the bispectrum of the system’s PSF is real-valued [19]. Although the speckle-correlation and the bispectrum analysis can retrieve hidden objects, they cannot directly access the scatterer’s exact phase distortion, i.e. the PSF, since both methods are based on the idea of statistical averaging and the influence of the PSF is eliminated or ignored in the reconstruction process. Alternatively, if a point source is present, one can measure the intensity PSF of the highly scattering imaging system and then perform image reconstruction by deconvolution [20]. However, it is inconvenient and impractical to introduce such a guide-star in most real applications, e.g. in biomedical imaging. Yet, the prospect of accessing not only the object, but also the scattering layer’s properties and the exact PSF, would be highly beneficial for non-invasive imaging, for instance to image more complex objects.

In this work, we experimentally demonstrate a non-invasive imaging scheme based on the angular memory effect. Inspired by the phase-diversity technique used in astronomical imaging [21], a sequence of speckle patterns from multiple planes is sequentially collected by a translated camera. From only a small region of such a speckle-pattern sequence, we can jointly retrieve the image of incoherently illuminated objects hidden behind a thin, but highly scattering layer, and estimate the local PSF of the highly scattering imaging system, without any reference or additional experimental constraint on the object side. In addition, our method is straightforward to implement and, to the best of our knowledge, is the first experimental demonstration that the phase-diversity technique can be used not only in the aberration regime, but also in highly scattering cases, as was envisioned in the early phase-diversity work [21].

## 2. Principle and numerical simulations

The principle of the experiments for phase-diversity speckle imaging and a numerical simulation are presented in Fig. 1. An object, hidden at a distance *u* behind a highly scattering medium, is illuminated by a spatially incoherent and narrowband source. The scattered light is recorded by a camera, which is initially placed at a distance *v* from the other side of the scattering medium. Since the object is located within the range determined by angular-memory-effect, each point on the object generates a nearly identical, but shifted, random speckle pattern on the camera. The camera image is a simple incoherent superposition of these shifted random speckle patterns, generated by all the points on the object. Therefore, it allows the system in Fig. 1(a) to be regarded as an incoherent imaging system with a shift-invariant PSF, i.e. the identical random speckle pattern, and the camera image is the convolution of the object and the PSF [17]. In order to jointly retrieve the image of the object and estimate the PSF of the system, we continuously change the position of the camera with a fixed interval $\delta $ and collect the camera image ${I}_{n}\left(x\right)$ at each position [Fig. 1(b)], which reads:

$${I}_{n}\left(x\right)=O\left(x\right)\ast {S}_{n}\left(x\right)+{w}_{n}\left(x\right),\quad n=1,2,\dots ,N$$

where *N* denotes the number of diversity images, $O\left(x\right)$ is the image of the object, ${S}_{n}\left(x\right)$ denotes the corresponding PSF of the system when the camera is placed at the *n*-th position, $\ast$ denotes convolution, and ${w}_{n}\left(x\right)$ is a noise term. Since the speckle pattern is very large (up to millions of speckle grains), the full problem is numerically extremely challenging. We therefore apply a phase-diversity algorithm to various sub-regions of the speckle patterns, and simultaneously retrieve the image of the hidden object [Fig. 1(c)] and the corresponding sub-PSF [Fig. 1(d)] for the selected sub-region. Detailed information on the method is given below and in Fig. 2.
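As a toy illustration of this forward model, the sketch below (a hedged Python/NumPy example, not the paper's code; the synthetic PSFs and sizes are arbitrary) simulates diversity images as circular convolutions of an object with per-plane speckle PSFs plus additive Gaussian noise:

```python
import numpy as np

def diversity_images(obj, psfs, noise_sigma=0.0, seed=0):
    """Simulate I_n = O * S_n + w_n using FFT-based circular convolution.
    obj: 2-D object intensity; psfs: list of 2-D speckle PSFs."""
    rng = np.random.default_rng(seed)
    obj_f = np.fft.fft2(obj)
    images = []
    for psf in psfs:
        img = np.real(np.fft.ifft2(obj_f * np.fft.fft2(psf)))
        images.append(img + noise_sigma * rng.standard_normal(obj.shape))
    return images

# sanity check: a point source at the origin reproduces each PSF (noise-free)
obj = np.zeros((64, 64))
obj[0, 0] = 1.0
psfs = [np.random.default_rng(n).random((64, 64)) for n in range(3)]
imgs = diversity_images(obj, psfs)
```

Circular boundary conditions are a simplification; the experimental images are apodized sub-regions, as discussed below.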

To verify the uniqueness of the solution of the estimated sub-PSF obtained via the phase-diversity speckle imaging method, two other simulated hidden objects, i.e. the “letter H” and the “double stars”, are respectively used to estimate the sub-PSF of the same scattering imaging system as is used for the “digit 2”. The corresponding results are shown in Fig. 3: the first row shows the retrieved images of the different hidden objects, and the second row shows the sub-PSFs estimated from each object. The three recovered sub-PSFs are almost identical, demonstrating that the iterative reconstructions converge to a unique solution, independently of the hidden object.

As an early technique in astronomical imaging, phase diversity has been used to estimate unknown phase aberrations and eliminate the phase distortions caused by atmospheric turbulence, typically by taking one or several snapshots around the focus, which produces a set of blurred but recognizable images for weak aberrations [21]. In highly scattering cases, however, only a 3D propagating speckle pattern is produced and sampled by the camera, containing no obvious visual information about the hidden object. The illuminated area on the scattering layer, or any aperture placed between the scattering layer and the camera, is regarded as the pupil of the highly scattering imaging system, and its generalized pupil function can be defined. When introducing diversity, the set of generalized pupil functions of the highly scattering imaging system can still be written in its most general form as follows (see derivation in Appendix A):

$${H}_{n}\left(f\right)=\left|{H}_{n}\left(f\right)\right|\mathrm{exp}\left\{i\left[\varphi \left(f\right)+{\theta}_{n}\left(f\right)\right]\right\}$$

where *f* is the position in the pupil, $\varphi \left(f\right)$ is the unknown random phase on the pupil, and ${\theta}_{n}\left(f\right)$ denotes the known phase function corresponding to the *n*-th diversity image. We consider that the modulus of the generalized pupil function $\left|{H}_{n}\left(f\right)\right|$ is 1 over a round pupil and 0 elsewhere. The effective pupil diameter is determined by the lateral decorrelation length of the speckle grains, $\Delta X=1.0\lambda \left(v/D\right)$ [1], where *D* denotes the diameter of the aperture stop placed between the scattering medium and the camera. As was envisioned in Ref. [21], the phase-diversity technique can not only solve the problem of weak phase distortions in the aberration regime, but also handle any other kind of phase fluctuation, e.g. in highly scattering cases: the totally random and complex phase function $\varphi \left(f\right)$ can still be parametrized by suitable basis functions without loss of generality, and the parameters can represent the point-by-point phase values, in which case the basis functions are Kronecker delta functions. In order to determine these complex phase values, multiple diversity images are required to provide sufficient information. For simplicity and accessibility, we use a simple axial translation of the camera, which corresponds to the known parabolic diversity phase function of defocus [22]:

$${\theta}_{n}\left(f\right)=\frac{\pi n\delta}{\lambda v\left(v+n\delta \right)}{\left|f\right|}^{2}$$

where *v* is the imaging distance and $\left|f\right|$ is the distance to the center of the pupil.
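For concreteness, the parabolic defocus diversity phase can be tabulated on a discrete pupil grid. The helper below is an illustrative Python/NumPy sketch; the grid size, pixel pitch, and distances are placeholder values, not the experimental ones:

```python
import numpy as np

def defocus_phase(n, shape, pitch, wavelength, v, delta):
    """Defocus diversity phase theta_n(f) = pi*n*delta*|f|^2 / (lambda*v*(v+n*delta)),
    sampled on a grid where |f| is the distance (in meters) to the pupil center."""
    yy, xx = np.indices(shape)
    cy, cx = (shape[0] - 1) / 2.0, (shape[1] - 1) / 2.0
    r2 = ((xx - cx) * pitch) ** 2 + ((yy - cy) * pitch) ** 2
    return np.pi * n * delta * r2 / (wavelength * v * (v + n * delta))

# n = 0 gives a flat (zero) diversity phase; larger n bends the phase more
theta0 = defocus_phase(0, (32, 32), 10e-6, 625e-9, 0.12, 1e-3)
theta1 = defocus_phase(1, (32, 32), 10e-6, 625e-9, 0.12, 1e-3)
```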

If we assume that additive white Gaussian noise is the dominant noise (e.g. thermal noise) in the speckle patterns on the camera, a cost function can be found to estimate the local phase values on the pupil of the system via a maximum-likelihood estimation [21, 23]:

$$L\left[\varphi \left(f\right)\right]={\displaystyle \sum_{f\text{'}}}\frac{{\left|{\displaystyle \sum_{n=1}^{N}}{\hat{I}}_{n}\left(f\text{'}\right){\hat{S}}_{n}^{*}\left(f\text{'}\right)\right|}^{2}}{{\displaystyle \sum_{n=1}^{N}}{\left|{\hat{S}}_{n}\left(f\text{'}\right)\right|}^{2}+\sigma}$$

where ${\hat{I}}_{n}$ and ${\hat{S}}_{n}$ denote the Fourier transforms of the diversity images ${I}_{n}$ and of the PSFs ${S}_{n}$, respectively, and $\sigma$ is a small regularization parameter.

A nonlinear optimization algorithm can be used to estimate the random phases $\varphi \left(f\right)$ via the cost function of Eq. (4). To ensure a relatively high efficiency of the optimization, we choose the quasi-Newton method, which retains convergence properties close to those of Newton’s method while avoiding its heavy computation of the Hessian matrix [24]. The search direction in the quasi-Newton method is given by ${d}_{i}=-{H}_{i}\nabla g\left[{\varphi}_{i}\left(f\right)\right]$, where $\nabla (\cdot )$ is the gradient operator and *i* denotes the iteration index. Here $g\left[{\varphi}_{i}\left(f\right)\right]=-L\left[{\varphi}_{i}\left(f\right)\right]$, since maximizing the likelihood function of Eq. (4) is equivalent to minimizing its opposite. ${H}_{i}$ is a computed matrix approximating the inverse of the Hessian matrix of *g* at the current iteration. Many formulas are available to update ${H}_{i}$ at each iteration; we choose the Broyden–Fletcher–Goldfarb–Shanno (BFGS) formula [24], which is considered the most effective in general cases. ${\varphi}_{0}\left(f\right)$ is the initial guess of the unknown random phases and ${H}_{0}$ is the initial value of the computed matrix, which can be initialized as an identity matrix. A line search is used to update the estimate as ${\varphi}_{i+1}\left(f\right)={\varphi}_{i}\left(f\right)+{\alpha}^{*}{d}_{i}$, where ${\alpha}^{*}$ is the step length minimizing $g\left[{\varphi}_{i}\left(f\right)+\alpha {d}_{i}\right]$. The final estimate of the random phase values is obtained after a number of iterations.
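The quasi-Newton machinery can be sketched with SciPy's BFGS implementation on a deliberately simplified stand-in for the cost $g=-L$: a tiny hypothetical 1-D "pupil" whose unknown phases are fitted so that the resulting intensity pattern matches simulated data (a toy least-squares cost, not Eq. (4) itself):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
M = 8
phi_true = rng.uniform(-0.5, 0.5, M)      # unknown "pupil" phases (toy 1-D model)

def intensity(phi):
    """Intensity PSF of a 1-D unit-modulus pupil with phase phi."""
    return np.abs(np.fft.ifft(np.exp(1j * phi))) ** 2

data = intensity(phi_true)                 # simulated noiseless measurement

def g(phi):
    """Cost to minimize: a least-squares analogue of g = -L."""
    return np.sum((intensity(phi) - data) ** 2)

# BFGS builds an approximate inverse Hessian H_i and line-searches along d_i
res = minimize(g, np.zeros(M), method="BFGS")
```

In the paper's setting the unknowns are the point-by-point pupil phases of a sub-region and the cost is the regularized likelihood of Eq. (4); the optimizer mechanics are the same.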

The PSF of the system is the squared modulus of the coherent impulse response function, which is the inverse Fourier transform of the generalized pupil function (See derivation in Appendix A):

$${S}_{n}\left(x\right)={\left|{F}^{-1}\left\{{H}_{n}\left(f\right)\right\}\right|}^{2}$$

where ${F}^{-1}\{\cdot \}$ denotes the inverse Fourier transform. The image of the hidden object is simultaneously retrieved from the Fourier transforms ${\hat{I}}_{n}\left(f\right)$ of the diversity images and ${\hat{S}}_{n}\left(f\right)$ of the estimated PSFs [21, 23]:

$$O\left(x\right)={F}^{-1}\left\{\frac{{\displaystyle \sum_{n=1}^{N}}{\hat{I}}_{n}\left(f\right){\hat{S}}_{n}^{*}\left(f\right)}{{\displaystyle \sum_{n=1}^{N}}{\left|{\hat{S}}_{n}\left(f\right)\right|}^{2}+\sigma}\right\}$$

Interestingly, since the phase-diversity speckle imaging method is not based on the statistical average of speckle grains, a small region of the speckle pattern, as marked by the dashed box in Fig. 2, is sufficient to retrieve the hidden object and simultaneously estimate a sub-PSF of the system, provided the region is larger than the object autocorrelation. Larger sub-images may certainly enable the estimation of more complex sub-PSFs and objects, since the sub-image size in turn determines the largest hidden object that can be recovered by the deconvolution method. However, increasing the sub-image size strongly increases the computational complexity and the number of diversity images required, while it would not improve the resolution or the contrast of the final image.

Considering a sub-region of the speckle pattern gives rise to an inevitable problem: the discontinuities at the edges of the sub-image [25]. These discontinuities produce artifacts in the Fourier transforms, which are implemented as two-dimensional discrete fast Fourier transforms, and the artifacts affect the accuracy of the reconstruction of the object and of the estimation of the sub-PSF. In our method, we overcome this problem by smoothing the edges of the sub-image with an apodization function (e.g. a Hanning window). It is also worth noting that the lateral diffusion of the scattered light during axial propagation causes a boundary effect: when the camera is moved in the axial direction, the scattered light propagates and therefore diffuses laterally (together with a global translation for off-axis regions). Because the size of the phase-diversity images is kept fixed, there is always a mismatch between the propagated images and the captured sub-images.
In order to minimize the influence of the boundary effect, the sub-image is advantageously selected near the optical axis and the maximum translation of the camera from its initial position should be limited; the limit depends on the number of speckle grains contained in the sub-image and on the amount of translation. As an example, in the numerical simulations of Figs. 1 and 3, the lateral decorrelation length is 2.25 pixels and the sub-images are 70 pixels wide, corresponding to around 31 speckle grains. The 19 diversity images collected span in total 6 axial decorrelation lengths, meaning that we sample 6 speckle grains along the axial direction. This number is well below the number of speckle grains contained in one lateral dimension of the sub-image and the apodization window, ensuring a limited boundary effect.
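The edge apodization described above amounts to multiplying each sub-image by a separable 2-D Hann window; a minimal sketch (window choice per the text, the sub-image content here is a placeholder):

```python
import numpy as np

def apodize(sub_image):
    """Taper sub-image edges to zero with a separable 2-D Hanning window,
    suppressing wrap-around discontinuities in the discrete Fourier transform."""
    wy = np.hanning(sub_image.shape[0])
    wx = np.hanning(sub_image.shape[1])
    return sub_image * np.outer(wy, wx)

sub = np.ones((70, 70))        # 70-pixel sub-image, as in the simulation
apodized = apodize(sub)
```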

## 3. Experimental results

Figure 4 shows the optical setup of the experimental demonstration of this concept. The light source is a spatially incoherent light-emitting diode (Thorlabs, M625L3) with a nominal wavelength of 625nm and a bandwidth of 18nm, filtered by a narrow band-pass filter (Andover, 633FS02-50, 1.0+/−0.2nm) mounted on the camera (Andor, ZYLA-5.5-USB 3.0) to ensure a high contrast of the speckle patterns. The incoherent light illuminates the object, digit “2” (Edmund, 1951 USAF Negative Target, 2” × 2”, ~350um), which is shown in Fig. 5(a) and hidden ~60cm behind a scattering medium (Edmund, Ground Glass Diffuser). The illuminated area on the scattering medium is adjusted by an iris with a diameter of 5.21mm, placed directly against it, to control the size of the speckle grains. The camera is initially placed at a distance of ~12cm in front of the scattering medium to collect the scattered light, and a 50mm linear translation stage (Thorlabs, DDSM50) is used to move the camera.

Figure 5(b) shows some of the 19 acquired frames of raw diversity camera images, taken with a fixed interval $\delta =\Delta Z/3$, where $\Delta Z=7.1\lambda {\left(v/D\right)}^{2}$ is the axial decorrelation length [1]. The camera images are spatially normalized by the slowly varying envelope of the scattered light pattern, and then smoothed by a Gaussian kernel. We select three independent groups of sub-images around the center of the processed camera images. Each sub-image contains 84 camera pixels in one dimension and is smoothed by a Hanning window to remove the edge discontinuities in the discrete Fourier transforms and limit the boundary effect. A quasi-Newton algorithm from the Matlab Optimization Toolbox is used to solve the optimization problem of Eq. (4) for each group of sub-images. The initial guess of the local phase values is zero and the parameter $\sigma $ is ${10}^{-5}$, selected by trial and error: if $\sigma $ is too small, the reconstructions degrade because errors are amplified, while if it is too large, the reconstructions are overly smoothed. The optimized random phases are estimated after 1200 iterations, as shown in the first column of Fig. 5(c). The estimated sub-PSFs of the system and the retrieved images of the hidden object then follow directly from Eq. (5) and Eq. (6).
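The preprocessing of the raw frames (envelope normalization followed by light smoothing) can be sketched as follows; the two Gaussian widths are illustrative guesses, not the values used in the experiment:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def preprocess(raw, envelope_sigma=40.0, smooth_sigma=1.0):
    """Divide out the slowly varying intensity envelope (estimated with a
    wide Gaussian blur), then smooth the flattened frame with a narrow one."""
    envelope = gaussian_filter(raw, envelope_sigma)
    flattened = raw / np.maximum(envelope, 1e-12)
    return gaussian_filter(flattened, smooth_sigma)

frame = 0.5 + np.random.default_rng(1).random((256, 256))
processed = preprocess(frame)
```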

As presented, our phase-diversity speckle imaging method can not only image the hidden object behind a thin, but complex scattering layer, as the speckle-correlation and bispectrum-analysis methods do, but can also estimate the pupil function and the PSF of a highly scattering imaging system without any reference. Interestingly, as long as the PSF is known, we can directly and efficiently recover the image of other hidden objects from their complex speckle patterns via simple deconvolutions, instead of repeating the foregoing procedure or using other complex imaging methods.

As a demonstration of this concept, after estimating the PSF, we replace digit “2” with another object, digit “3” (~350um), which comes from the same USAF target and is shown in Fig. 6(a). The speckle pattern for digit “3”, shown in Fig. 6(b), is captured by the camera at the same initial position as for digit “2”. After applying the same preprocessing to the raw camera image, we select sub-images in the same areas where the sub-PSFs have been estimated. A Hanning window is used to smooth each sub-image, as shown in the first column of Fig. 6(c). Then, deconvolutions are implemented by applying the Lucy-Richardson method with 30 to 50 iterations, using code from the Matlab Image Processing Toolbox. The deconvolution results of the three groups are shown in Fig. 6(c), which demonstrates the feasibility of the deconvolution approach and, moreover, verifies the validity of the sub-PSFs estimated via our phase-diversity speckle imaging method. Beyond this simple demonstration, more complex objects, and even broadband or polychromatic objects, could in principle be retrieved, since the local pupil phase function is known.
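The deconvolution step can also be reproduced without the Matlab toolbox; below is a minimal FFT-based Richardson-Lucy sketch (circular boundary conditions and a fixed iteration count — a simplified stand-in for the toolbox routine, with a synthetic sanity check rather than the experimental data):

```python
import numpy as np

def richardson_lucy(image, psf, n_iter=30):
    """Richardson-Lucy deconvolution with FFT-based circular convolution.
    image: blurred observation; psf: estimated point-spread function."""
    psf = psf / psf.sum()
    otf = np.fft.fft2(psf)
    estimate = np.full_like(image, image.mean())
    for _ in range(n_iter):
        reblurred = np.real(np.fft.ifft2(np.fft.fft2(estimate) * otf))
        ratio = image / np.maximum(reblurred, 1e-12)
        # correlate the ratio with the PSF (conjugate OTF) and update
        estimate = estimate * np.real(np.fft.ifft2(np.fft.fft2(ratio) * np.conj(otf)))
    return estimate

# sanity check: with a delta PSF, one iteration returns the image itself
img = 0.1 + np.random.default_rng(2).random((32, 32))
delta = np.zeros((32, 32))
delta[0, 0] = 1.0
recovered = richardson_lucy(img, delta, n_iter=1)
```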

Different regions of the speckle pattern are uncorrelated, which means that the local PSF for one sub-image of the speckle pattern is unique, and that the hidden object cannot be recovered by deconvolution with the local PSF of another region. As an example, we use the sub-PSFs estimated from the different groups of sub-images to reconstruct only the sub-image of group 1, which is shown in Fig. 7(a). Figure 7(b) is the deconvolution result obtained with its corresponding sub-PSF, estimated from group 1, while no information about the hidden object can be recovered if we use either of the other two sub-PSFs to reconstruct the sub-image of group 1, as shown respectively in Figs. 7(c) and 7(d).

## 4. Conclusion

In conclusion, we experimentally demonstrated a phase-diversity speckle imaging method, which allows non-invasive imaging of hidden objects behind a thin, but highly complex scattering layer, and can jointly estimate the pupil function and the PSF of a highly scattering imaging system without any reference. Since our method is not based on the statistical average of large amounts of speckle grains, only a small region of the speckle pattern is sufficient, at the cost of acquiring multiple frames. The smallest dimension of the sub-image depends on the image of the hidden objects, whose size can be roughly determined from the autocorrelation of the camera image, and on the amount of translation performed. A larger region certainly allows a larger PSF to be acquired, but it also means more complex phase values on the pupil to determine, which increases the computational complexity. Furthermore, while we only obtain the local PSF within a sub-region of the speckle pattern, the process can easily be parallelized over multiple sub-apertures, and a complete, high-resolution reconstruction of the scatterer’s topography could in principle be achieved, as in ptychography. For this proof of concept, we used a simple quasi-Newton algorithm to solve the optimization problem, which may be trapped in local minima and fail to converge. Slightly changing the available information (e.g. the number of frames of speckle patterns) or using random initial guesses are potential remedies. In addition, other advanced optimization algorithms are expected to perform more efficiently. To this end, we freely make the source code and experimental data available to the scientific community, as Code 1 (Ref. [26]) and Dataset 1 (Ref. [27]).

## Appendix A Generalized pupil function in a highly scattering imaging system

Figure 8 is a schematic of the impulse response of a highly scattering imaging system. Before we define a generalized pupil function of a highly scattering imaging system, we first consider its impulse response, which is defined as $h\left(u,\text{\hspace{0.17em}}v;\epsilon ,\text{\hspace{0.17em}}\eta \right)$ created by a point source (i.e. a $\delta $ function) at the plane of coordinates $\left(\epsilon ,\text{\hspace{0.17em}}\eta \right)$ [22, 28]. Ignoring the time-dependent phase factor $\mathrm{exp}\left(j\omega t\right)$, and considering the paraxial approximation and dropping the constant phase factor, the field distribution on the front surface of the scattering layer can be written:

If only the intensity distribution is of interest, the quadratic phase factor outside the integral can be discarded, and the impulse response can be further simplified as follows:

Equation (12) demonstrates that the impulse response is the inverse Fourier transform of $H\left(x,\text{\hspace{0.17em}}y\right)$, which is the generalized pupil function introduced in the highly scattering imaging system and can be expressed as:

$${H}_{n}\left(x,\text{\hspace{0.17em}}y\right)=\left|H\left(x,\text{\hspace{0.17em}}y\right)\right|\mathrm{exp}\left\{i\left[\varphi \left(x,\text{\hspace{0.17em}}y\right)+{\theta}_{n}\left(x,\text{\hspace{0.17em}}y\right)\right]\right\}$$

where *n* is the index of the diversity image, varying from 0 to *N*−1, and *N* is the number of diversity images. $n\Delta z$ denotes the defocus distance at each acquisition if the diversity images are captured with an equidistant interval $\Delta z$, and ${\theta}_{n}\left(x,\text{\hspace{0.17em}}y\right)=\left[\frac{n\Delta z\pi}{\lambda {z}_{2}\left({z}_{2}+n\Delta z\right)}\right]\left({x}^{2}+{y}^{2}\right)$ denotes each defocus phase function. If we use the symbol *f* to represent the position at the plane of the scattering layer, instead of the original coordinates $\left(x,\text{\hspace{0.17em}}y\right)$, Eq. (14) takes the same form as Eq. (2), i.e. ${H}_{n}\left(f\right)=\left|{H}_{n}\left(f\right)\right|\mathrm{exp}\left\{i\left[\varphi \left(f\right)+{\theta}_{n}\left(f\right)\right]\right\}$.

## Funding

National Natural Science Foundation of China (NSFC) (61575154); European Research Council (ERC) (278025 and 724473); China Scholarship Council (CSC) (201506960026).

## Acknowledgments

The authors would like to thank Xueen Wang for the insightful discussions, and Huijuan Li for help with experiments. S. G. is a member of the Institut Universitaire de France.

## References and links

**1. **J. W. Goodman, *Speckle Phenomena in Optics: Theory and Applications* (Roberts and Company Publishers, 2007).

**2. **I. M. Vellekoop and A. P. Mosk, “Focusing coherent light through opaque strongly scattering media,” Opt. Lett. **32**(16), 2309–2311 (2007). [PubMed]

**3. **I. M. Vellekoop and C. M. Aegerter, “Scattered light fluorescence microscopy: imaging through turbid layers,” Opt. Lett. **35**(8), 1245–1247 (2010). [PubMed]

**4. **S. M. Popoff, G. Lerosey, R. Carminati, M. Fink, A. C. Boccara, and S. Gigan, “Measuring the transmission matrix in optics: an approach to the study and control of light propagation in disordered media,” Phys. Rev. Lett. **104**(10), 100601 (2010). [PubMed]

**5. **A. P. Mosk, A. Lagendijk, G. Lerosey, and M. Fink, “Controlling waves in space and time for imaging and focusing in complex media,” Nat. Photonics **6**, 283–292 (2012).

**6. **O. Katz, E. Small, and Y. Silberberg, “Looking around corners and through thin turbid layers in real time with scattered incoherent light,” Nat. Photonics **6**, 549–553 (2012).

**7. **T. Chaigne, J. Gateau, O. Katz, E. Bossy, and S. Gigan, “Light focusing and two-dimensional imaging through scattering media using the photoacoustic transmission matrix with an ultrasound array,” Opt. Lett. **39**(9), 2664–2667 (2014). [PubMed]

**8. **I. M. Vellekoop, “Feedback-based wavefront shaping,” Opt. Express **23**(9), 12189–12206 (2015). [PubMed]

**9. **P. Lai, L. Wang, J. W. Tay, and L. V. Wang, “Photoacoustically guided wavefront shaping for enhanced optical focusing in scattering media,” Nat. Photonics **9**(2), 126–132 (2015). [PubMed]

**10. **R. Horstmeyer, H. Ruan, and C. Yang, “Guidestar-assisted wavefront-shaping methods for focusing light into biological tissue,” Nat. Photonics **9**, 563–571 (2015). [PubMed]

**11. **S. Rotter and S. Gigan, “Light fields in complex media: Mesoscopic scattering meets wave control,” Rev. Mod. Phys. **89**, 015005 (2017).

**12. **J. Yoon, K. Lee, J. Park, and Y. Park, “Measuring optical transmission matrices by wavefront shaping,” Opt. Express **23**(8), 10158–10167 (2015). [PubMed]

**13. **J. Bertolotti, E. G. van Putten, C. Blum, A. Lagendijk, W. L. Vos, and A. P. Mosk, “Non-invasive imaging through opaque scattering layers,” Nature **491**(7423), 232–234 (2012). [PubMed]

**14. **S. Feng, C. Kane, P. A. Lee, and A. D. Stone, “Correlations and fluctuations of coherent wave transmission through disordered media,” Phys. Rev. Lett. **61**(7), 834–837 (1988). [PubMed]

**15. **I. Freund, M. Rosenbluh, and S. Feng, “Memory effects in propagation of optical waves through disordered media,” Phys. Rev. Lett. **61**(20), 2328–2331 (1988). [PubMed]

**16. **J. R. Fienup, “Phase retrieval algorithms: a comparison,” Appl. Opt. **21**(15), 2758–2769 (1982). [PubMed]

**17. **O. Katz, P. Heidmann, M. Fink, and S. Gigan, “Non-invasive single-shot imaging through scattering layers and around corners via speckle correlations,” Nat. Photonics **8**, 784–790 (2014).

**18. **T. Wu, O. Katz, X. Shao, and S. Gigan, “Single-shot diffraction-limited imaging through scattering layers via bispectrum analysis,” Opt. Lett. **41**(21), 5003–5006 (2016). [PubMed]

**19. **A. W. Lohmann, G. Weigelt, and B. Wirnitzer, “Speckle masking in astronomy: triple correlation theory and applications,” Appl. Opt. **22**(24), 4028 (1983). [PubMed]

**20. **E. Edrei and G. Scarcelli, “Memory-effect based deconvolution microscopy for super-resolution imaging through scattering media,” Sci. Rep. **6**, 33558 (2016). [PubMed]

**21. **R. G. Paxman, T. J. Schulz, and J. R. Fienup, “Joint estimation of object and aberrations by using phase diversity,” J. Opt. Soc. Am. A **9**, 1072–1085 (1992).

**22. **J. W. Goodman, *Introduction to Fourier Optics* (Roberts and Company Publishers, 2005).

**23. **C. R. Vogel, T. Chan, and R. Plemmons, “Fast algorithms for phase diversity-based blind deconvolution,” in *Adaptive Optical System Technologies*, Proc. SPIE **3353**, 994–1005 (1998).

**24. **J. Nocedal and S. J. Wright, *Numerical Optimization*, 2nd ed. (Springer, 2006).

**25. **O. Von der Lühe, “Speckle imaging of solar small scale structure I. Methods,” Astron. Astrophys. **268**, 374–390 (1993).

**26. **T. Wu, “codes.zip,” figshare (2017), https://figshare.com/s/ac47b433be00ef4ef893.

**27. **T. Wu, “data.zip,” figshare (2017), https://figshare.com/s/4969d19518a7680d07ec.

**28. **A. K. Singh, D. N. Naik, G. Pedrini, M. Takeda, and W. Osten, “Exploiting scattering media for exploring 3D objects,” Light Sci. Appl. **6**, e16219 (2017).