Abstract
Conventional (CP) and Fourier (FP) ptychography have emerged as versatile quantitative phase imaging techniques. While the main application cases for each technique are different, namely lens-less short wavelength imaging for CP and lens-based visible light imaging for FP, both methods share a common algorithmic ground. CP and FP have in part independently evolved to include experimentally robust forward models and inversion techniques. This separation has resulted in a plethora of algorithmic extensions, some of which have not crossed the boundary from one modality to the other. Here, we present an open source, cross-platform software, called PtyLab, enabling both CP and FP data analysis in a unified framework. With this framework, we aim to facilitate and accelerate cross-pollination between the two techniques. Moreover, its availability in Matlab, Python, and Julia lowers the barrier to entry in each field.
© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement
1. Introduction
Ptychography [1,2] has grown into a mature technique for x-ray, extreme ultraviolet (EUV), and electron microscopy. It has revolutionized synchrotron-based x-ray microscopy, where it improves upon previously existing scanning transmission x-ray microscopy (STXM) data analysis techniques [3–6]. Three major benefits of ptychography over STXM are: (1) decoupling of the illumination spot size from the achievable lateral resolution, (2) quantitative amplitude and phase contrast, and (3) access to wavefront diagnostics [7–11]. Similar benefits have subsequently been demonstrated for scanning transmission electron microscopes (STEMs) [12–14], where it recently produced micrographs at record-breaking resolution [15,16]. A parallel line of development is EUV laboratory-scale microscopy, where ptychography is a promising candidate for actinic inline metrology for lithography applications [17–19] and a tool for chemically-resolved microscopy [20–22]. In ptychography, a specimen is laterally scanned through a localized illumination, referred to as the probe. A detector downstream of the specimen records a sequence of diffraction patterns. These observations lack phase information, which prevents direct inversion. Ptychography solves this problem by recording data from laterally overlapping specimen regions of interest. This acquisition scheme opens up the possibility for phase retrieval and simultaneous deconvolution of illumination and specimen information. Beyond operation with x-ray and electron radiation, ptychography has been implemented with extreme ultraviolet, visible, near-infrared, and terahertz radiation [19,23–26].
Fourier ptychography [27] follows a similar operational principle as (conventional) ptychography, denoted FP and CP, respectively, throughout this paper. In FP, a specimen is illuminated from different directions, typically steered by means of an LED array, which serves as a controllable condenser. A sequence of low-resolution bright and dark field images is recorded in a lens-based microscope. Changing the illumination direction amounts to shifting the object spectrum with respect to the pupil of the optical system. If the illumination direction is changed in such a way that two recorded images share information in the Fourier domain, phase retrieval techniques can be applied to separately reconstruct the object spectrum and the pupil of the optical system. Thus FP has three attractive features: (1) The low-resolution data can be stitched together into a large synthetic numerical aperture (NA), resulting in both a large field of view and high resolution; in contrast to most wide-field systems, FP thus does not trade off resolution against field of view. (2) After conversion to real space, the recovered object spectrum gives quantitative amplitude and phase maps of the sample. (3) The reconstructed pupil function enables aberration diagnostics of the optical system at hand [28–30]. While FP has mostly found applications in the visible domain, recent implementations using infrared radiation and x-rays have been reported [31,32].
1.1 Contribution
In both CP and FP, the recorded data jointly sample real and reciprocal space [33–37]. In CP, the probe serves as a real space window that selects local spatial frequency content. In FP, the pupil selects a low-resolution real space image from a localized Fourier space bandpass. In fact, the forward models of CP and FP are mathematically equivalent and the measured data cubes may be regarded as rotations of one another in phase space [36]. Although this equivalence is well-known, CP and FP have evolved into two separate communities with different algorithmic approaches and self-calibration techniques. Here, we report on a numerical data analysis toolbox, named PtyLab (code available online [38]), which places the equivalence of CP and FP at the center of its logical structure, resulting in three main contributions of this work:
- 1. Cross-modal: PtyLab not only allows the analysis of CP and FP data, but also the conversion of the same data set between the two domains. This flexible conversion between CP and FP leads to physical insights as well as algorithmic cross-pollination between the two domains. To our knowledge, PtyLab is the first ptychography code designed to be cross-modal, unifying the data analysis frameworks of CP and FP.
- 2. Multi-lingual: PtyLab is the first cross-platform ptychography code available in three programming languages, namely Matlab, Python, and Julia. Thus, it enables researchers with different programming backgrounds to communicate and exchange ideas based on a unified terminology and code structure.
- 3. Open access: PtyLab is released together with various experimental data sets and accompanying hands-on tutorials, where the user is trained in practical data analysis. We hope that this contributes to standardized data analysis in both CP and FP.
In addition, PtyLab features a variety of algorithmic extensions as compared to currently available ptychography software packages. Some of these were previously reported by us and are now provided open source. This includes axial (zPIE) [39] as well as angle (aPIE) [40] correction engines, code to analyze ptychographic optical coherence tomography (POCT) data, efficient wave propagation algorithms both valid for high NA and polychromatic radiation, and detector subsampling (sPIE) [25,39–42]. Other novelties are reported here for the first time such as external reference wave ptychography. In addition, previously reported algorithms developed by other groups are included such as the extended ptychographic iterative engine (ePIE) [43], multislice (e3PIE) [44], mixed states [45], information multiplexing (PIM) [46], momentum acceleration (mPIE) [47], Tikhonov and total variation regularization [20,48,49], correlation-based lateral position correction [50], and orthogonal probe relaxation (OPR) [51]. In writing this manuscript, we pursued the goal of providing a concise overview of the various engines available to date in ptychography.
1.2 Related work
Most CP packages reported to date have focused on high performance computing, which is key for day-to-day user operation at large scale facilities, where beamtime is scarce and experimental feedback is needed quickly [52–59]. Another line of research has investigated the capabilities opened up by modern automatic differentiation (AD) and machine learning (ML) toolboxes [60,61]. AD frameworks offer flexibility as they simply require the user to specify a desired forward model, typically in the form of a suitable loss function and possible regularizers. This approach is convenient for the user as it dispenses with the need to derive challenging gradient expressions, the latter of which oftentimes involve complex-valued (Wirtinger) derivatives [62,63]. It is thus, for instance, straightforward to switch from one regularizer to another without analytically deriving and programming the underlying gradient expressions into the software. ML approaches, in particular those based on neural networks, have been used to significantly speed up the reconstruction process, lower the sampling requirements in the raw data, and embed denoising priors [64,65]. However, neural network approaches need to be trained on data sets that have already been solved for the corresponding real space images. In Ref. [64] the neural network was trained based on the solution of an iterative solver. Moreover, training a neural network capable of solving ptychography problems is a memory-intensive, large-scale computational challenge that cannot be performed on modest hardware. We thus believe the need for memory-efficient but possibly slower iterative algorithms remains, despite the exciting possibilities opened up by neural networks [66]. From the above referenced work, we briefly describe some of the features of two prominent code projects, namely PtychoShelves [57] and PtyPy [54], to illustrate some of the different design choices made in PtyLab.
PtychoShelves [57] is a Matlab-based software package for ptychography, designed with large-scale synchrotron facilities in mind. Shelves refer to the modular coding framework representing bookshelves, from which desired books (e.g., detector module, reconstruction engine etc.) can be taken out and inserted into the processing pipeline. To provide data handling across synchrotron facilities worldwide, PtychoShelves supports commonly used X-ray detectors as well as instrument control software. Reconstructions are highly-optimized for Matlab-based GPU acceleration as well as CPU processing through reconstruction engines written in binary C++ code and parallelized through the OpenMP multiprocessing interface. The C++ code supports Difference Map [8] and Maximum Likelihood [48] engines, together with other features such as mixed state [45] and multi-slice ptychography [44]. A wider range of reconstruction features are available through Matlab-based GPU engines, including an iterative least-squares solver for maximum-likelihood ptychography [55], orthogonal-probe-relaxation [51], near-field ptychography [67], and position correction [68].
PtyPy [54] is an open-source ptychography framework written in Python. It follows the Python coding style and is therefore modularized and object-oriented. The physical models are abstracted, which results in readable and concise code interfaces at a user level. The key element is the so-called POD class (probe, object, diffraction), which holds the access rule for a single position, object mode, and probe mode. For parallelization of the reconstructions, PtyPy uses a message passing interface (MPI), which allows for flexible usage on standard office PCs, workstations, and clusters. MPI favors non-sequential reconstruction algorithms that can be parallelized (e.g. Difference Map [8] and Maximum Likelihood [48]). So far, a broad range of forward models are implemented (e.g. mixed state ptychography [45], near-field ptychography [67], and lateral position correction [68]). The PtyPy framework is actively developed and novel features (e.g. GPU acceleration) are constantly added.
Both PtychoShelves and PtyPy are powerful ptychography data analysis platforms, but their design for high-performance computing poses an entry barrier for simple, one-off reconstructions in an academic-lab setting. In such cases, rapid code prototyping and ease-of-use can be more desirable than highly-optimized data handling and reconstructions.
Unlike CP, FP has not seen the same widespread use within research institutions that resulted in well-developed and maintained coding platforms. The existing open-source FP codes [69–74] exist mainly to supplement publications, providing only minimal working examples with limited functionality. Recently an attempt has been made to provide a Matlab-based FP reconstruction platform [75], which among other features provides raw data denoising, GPU processing, and LED misalignment correction.
Our goal here is to bridge the gap between CP and FP, thereby allowing for a cross-pollination of the two domains and a unified algorithmic framework. Compared to the software packages highlighted above, PtyLab is less focused on high-performance and distributed computing, but puts emphasis on providing a ptychography ecosystem for researchers interested in rapid prototyping and exchanging algorithms across modalities and programming languages.
1.3 Outline
In Section 2 we revisit the idea of reciprocity, which formalizes the equivalence between CP and FP, the central idea for the unified software design in PtyLab. Section 3 describes language-specific implementation details in Matlab, Python, and Julia. Section 4 serves as a comprehensive overview of the available forward models and the corresponding optimization algorithms. Practical features for scan grid optimization are described in Section 6. PtyLab is released with various data sets and hands-on tutorials, which are described in Section 7.
2. Implications of reciprocity for ptychography
One may think of the data sets recorded in ptychography in analogy to a musical score, where frequency information is prescribed at particular signatures in time. Once such a time-frequency, or phase-space, representation is given in form of a musical score, we can convert this information into either the time or the frequency domain. For example, we can digitally record a concert and Fourier transform the resulting signal. These processing steps would involve the temporal waveform and its frequency spectrum, respectively. Likewise, ptychography jointly samples real and reciprocal space representations of a signal, where for simplicity we ignore the additional complication of phase retrieval for the moment. The goal of ptychography is to convert partial phase-space information of a signal into a pure space or a pure spatial frequency representation. Physically, the phase-space description of ptychography [33] is intimately connected to the principle of reciprocity [77], which states that identical data sets are observed when the illumination and detection directions in an optical system are interchanged.
We would like to distinguish two types of reciprocity in ptychography. Type-I reciprocity refers to the ability to design separate CP and FP optical systems, both of which produce 4D data cubes which are essentially related by a phase space rotation [36]. In addition, we define type-II reciprocity, which refers to the ability to algorithmically convert a 4D data cube from one domain to the other. Thus type-I reciprocity is essentially a statement about the ability to design different CP and FP hardware embodiments producing the same 4D experimental data cube. Type-II reciprocity is a matter of data processing: once a 4D data cube is measured in either a CP or a FP representation, it can be converted into the other respective domain and subsequently reconstructed.
Figure 1 illustrates this idea of reciprocity in the context of ptychography. In CP (Fig. 1(a)), an object is laterally translated against a typically focused beam. This probe localizes the origin of the measured signal in real-space (scan coordinates $\boldsymbol {s}$). A sequence of diffraction patterns is captured on a pixelated detector, which is assumed here to be located in the far field (spatial frequency $\boldsymbol {u}$). Hence the data cube in CP consists of a sequence of angular scattering maps, each corresponding to a particular real-space specimen region localized to the spatial extent of the incident probe (Fig. 1(b)). In FP, an object is held at a fixed location under variation of the illumination angle (Fig. 1(d)). Thus the angular spectrum emanating from the sample is shifted over the finite-sized lens aperture. The pupil serves to bandpass filter the object’s spatial frequency spectrum, resulting in a data cube consisting of dark and bright field images (Fig. 1(c)). Thus the data cube in FP consists of a sequence of real-space images, each corresponding to a particular reciprocal space portion of the object spectrum localized to the passband admitted by the pupil of the imaging system. In summary, both flavors of ptychography sample real and reciprocal space.
We illustrate the aforementioned types of reciprocity in two ways: First, consider replacing the detector in the FP setup (Fig. 1(d)) with a pixelated light source. Turning on a single point source at a time and placing the detector in the far field of the sample, we can record a CP data set by scanning the point source location, as pointed out in a recent review article [2]. Of course, this hardware conversion ability faces practical limits imposed by the lens, which may cause a space-variant probe, but this complication is ignored here. Thus via hardware modification we can convert a CP experimental setup into an FP system, which is an instance of type-I reciprocity. Type-II reciprocity concerns the captured data itself and does not require any hardware modifications. It is possible to convert the measured data cube from one modality to the other. Suppose in Fig. 1(a) we scan the sample (for conceptual simplicity) on a regular raster grid and record a sequence of diffraction patterns. Then the data recorded by each pixel of the detector can be regarded as a traditional (single-pixel) scanning microscope data set. The data on each individual pixel may directly be reshaped into a two-dimensional real-space image, simply by permuting its dimensions in correspondence with the scan trajectory. Practically, aperiodic translation trajectories and high NA effects require interpolation techniques. However, at low NA and using raster scan grids the data reshaping operation can be implemented with a single line of code (namely, a permute operation), converting for instance a CP data set into a sequence of low-resolution bright and dark field images, the latter of which constitutes the raw data for FP. While we described type-II reciprocity phenomenologically in this section, a mathematical proof of this conversion ability is provided in the appendix. The mathematical details also elucidate the correspondence between reconstructed quantities in CP and FP.
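For a low-NA raster-scan acquisition, the single-line conversion mentioned above amounts to an axis permutation of the 4D data cube. A minimal sketch, assuming hypothetical array sizes and an axis ordering of (scan_y, scan_x, det_y, det_x):

```python
import numpy as np

# Hypothetical CP data cube: S x S raster-scan positions, U x U detector pixels,
# stored as (scan_y, scan_x, det_y, det_x).
S, U = 8, 16
cp_data = np.random.rand(S, S, U, U)

# Type-II reciprocity at low NA: each detector pixel, read out across all scan
# positions, forms one low-resolution real-space image. A single permute
# operation therefore converts the CP data cube into an FP-style stack of
# U x U bright and dark field images with S x S pixels each.
fp_data = np.transpose(cp_data, (2, 3, 0, 1))
```

Aperiodic scan trajectories and high-NA geometries require interpolation instead of this plain permutation, as noted in the text.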
We provide online tutorials that illustrate the conversion between CP and FP [38].
The ability to convert CP and FP data has a bearing on the computational complexity of inversion algorithms underlying ptychography. Suppose we are given a CP data cube consisting of diffraction patterns with $U^2$ pixels at $S^2$ scan points. A single iteration of a parallelized ptychography solver (for example difference map [8]) requires us to numerically propagate exit waves from all scan positions to the detector plane and back. The Fourier transform operations involved will have a computational complexity of $\mathcal {O}\left [S^{2}\cdot U^{2}\cdot \log \left (U\right )\right ]$ if we work in the CP domain, while it scales with $\mathcal {O}\left [U^{2}\cdot S^{2}\cdot \log \left (S\right )\right ]$ if we convert the data into the FP domain. The difference in the log-terms can result in a practical speed up, provided that the number of detector pixels per dimension $U$ and the number of scan positions per dimension $S$ differ substantially.
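The per-iteration cost ratio between the two domains reduces to the ratio of the log terms. A quick numerical check with illustrative numbers (few scan points, many detector pixels):

```python
import math

# Per-iteration FFT cost comparison for the scaling stated in the text:
# O(S^2 * U^2 * log U) in the CP domain versus O(U^2 * S^2 * log S) in FP.
S, U = 32, 1024                     # illustrative: S scan points, U detector pixels per dimension
cp_cost = S**2 * U**2 * math.log2(U)
fp_cost = U**2 * S**2 * math.log2(S)
speedup = cp_cost / fp_cost         # = log(U) / log(S), here 10 / 5 = 2
```

For these numbers, working in the FP domain halves the FFT workload per iteration; the advantage grows with the disparity between $U$ and $S$.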
In summary, utilizing type-II reciprocity is the central motivation for the design of PtyLab: CP and FP data can be converted into each other. A unified data analysis framework thus allows the user to migrate between the two modalities. One benefit of this data conversion ability is the applicability of diverse inversion algorithms and self-calibration routines in the domain in which they are most conveniently applied. Another benefit of reciprocity is the trade-off in computational complexity.
3. Code structure
In this section we describe the structural workflow in PtyLab. Our overall goal is to provide a code that enables flow of algorithmic ideas and rapid prototyping beyond the boundaries of modality (CP/FP) and programming language (Matlab/Python/Julia). Thus collaborators with different programming language preferences or from different communities (e.g. synchrotron-based CP versus visible light FP) can easily exchange code, without being perfectly literate in the other programming language. This approach brings the benefit of a unified structure and naming convention, at times at the expense of certain language-specific programming conventions. The following subsection describes the common structure independent of programming language. Subsequently, we address differences in the Matlab, Python, and Julia implementations.
3.1 Platform- and modality-independent workflow
A high-level overview of PtyLab’s workflow is illustrated in Fig. 2. Assuming CP or FP data is stored in a local folder on the user’s computer, the first step is preprocessing the data (see Fig. 2). Preprocessing converts the raw data into a PtyLab class object. The user specifies physical input (illumination wavelength), hardware properties (detector pixel pitch, binning), and geometric parameters (sample-detector distance [CP], lens magnification [FP], sample scan trajectory [CP], illumination angles [FP]). In addition, the user specifies a forward model that describes the propagation from end to end (CP: probe to detector, FP: pupil to detector). The preprocessing pipeline then writes a single or multiple PtyLab class objects into a hdf5 file (see Fig. 3) [78].
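A preprocessing step of this kind can be sketched as follows. The dataset keys used here (ptychogram, encoder, wavelength, dxd, zo) are illustrative examples of the physical, hardware, and geometric parameters listed above; the exact key names expected by a given PtyLab version may differ.

```python
import numpy as np
import h5py

# Illustrative CP preprocessing: bundle the raw diffraction stack and the
# metadata described in the text into a single hdf5 file (key names assumed).
frames = np.random.rand(50, 256, 256)       # (numFrames, y, x) diffraction data stand-in
positions = np.random.rand(50, 2) * 1e-3    # sample scan trajectory [m]

with h5py.File("preprocessed_cp.hdf5", "w") as f:
    f.create_dataset("ptychogram", data=frames)   # raw diffraction data
    f.create_dataset("encoder", data=positions)   # geometric parameters (CP)
    f["wavelength"] = 632.8e-9                    # physical input [m]
    f["dxd"] = 13.5e-6                            # detector pixel pitch [m]
    f["zo"] = 50e-3                               # sample-detector distance [m]
```

The reconstruction script then only needs this one file, which keeps raw data and experimental metadata together.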
Second, the reconstruction script loads the preprocessed data. An initialization function generates uniform or randomized starting estimates for the probe (CP) or pupil (FP) and the object (CP) or object spectrum (FP). Meshgrids are calculated in each of the probe/pupil, object/object spectrum, and detector planes. These meshgrids depend on the specified physical, hardware, and geometrical parameters, as well as on a forward model (propagator) that describes the mapping between the probe (CP) or pupil (FP) plane and the detector plane. A variety of propagation models can be specified, including angular spectrum (AS), scaled angular spectrum (SAS), Fresnel (Fresnel), Fraunhofer (Fraunhofer), and tilted Fresnel diffraction. The latter is relevant for non-coplanar reflection geometries and is typically performed only once on the raw diffraction data stack, provided no angle correction is performed (see subsection 5.7 for further details). Fraunhofer and Fresnel diffraction differ by the incorporation of a quadratic phase inside the propagation model. This quadratic phase may be absorbed into the probe/pupil function and can be compensated for post-reconstruction when a quantitative analysis of the reconstructed wavefront (CP) or pupil (FP) is of interest to the user.
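The relation between the Fraunhofer and Fresnel propagators can be made concrete in a few lines. This is a paraxial, illustrative sketch (function names and sampling are our own, not PtyLab's API); it shows the quadratic input phase that distinguishes the two models and that may be absorbed into the probe/pupil:

```python
import numpy as np

def fraunhofer(field):
    """Far-field propagation as a centered, norm-preserving 2D FFT."""
    return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field), norm="ortho"))

def fresnel(field, wavelength, z, dx):
    """Fraunhofer propagation with an additional quadratic phase in the
    input plane; this phase may equivalently be absorbed into the
    probe/pupil estimate, as noted in the text."""
    n = field.shape[0]
    x = (np.arange(n) - n // 2) * dx            # input-plane coordinates [m]
    X, Y = np.meshgrid(x, x)
    quad_phase = np.exp(1j * np.pi / (wavelength * z) * (X**2 + Y**2))
    return fraunhofer(field * quad_phase)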
3.2 Matlab structure
The Matlab code structure is shown in Fig. 4(a). Here an object of class PtyLab is generated. Its first-order properties contain the physics as well as the geometry of the experiment. In addition, there are second-order properties, namely algorithmic parameters (obj.params), monitoring control (obj.monitor), propagators as part of the forward model (obj.propagator), and file export parameters (obj.export). Certain notational conventions are noteworthy: the diffraction or image data is contained in obj.ptychogram, a term borrowed from time-frequency analysis, where the raw data is often referred to as a spectrogram [80]. The dimensions of obj.ptychogram are $({\tt y, x, numFrames})$, which differs from the Python convention (see subsection 3.3). The order of the first two dimensions stems from Matlab’s most convenient use when adhering to row-column convention. Similarly, obj.positions follows row-column convention.
3.3 Python structure
The Python structure is similar in idea to the Matlab structure, but is designed with a stronger emphasis on modularity. As shown in Fig. 4(b), the Python implementation contains five classes: ExperimentalData, Reconstruction, Monitor, Params, and Engines. Most but not all of these classes reflect second-order properties in the Matlab structure. The ExperimentalData class imports data from a preprocessed .hdf5 file, checks if all required parameters for a ptychographic reconstruction are included, and saves them into an instance that is immutable.
The Reconstruction class takes the ExperimentalData instance as the input, and creates a mutable instance containing attributes that are optimized during a reconstruction process, e.g. the probe/pupil, and the object, as well as attributes that are related to the optimizable parameters, e.g. the error, the coordinates and meshgrids. Note that in the Python implementation, the probe/pupil and the object are set as 6D arrays with the fixed axes [nlambda, nosm, npsm, nslice, row, col], which are the numbers of wavelengths, object state mixtures, probe state mixtures, slices (for multislice ptychography), and rows as well as columns.
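The fixed 6D layout looks as follows, with illustrative sizes (one wavelength, one object state, four mixed probe states, a single slice):

```python
import numpy as np

# Fixed 6D axis layout of the probe/pupil and object arrays in the Python
# implementation: [nlambda, nosm, npsm, nslice, row, col]. Sizes are examples.
nlambda, nosm, npsm, nslice, rows, cols = 1, 1, 4, 1, 256, 256
probe = np.zeros((nlambda, nosm, npsm, nslice, rows, cols), dtype=complex)
```

Keeping all axes present even when they are singleton lets every engine address the arrays uniformly, regardless of whether mixed states or multislice models are in use.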
The Monitor class is used to display a reconstruction process. Equivalent to its Matlab counterpart, one or two figures are created depending on the verbosity level set by users. A default figure shows the updated object, probe, and reconstruction error. An optional figure shows the comparison of the measured and estimated diffraction patterns. The update frequency of the plots can also be controlled by users (Monitor.figureUpdateFrequency).
The Params class holds parameters that determine how a reconstruction is performed, for instance whether a reconstruction is carried out on a CPU or a GPU, the propagator type such as Fraunhofer, Fresnel, (scaled) angular spectrum (ASP, scaledASP), etc., and whether the order of position iterations is random or sequential. Switches and parameters of various regularization types are also included in the Params instance, for example controlling how frequently orthogonalization is applied in the context of a mixed-states reconstruction.
The Engines class consists of a BaseEngine parent class and child engine classes, for instance ePIE [43], mPIE [47], zPIE [39], aPIE [40], and qNewton [81]. The four instances of ExperimentalData, Reconstruction, Params, and Monitor are taken as inputs by a chosen engine, modified/updated during the reconstruction, and can easily be passed on to a different engine. Each engine stores its own attributes such as the number of iterations (numIteration), and the update step sizes for the probe/pupil (betaProbe) and object (betaObject).
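The engine pattern can be illustrated with stand-in classes (these are deliberately not the actual PtyLab API): all engines derive from a common base class, share the same four instances, and can be swapped mid-reconstruction.

```python
# Stand-in illustration of the Engines pattern (hypothetical classes,
# not the real PtyLab implementation).
class BaseEngine:
    numIterations = 1                     # engine-specific attribute
    def __init__(self, experimentalData, reconstruction, params, monitor):
        self.experimentalData = experimentalData
        self.reconstruction = reconstruction  # shared, mutable state
        self.params = params
        self.monitor = monitor
    def reconstruct(self):
        for _ in range(self.numIterations):
            self.step()

class EPIE(BaseEngine):
    numIterations = 5
    betaProbe = betaObject = 0.25         # update step sizes
    def step(self):
        self.reconstruction["iteration"] += 1   # placeholder for the actual update rule

class MPIE(EPIE):
    numIterations = 3                     # a momentum variant would override step()

recon = {"iteration": 0}
EPIE(None, recon, None, None).reconstruct()   # run one engine...
MPIE(None, recon, None, None).reconstruct()   # ...then hand the state to another
```

Because the reconstruction state lives outside the engines, chaining engines (e.g. a few ePIE warm-up iterations followed by mPIE) requires no data copying.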
3.4 Julia structure
PtyLab.jl is the most recent translation of PtyLab, to Julia. Because Julia differs from Matlab and Python, small differences exist in the implementation, but most of the common principles still hold. The feature set is smaller than in the other two packages, since the initial focus was on performance. The basis is formed by four main types: ExperimentalData, Reconstruction, Params, and Engines. Engines is an abstract type that is subtyped (indicated by <:) by specific algorithms (such as ePIE <: Engines) as composite types. Via this mechanism, generic functions can be shared by all Engines solvers. At the same time, Julia's multiple dispatch allows different functionality to be specified depending on whether it belongs to an ePIE or a zPIE solver. Params is a composite type storing second-order properties. Further, ExperimentalDataCPM <: ExperimentalData and, similarly, ReconstructionCPM <: Reconstruction exist to store the experimental data and the reconstruction data. During the iterations of the algorithms, the fields of ReconstructionCPM are allowed to change. Julia's language features allow for a functional style of programming, implying that memory buffers are not explicitly exposed to the user but are instead stored implicitly via closures.
4. Inverse modeling
The inverse modeling workflow in PtyLab consists of several modular processing steps, which are shown in Fig. 5. All optimization algorithms in PtyLab iterate between the object (CP) / object spectrum (FP) plane (orange) and the detector plane (green), where the two planes are linked via a suitable propagation model (yellow). We subdivide this section into several parts, describing the individual modules that the user can stack up to build customized data analysis pipelines.
4.1 Forward model
In CP and FP the goal is to retrieve a wide-field high-resolution reconstruction of a sample of interest. In addition, the probe (CP) and pupil (FP) of the imaging system are recovered. A forward model links the estimated detector intensity $I$ to the object $O$ and probe $P$ (CP) or object spectrum $\tilde {O}$ and pupil $\tilde {P}$ (FP),
Here $\mathcal {D}$ describes wave propagation between the sample (CP)/pupil (FP) and the detector plane, $\boldsymbol {r}$ refers to spatial coordinates, and $\boldsymbol {q}$ refers to reciprocal space coordinates. The index $j=1,\ldots,J$ denotes scan position. For simplicity, we drop the coordinate dependence and use the CP notation throughout; the conversion to FP of all results discussed below is straightforward. The symbol $O$ refers to a particular object box of the same size as the probe FOV (compare red region in Fig. 2). The entire object field of view is denoted by $O_{\textrm {FOV}}$ (compare blue region in Fig. 2).
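In CP notation, the forward model for a single scan position can be sketched as follows, taking $\mathcal{D}$ to be plain Fraunhofer propagation for simplicity (function and variable names are illustrative):

```python
import numpy as np

def forward_intensity(probe, obj_fov, top, left):
    """Minimal CP forward-model sketch for one scan position j:
    I_j(q) = |D[P(r) * O(r - r_j)]|^2, with D a far-field FFT here.
    (top, left) locate the object box of probe-FOV size within O_FOV."""
    n = probe.shape[0]
    obj_patch = obj_fov[top:top + n, left:left + n]   # O(r - r_j), same size as probe
    exit_wave = probe * obj_patch                     # psi_j(r) = P(r) O(r - r_j)
    far_field = np.fft.fft2(exit_wave, norm="ortho")  # D: Fraunhofer propagation
    return np.abs(far_field) ** 2                     # estimated detector intensity I
```

Swapping the FFT for an angular spectrum or Fresnel propagator, or replacing probe and object by pupil and object spectrum, gives the corresponding FP variant.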
In the presence of noise in the observed signal, for instance caused by photoelectric conversion and read-out, we cannot expect to find a combination of sample and probe/pupil that exactly matches the recorded data. In what follows we therefore assume the measured data $m$ to arise as a probabilistic response to the true intensity $I$ incident on the detector and discuss several maximum likelihood estimation (MLE) models [48,55,81–84]. These MLE models aim at estimating the most likely combination of object and probe, a viewpoint extended by the addition of maximum a posteriori (MAP) estimation, which originates from a Bayesian viewpoint and enables flexible embedding of regularization into the forward model [48,84,85]. Before going into the details of inverse models, we briefly review the continuous and discrete viewpoints on optimization, which are both encountered in the ptychography optimization literature [45,47,48,55,81–84,86–88].
4.2 Inverse modeling
In this section we review various forward models implemented in PtyLab. A summary of these forward models is given in Fig. 6. We first describe general techniques to tackle the inverse problem underlying ptychography. Subsequently, we detail the individual solvers that allow the user to build and invert modular forward models.
4.2.1 Continuous viewpoint
In the continuous viewpoint, we aim to minimize a cost functional
If Eq. (5) is not amenable to a direct solution, we can iteratively solve it by seeking the steady state solution of the diffusion equation [91,92]
4.2.2 Discrete viewpoint
The discrete viewpoint is used when considering inverse problems over sampled functions. In this case, we oftentimes wish to minimize the sum of squares cost function
A non-iterative solution is formally obtained by setting the gradient in Eq. (9) to zero and solving for $f,$
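For a generic linear forward operator this non-iterative solution is the familiar normal-equations form. A small self-contained illustration (the matrix and names are our own, not tied to the specific Eq. (9)):

```python
import numpy as np

# Discrete least-squares illustration: minimize ||A f - y||^2 over a sampled
# function f. Setting the gradient to zero gives the normal equations
# A^T A f = A^T y, solved here directly.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))    # overdetermined forward operator (example)
f_true = rng.standard_normal(5)     # ground-truth sampled function
y = A @ f_true                      # noiseless synthetic measurements
f_hat = np.linalg.solve(A.T @ A, A.T @ y)
```

With noiseless data and a full-rank operator this recovers the ground truth exactly; in ptychography the forward operator is nonlinear in object and probe, which is why iterative gradient schemes are used instead.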
4.3 Maximum likelihood (MLE) estimation
We now discuss models for the detector noise commonly used in ptychography. Two particularly prominent models that have been addressed [48,55,81–83] are the Poisson likelihood
Computing the Wirtinger derivatives of the negative log-likelihood $\mathcal {L}=-\sum _{\boldsymbol {q}}\log \left (p\left [m\left |I\right.\right ]\right )$ of Eqs. (12) and (13) results in [48,55,81,83]
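The resulting detector-plane gradients take a compact form: with $I=|\tilde{\psi}|^2$ and measured data $m$, the Poisson model yields a gradient proportional to $(1-m/I)\,\tilde{\psi}$ and the Anscombe (amplitude) model $(1-\sqrt{m/I})\,\tilde{\psi}$, cf. [48,55]. A sketch, with a small regularizer added by us to avoid division by zero:

```python
import numpy as np

EPS = 1e-12  # numerical regularizer (our addition, not part of the model)

def poisson_gradient(psi, m):
    """Gradient of the negative Poisson log-likelihood w.r.t. psi (sketch)."""
    I = np.abs(psi) ** 2
    return (1.0 - m / (I + EPS)) * psi

def anscombe_gradient(psi, m):
    """Gradient of the negative Anscombe (amplitude) log-likelihood (sketch)."""
    I = np.abs(psi) ** 2
    return (1.0 - np.sqrt(m / (I + EPS))) * psi
```

Note that a full gradient step (step size one) on the amplitude loss, $\tilde{\psi} - (1-\sqrt{m/I})\tilde{\psi} = \sqrt{m/I}\,\tilde{\psi}$, reduces to the familiar modulus-replacement constraint.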
It appears to us that the Anscombe forward model is used by the vast majority in the ptychography literature. Although the Poisson distribution works in practice, we have observed that the Anscombe model is more robust in practical data analysis. One has to keep in mind that the Poisson model assumes that the photoelectric counting distribution’s mean equals its variance and is only valid for shot-noise limited data, a somewhat restrictive assumption considering the manifold fluctuations that are present in typical experiments, including partial spatial and temporal coherence effects as well as detector read-out. We note that other models for the statistics of photoelectric counting distributions have been proposed in the literature, albeit to our knowledge they have not yet been used for ptychography. Noteworthy is the negative binomial distribution
4.4 Maximum a posteriori (MAP) estimation
The MLE approach in the previous subsection can be extended by MAP estimation, which introduces prior knowledge into the reconstruction process. In MAP the detector intensity is regarded as a random variable with underlying probability density $p\left [I\right ]$. Since the detector intensity $I=\tilde {\psi }^{*}\tilde {\psi }$ is a function of the real space object and probe, namely $\psi \left (\boldsymbol {r}\right )=P\left (\boldsymbol {r}\right )\cdot O\left (\boldsymbol {r}-\boldsymbol {r_{j}}\right )$ and $\tilde {\psi }\left (\boldsymbol {q}\right )=\mathcal {F}_{\boldsymbol {r}\rightarrow \boldsymbol {q}}\left [\psi \left (\boldsymbol {r}\right )\right ]$, MAP opens up a convenient way to formulate and impose constraints in the inverse problem underlying ptychography. The optimization problem is then
4.4.1 Proximal detector updates
The detector update can be restricted to small step sizes by introducing the proximal prior
These updates are referred to as proximal. A large value of the tuning parameter $\lambda$ forces the updated wave $\tilde {\psi }_{n+1}$ to remain in the proximity of the previous estimate $\tilde {\psi }_{n}$. Intuitively, the updates along the gradient directions in Eqs. (14) and (15) force the magnitude of the updated wave to match the measured data, in intensity for the Poisson gradient and in modulus for the Anscombe gradient. However, due to noise, sequential update schemes can only be as certain as the noise in a single diffraction pattern permits. Proximal gradients incorporate the memory of previous updates and do not naively accept the update suggested by the data; the gradient direction suggested by the data is followed only when the deviation from the current estimate is small. We conjecture that this incorporates dose fractionation effects into ptychography, resulting in a superior signal-to-noise ratio in the reconstruction. This is supported by previous reports of improved reconstruction quality using proximal gradients for Gerchberg-Saxton type phase retrieval [104] and ptychography [84,87]. The cost functions above (Eqs. (22) and (23)) result in the update steps
We note that the proximal Poisson update is different from the result in [104], since here the prior is Gaussian in modulus, while in the related work the prior is Gaussian in intensity. The approach reported here avoids the need to solve a cubic polynomial to compute the proximal Poisson update.
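To make the structure of the proximal update concrete, the following numpy sketch implements a closed-form minimizer of the Gaussian-in-modulus proximal cost $\left(\left|\tilde{\psi}\right|-\sqrt{I}\right)^{2}+\lambda\left|\tilde{\psi}-\tilde{\psi}_{n}\right|^{2}$. Variable names and values are illustrative and do not reflect the PtyLab API:

```python
import numpy as np

rng = np.random.default_rng(2)
psi_n = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))  # previous estimate
I = rng.uniform(0.5, 2.0, (4, 4))  # measured intensity (illustrative values)

def proximal_anscombe_update(psi_n, I, lam, eps=1e-12):
    """Minimizer of (|psi| - sqrt(I))^2 + lam * |psi - psi_n|^2:
    the phase of psi_n is kept and its modulus is blended with the data."""
    modulus = (np.sqrt(I) + lam * np.abs(psi_n)) / (1 + lam)
    return modulus * psi_n / (np.abs(psi_n) + eps)

# lam -> 0 recovers the plain modulus-replacement (projection) update
psi_proj = proximal_anscombe_update(psi_n, I, 0.0)
assert np.allclose(np.abs(psi_proj), np.sqrt(I))

# a large lam keeps the update in the proximity of the previous estimate
psi_prox = proximal_anscombe_update(psi_n, I, 100.0)
assert np.abs(psi_prox - psi_n).max() < np.abs(psi_proj - psi_n).max()
```

The two asserts illustrate the limiting behavior discussed above: a vanishing $\lambda$ yields the naive data projection, while a large $\lambda$ damps the step towards the previous iterate.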
4.4.2 Proximal probe and object updates via (e)PIE
While in the previous section a proximal update was introduced for the detector update step, we may impose a similar type of regularization on the probe and object updates, which has been shown to result in the ptychographic iterative engine (PIE) [45]. This derivation is reviewed here from the discrete viewpoint outlined above. Considering the cost function
The particular choice
Later the extended ptychographic iterative engine (ePIE) was proposed [43], which uses the regularization matrix [45]
An intuition of the difference between PIE and ePIE can be obtained in the limit of $\epsilon \rightarrow 0$ and $\beta =1$, for which we get
$\Gamma _{\textrm {PIE}}$ is small when $\left |P\right |$ approaches $\max \left (\left |P\right |\right )$ or $0$. Thus PIE allows object updates at locations where the probe amplitude is either small or large. In contrast, $\Gamma _{\textrm {ePIE}}$ is small only when $\left |P\right |$ approaches $\max \left (\left |P\right |\right )$. Thus ePIE allows object updates only at locations where the probe amplitude is large. The derivation of the probe update is similar, resulting in a joint optimization of $P$ and $O$. The robustness of ePIE is attributed to the penalized step size at low probe intensities. However, in FP the large dynamic range of the object spectrum can cause problems in conjunction with ePIE: the pupil would be updated only in the center of k-space, where the object spectrum exhibits values close to its maximum amplitude. High illumination angles produce dark-field images, which have a reduced signal-to-noise ratio compared to bright-field images from lower illumination angles. In CP the illuminating beam is always aligned with the detector, resulting in images with similar signal-to-noise ratios across scan positions. For this reason, we use the PIE-type update by default in PtyLab for FP and the ePIE-type update for CP data analysis. Other choices of regularization can be embedded into the reconstruction routine by changing the weight matrix $\Gamma$. We note in passing that the PIE-type update rule has been identified as a quasi-Newton algorithm [81].
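The contrast between the two weight choices can be illustrated numerically. The sketch below uses the textbook PIE and ePIE object-update formulas with synthetic arrays; names and the value of $\epsilon$ are illustrative, not PtyLab defaults:

```python
import numpy as np

rng = np.random.default_rng(3)
shape = (16, 16)
P = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)      # probe estimate
O = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)      # object patch estimate
d_psi = rng.standard_normal(shape) + 1j * rng.standard_normal(shape)  # exit-wave correction

beta, eps = 1.0, 1e-3
P_max = np.abs(P).max()

# PIE-type update: the |P|/max|P| weight combined with division by |P|^2 + eps
# gives a nearly uniform step size across the probe window
O_pie = O + beta * (np.abs(P) / P_max) * P.conj() / (np.abs(P) ** 2 + eps) * d_psi

# ePIE-type update: the step size scales with |P|, suppressing object updates
# where the probe intensity is low
O_epie = O + beta * P.conj() / P_max ** 2 * d_psi

# at the weakest-probe pixel, the ePIE step is smaller than the PIE step
idx = np.unravel_index(np.argmin(np.abs(P)), shape)
assert np.abs(O_epie - O)[idx] < np.abs(O_pie - O)[idx]
```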
4.4.3 Tikhonov regularization
A popular idea in solving inverse problems is Tikhonov regularization. The general idea is to add an additional term to the cost function, which penalizes variations in the object,
We emphasize that the regularization term in Eq. (26) in the previous subsection penalizes fluctuations of a fixed object pixel between successive iterations. In this subsection, the regularization term in Eq. (34) penalizes fluctuations between neighboring object pixels. Applying functional gradient descent, as described by Eq. (7), to the cost in Eq. (34) gives
where
4.4.4 Total variation regularization
Total variation (TV) regularization penalizes changes in the image while preserving edge features to a certain degree [91]. The corresponding cost function can be approximated by
Applying functional gradient descent (Eq. (7)) gives
4.5 Momentum acceleration (mPIE and mqNewton)
The momentum-accelerated ptychographic iterative engine (mPIE) [47] is the standard solver used for CP in PtyLab. In mPIE a predefined number of ePIE iterations $T$ is carried out, after which the search direction is complemented by a momentum term $\nu$ updating the entire object field of view $O_{\textrm {oFOV}}$,
Here $\eta$ is a damping term that is set to 0.7 by default [47]. Similar to conjugate gradient solvers [48,55,106], the momentum term accelerates the search direction and prevents zigzag motion towards the optimum. We emphasize that $O_{n+1,\textrm {oFOV}}$ in this subsection denotes the entire object field of view, while in other subsections $O$ denotes an object box of the same size as the probe window.
While addition of momentum is typically done for another regularized version of the PIE-type family of algorithms (rPIE, see [47]), it can complement any of the existing reconstruction engines including PIE. To avoid naming ambiguities, addition of momentum to PIE will be referred to as momentum accelerated quasi-Newton (mqNewton), which we often use as an FP solver.
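The momentum recursion itself is compact. Below is a toy sketch of the update $\nu \leftarrow \eta \nu + \Delta O$ applied on top of a base engine step; the scalar example is purely illustrative:

```python
eta = 0.7  # damping of the momentum term (the PtyLab default mentioned above)

def momentum_step(O_new, O_old, velocity):
    """Complement the base engine's update direction with damped momentum."""
    velocity = eta * velocity + (O_new - O_old)
    return O_new + velocity, velocity

# toy scalar illustration: for a constant unit step from the base engine, the
# momentum term grows towards the geometric-series limit 1 / (1 - eta)
O, v = 0.0, 0.0
for _ in range(10):
    O, v = momentum_step(O + 1.0, O, v)
assert 3.0 < v < 1.0 / (1.0 - eta)
```

Persistent gradient directions are thus amplified up to a factor $1/(1-\eta)$, while alternating (zigzag) directions largely cancel in $\nu$.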
5. Robust inverse models
In the foregoing section, we reviewed the basic inverse models underlying ptychography. However, experimental data oftentimes contains a variety of systematic errors that require more robust inverse models. This is the case, for example, when the data is corrupted by partial spatial or temporal coherence, when the illumination wavefront is unstable throughout the scan, or when the scan positions are imprecisely known. In what follows, we discuss robust forward models that account for and mitigate these sources of error.
5.1 Mixed States
In mixed state ptychography [45] the intensity in the detector plane is modeled as
where the index $k$ discerns mutually incoherent signal contributions (also known as mixed states). Inserting this into Eqs. (12) and (13) and calculating the Wirtinger derivative with respect to each $\tilde {\psi }_{k}$, we get the gradients.
The real space cost function for mixed state ptychography is
5.2 Multispectral ptychography
In multispectral ptychography [41,46,110–115] a light source of multiple individual spectral lines or a continuous spectrum is used. Because different colors are mutually incoherent, the detector update is identical to mixed state ptychography (compare Eq. (42)), but the index $k$ now denotes wavelength instead of spatial mode. The differences between mixed state and multispectral ptychography lie in the real space updates and in the propagator between the sample and the detector plane. In PtyLab we minimize the following cost function
These cost functions result in the updates
The latter update also results for the special case $\mu =0$. In this case the spectral channels are only coupled by the incoherent model for the detector intensity (Eq. (41)). In the presence of spectral regularization ($\mu \neq 0$) we do not need a priori knowledge about the spectral weights of the incident beam [41]. In the original work proposing multispectral ptychography no spectral regularization of adjacent channels was used. Instead the spectrum of the incident polychromatic probe was known a priori and used as a constraint in the optimization routine [46].
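The incoherent detector model shared by mixed-state and multispectral ptychography (Eq. (41)) admits a simple sketch of the corresponding Anscombe-type detector update, in which all channels are rescaled by a common factor. The numbers below are illustrative and do not reflect the PtyLab API:

```python
import numpy as np

rng = np.random.default_rng(4)
K, N = 3, 8  # number of mutually incoherent channels, detector pixels per side
psi = rng.standard_normal((K, N, N)) + 1j * rng.standard_normal((K, N, N))
I = rng.uniform(0.5, 2.0, (N, N))  # measured intensity (illustrative values)

# incoherent forward model: the channels add in intensity, not in amplitude
I_est = np.sum(np.abs(psi) ** 2, axis=0)

# every channel is rescaled by the same factor, so that the summed intensity
# matches the measurement while the relative channel weights are preserved
psi_new = psi * np.sqrt(I / I_est)

assert np.allclose(np.sum(np.abs(psi_new) ** 2, axis=0), I)
```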
The second difference between mixed state and multispectral ptychography lies in the propagation model. Other code projects, for instance PtyPy [54], use zero padding to model the wavelength dependence of far-field wave propagation. In PtyLab the pixel size of a monochromatic wave can instead be scaled using two-step propagators (scaledASP), which obviates the need for spectrally dependent zero padding of the exit wave. For details the reader is referred to the supplementary information of [41].
5.3 Multislice ptychography (e3PIE)
In multislice CP [44] the specimen is modeled by a stack of 2D slices. The beam propagation method (BPM) [116] is used as a forward model for the exit wave. In each of the slices the thin element approximation is assumed to be valid. The cascade of multiplication with each slice and subsequent propagation enables the BPM to model multiple forward scattering effects. A basic version of multislice CP (termed e3PIE) is implemented in PtyLab. For details, the reader is referred to the original work by Maiden et al. [44] and subsequent work [117,118]. We have not yet implemented multislice FP [119] in the current version of PtyLab, although such an engine may come in future releases.
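The multiply-and-propagate cascade of the BPM can be sketched in a few lines. The angular spectrum propagator and the slice loop below are generic textbook implementations (evanescent components are truncated), not PtyLab's e3PIE code; all parameters are illustrative:

```python
import numpy as np

def angular_spectrum(u, wavelength, dz, dx):
    """Free-space propagation over a distance dz via the angular spectrum method."""
    n = u.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    # transfer function; evanescent components (arg < 0) are truncated here
    H = np.exp(2j * np.pi * dz / wavelength * np.sqrt(np.maximum(arg, 0)))
    return np.fft.ifft2(np.fft.fft2(u) * H)

def bpm_exit_wave(probe, slices, wavelength, dz, dx):
    """Beam propagation method: multiply by each thin slice, then propagate
    to the next slice; the final multiplication yields the exit wave."""
    u = probe
    for i, t in enumerate(slices):
        u = u * t
        if i < len(slices) - 1:
            u = angular_spectrum(u, wavelength, dz, dx)
    return u

# sanity check: with unity slices and zero slice spacing, the probe is unchanged
probe = np.ones((32, 32), dtype=complex)
slices = [np.ones((32, 32), dtype=complex)] * 3
out = bpm_exit_wave(probe, slices, wavelength=500e-9, dz=0.0, dx=1e-6)
assert np.allclose(out, probe)
```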
5.4 Orthogonal probe relaxation (OPR)
A basic version of orthogonal probe relaxation (OPR) [51] is implemented in PtyLab. Instead of sharing the same probe across all scan positions in CP, as done for example in simple engines such as ePIE, OPR relaxes the requirement for a stable probe. The exit waves from different scan positions are used to estimate a low-rank basis, which seeks to model probe variations that occurred during a full scan, for example caused by pointing instability of the source. For details, the reader is referred to the original work by Odstrcil et al. [51]. We note that OPR can be combined with mixed states, as recently described by Eschen et al. [21].
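The low-rank basis at the heart of OPR can be sketched with a truncated SVD of the stacked per-position probes. The synthetic data below are exactly rank-2 by construction, so the low-rank projection reproduces them; all names and sizes are illustrative, not PtyLab's implementation:

```python
import numpy as np

rng = np.random.default_rng(9)
J, N, rank = 50, 16, 2  # scan positions, probe pixels per side, OPR rank

# synthetic per-position probes: random mixtures of two modes (illustration
# of probe variations across the scan, e.g. due to pointing instability)
modes = rng.standard_normal((rank, N * N)) + 1j * rng.standard_normal((rank, N * N))
coeffs = rng.standard_normal((J, rank))
probes = coeffs @ modes  # (J, N*N): each row is one scan position's probe

# low-rank consistency constraint: project all probes onto the leading
# singular vectors of the stacked probe matrix
U, s, Vh = np.linalg.svd(probes, full_matrices=False)
probes_lr = (U[:, :rank] * s[:rank]) @ Vh[:rank]

assert np.allclose(probes_lr, probes)  # exactly rank-2 data is reproduced
```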
With regard to FP, OPR has to our knowledge not been applied, although it may be an interesting approach to effectively model space-variant pupil functions. The latter is typically achieved by partitioning a larger field of view into a set of smaller sub-regions, each of which may be subject to different pupil aberrations. This approach, known as embedded pupil recovery (EPRY) [28], has to date essentially remained the only model for space-variant pupil aberrations in FP. However, because EPRY requires the reconstruction of a separate pupil for each sub-region, the model requires many degrees of freedom and ignores the fact that adjacent sub-regions are unlikely to have strongly differing pupil aberrations. OPR could be a promising candidate to robustify EPRY in future FP applications for spatially varying aberration reconstruction. An example of the use of OPR in FP is shown in Fig. 7. The image FOV was split into small segments, each with its own unique pupil function, and OPR was used to impose a low-rank consistency constraint on all the pupil functions. In applying OPR to FP, adjacent pupils share information, and poor convergence of isolated field segments can be avoided (compare highlighted red boxes in Fig. 7). For further implementation details the reader is referred to [120].
5.5 Subsampling (sPIE)
In some applications it is challenging to sufficiently sample the captured detector signal. For example, in EUV ptychography generating a highly focused probe is oftentimes restricted by the available hardware [19]. In other applications, such as near-field ptychography, the detector pixel size can be a limiting factor [121]. In both situations one may attempt to solve for the probe and object in CP (or the pupil and object spectrum in FP) from undersampled measurements, which are too coarse to oversample the diffraction data. In principle, the detrimental effect of undersampling can be compensated by high overlap. In the context of CP this technique is known as reciprocal space upsampling and was first demonstrated by Batey et al. [122] (where the algorithm was named sPIE). In the latter work, the captured ptychography measurements were deliberately undersampled by means of binning, but the original oversampled data could be recovered thanks to the high scan grid overlap in the captured data. We later generalized this principle to arbitrary sensing matrices that are not necessarily a result of an operation equivalent to binning, but that could result from any sensing architecture compressing multiple, not necessarily neighboring, pixels into a smaller data cube [42]. In such situations one seeks to minimize the cost function
5.6 Lateral position correction (pcPIE)
In ptychography the scan positions may not be accurately known, for example when a low-precision scanning stage is used [68,86,123,124]. Inaccurate scan position estimates cause errors and artifacts when stitching the object patches into the large object field of view. In PtyLab a momentum-accelerated version of a cross-correlation-based lateral position correction algorithm is used [50]. The rationale of this position correction is the observation that, at iteration $n+1$ of the reconstruction procedure, the object patch estimate at scan position $j$ is slightly shifted towards its true position. This shift is detected and used to update the scan grid by maximizing the cross correlation
The updated scan position at iteration $n+1$ is
Default values are $\alpha = 250$, $\beta = 0.9$, and $d_0=0$.
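The cross-correlation step can be sketched with FFTs: the displacement of an object patch estimate relative to a reference is found at the correlation peak. Below is a minimal integer-pixel version; PtyLab's implementation additionally applies the momentum-accelerated position update with the defaults above:

```python
import numpy as np

def detect_shift(a, b):
    """Estimate the integer-pixel shift of patch `a` relative to `b` from
    the peak of their FFT-based circular cross correlation."""
    xc = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b)))
    peak = np.unravel_index(np.argmax(np.abs(xc)), xc.shape)
    n = np.array(a.shape)
    shift = np.array(peak, dtype=float)
    mask = shift > n // 2
    shift[mask] -= n[mask]  # map peak indices to signed shifts
    return shift

rng = np.random.default_rng(6)
patch = rng.uniform(size=(32, 32))
shifted = np.roll(patch, (3, -5), axis=(0, 1))  # known circular shift
assert np.array_equal(detect_shift(shifted, patch), np.array([3.0, -5.0]))
```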
5.7 Reflection ptychography with angle calibration (aPIE)
In reflection ptychography the sample plane and the detector are non-coplanar. Assuming far-field diffraction for simplicity, the captured data is related to the specimen exit wave by a Fourier transformation plus an additional coordinate transformation [17,125]. An inverse coordinate transformation can be applied to the captured raw data to simplify the forward model. However, this operation requires accurate knowledge of the angle between the optical axis and the specimen surface normal. If this angle is not calibrated within a fraction of a degree, the reconstruction quality can suffer notably. We have recently presented an algorithm for angular auto-calibration in reflection ptychography (aPIE), which is part of PtyLab. aPIE uses a heuristic strategy to estimate the unknown angle within an iteratively shrinking search interval. For details the reader is referred to [40].
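The shrinking search interval strategy can be illustrated with a generic 1D heuristic: candidate angles are sampled, and the interval is re-centered on the best candidate and contracted. This is a simplified stand-in for aPIE's procedure (see [40] for the actual recipe); the quadratic error metric is a toy:

```python
import numpy as np

def shrinking_interval_search(metric, lo, hi, iters=20, shrink=0.5):
    """Sample candidates in [lo, hi], re-center the interval on the best
    one, and contract it each iteration (illustrative heuristic only)."""
    for _ in range(iters):
        candidates = np.linspace(lo, hi, 7)
        best = candidates[np.argmin([metric(c) for c in candidates])]
        half = shrink * (hi - lo) / 2
        lo, hi = best - half, best + half
    return best

# toy error metric with its minimum at 45 degrees
theta = shrinking_interval_search(lambda t: (t - 45.0) ** 2, 30.0, 60.0)
assert abs(theta - 45.0) < 1e-3
```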
5.8 Axial position correction (zPIE)
Similarly to pcPIE (position correction) and aPIE (angle correction in reflection ptychography), another self-calibration algorithm provided as part of PtyLab is zPIE, which estimates the sample-detector distance [39]. The main idea is that when the sample-detector distance is miscalibrated, the reconstructed object oftentimes exhibits characteristics of a slightly defocused inline hologram, including ringing at edges. An autofocus metric based on total variation (TV) is then used to calibrate the correct sample-detector distance. We observed that TV-based autofocusing performs best in the near-field on binary specimens, although it can also be used on biological specimens. Other choices of autofocusing metrics can easily be implemented by the user, if the TV-based sharpness metric fails [126].
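A minimal version of such a TV sharpness metric, together with a toy illustration that defocus-like ringing at an edge inflates the metric, is given below. This is illustrative only; in zPIE the metric would be evaluated on the object reconstructed at candidate distances:

```python
import numpy as np

def tv_metric(x):
    """Total-variation sharpness metric: sum of absolute finite differences."""
    return np.abs(np.diff(x, axis=0)).sum() + np.abs(np.diff(x, axis=1)).sum()

# a clean binary edge versus the same edge with defocus-like ringing
edge = np.zeros((16, 16))
edge[:, 8:] = 1.0
ringing = edge + 0.1 * np.sin(2.0 * np.arange(16))[None, :]

# ringing increases the TV value, so for (near-)binary specimens the
# sample-detector distance that minimizes the metric is selected
assert tv_metric(ringing) > tv_metric(edge)
```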
5.9 Ptychography combined with an external reference beam
We recently reported ptychographic optical coherence tomography (POCT), which combines full-field frequency-domain OCT with ptychography [25]. In the latter work there was no need for an external reference wave, as is common in OCT applications; instead, the reference was provided by a direct surface reflection from the sample itself. Thus the technique can in principle be applied in the short-wavelength regime, where providing an external reference comes with extra experimental challenges. However, in the visible and infrared spectral range a reference wave is readily provided and can make POCT more convenient. Providing an external reference wave in ptychography requires adjustments to the forward model. In this case, we seek to minimize the cost function density
where $\tilde {\rho }$ denotes a coherent external reference wave. All other quantities are as defined in previous sections. Using gradient descent with a unit step size in conjunction with Wirtinger derivatives, we obtain updates for both the wave diffracted from the specimen and the reference wave.
We note that the mathematical structure of external reference beam ptychography opens up a trivial ambiguity. Suppose that the triplet of probe $P$, object $O$, and reference $\tilde {\rho }$ yields the observed intensity $I$, i.e.
Then it immediately follows that the triplet of probe $P$, object $-O$, and reference $\tilde {P}+\tilde {\rho }$ is also a solution, since
6. Scan grid optimization
In CP, a certain amount of consideration is needed to optimize the scan trajectory. To date, the majority of CP setups employ mechanical scanners, although variants exist where the beam is rapidly steered over the sample by means of galvo mirrors [127]. The latter offer advantages in terms of speed and overall cost of the experimental setup, but the isoplanatic illumination patch of such mirror systems is finite and thus limits the field of view over which the probe wavefront can be assumed to be stable, thereby compromising one of the very benefits of CP. Hence mechanical scanners are still the preferred option. For such systems it is important to minimize the total scan distance in order to reduce scan time and prevent mechanical wear. In addition, unlike other scanning microscopy systems, ptychography requires non-periodic scan grids to avoid ambiguities in the reconstruction [128]. A popular choice is the Fermat scan grid [129], as depicted in Fig. 9(a). This type of scan grid is conveniently described in polar coordinates, where its trajectory assumes the form of a spiral. Minimizing the total travel path can be done using a solver for the traveling salesman problem (TSP). In PtyLab, we use a 2-opt [130] TSP heuristic, which offers a good compromise between optimality and optimization time. Figure 9(b) shows an example of a distance-optimized scan trajectory, where the color scale indicates the start (blue) and end position (red).
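A Fermat spiral grid of the kind shown in Fig. 9(a) can be generated in a few lines; radii and point counts below are illustrative, and the visiting order could subsequently be shortened with a TSP heuristic such as 2-opt:

```python
import numpy as np

def fermat_grid(n_points, radius):
    """Fermat spiral scan grid: r ~ sqrt(n), theta = n * golden angle."""
    n = np.arange(n_points)
    golden_angle = np.pi * (3.0 - np.sqrt(5.0))  # ~137.5 degrees
    r = radius * np.sqrt(n / n_points)
    theta = n * golden_angle
    return np.stack([r * np.cos(theta), r * np.sin(theta)], axis=1)

pts = fermat_grid(200, 50.0)
assert pts.shape == (200, 2)
assert np.all(np.linalg.norm(pts, axis=1) <= 50.0)  # all points inside the radius
```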
Moreover, several operations are available to transform scan grids, including non-uniform scaling and rectification, as shown in Fig. 9(c) and (d), respectively. The former allows for non-uniform spatial sampling, adjusted such that the sampling is higher in regions that are challenging to resolve, while the latter clips the field of view to a rectangular (here square) region. Another practically useful strategy is checkpointing (see Fig. 9(e)), which alters a given scan grid such that it revisits a certain reference point throughout the scan. Deviations in the diffraction data at the checkpoints allow for identifying sources of error in the experimental setup, including position drift, flux instability, and illumination wavefront variations. The checkpoints are equally spaced in time.
The aforementioned techniques are primarily scan grid preprocessing techniques, meaning that the scan grid is optimized prior to the actual experiment. After the data acquisition, scan grid postprocessing techniques may be required. For example, large scan grids can be partitioned to prevent memory limitations. In this way large data sets may be chopped up into smaller pieces which are then individually processed [131]. It is important that the scan partitions spatially overlap, so that adjacent regions can be phase synchronized and stitched up post-reconstruction. To ensure overlap between the clusters, an overlapping k-means (OKM) algorithm can be used [132,133]. Figure 9(f) shows an example of a scan grid partitioned into four overlapping clusters (filled circles/triangles, unfilled diamonds/squares), each containing 160 scan points. In the middle, the clusters overlap. A second reason for scan grid partitioning can be to define batch gradients which speed up convergence and robustness [55].
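A toy sketch of overlap-aware partitioning: each scan point joins its nearest cluster center and, if another center is almost as close, that one too. This is a simplified stand-in for the OKM algorithm used in PtyLab; centers and margins are illustrative:

```python
import numpy as np

def partition_with_overlap(points, centers, margin):
    """Toy scan-grid partitioning: each point joins its nearest cluster and,
    within `margin` of that distance, any other nearby cluster as well."""
    d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    nearest = d.min(axis=1, keepdims=True)
    return d <= nearest + margin  # boolean membership (n_points, n_clusters)

rng = np.random.default_rng(8)
pts = rng.uniform(-1, 1, size=(200, 2))
centers = np.array([[-0.5, 0.0], [0.5, 0.0]])
m = partition_with_overlap(pts, centers, margin=0.3)

assert np.all(m.sum(axis=1) >= 1)  # every point belongs to at least one cluster
assert np.any(m.sum(axis=1) > 1)   # points near the boundary are shared
```

The shared points in the middle band are what later allows phase synchronization and stitching of the individually reconstructed clusters.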
As a note on FP scan grids, it appears surprising at first sight that the technique does not exhibit raster scan artefacts, even though the most commonly employed LED arrays are regularly spaced. However, the regular spatial arrangement of the LEDs corresponds to a non-periodic spacing in angle, which explains why FP is not subject to the aforementioned raster scan artefacts. While most of the aforementioned preprocessing steps are not required for FP due to the absence of mechanical movement, checkpointing and partitioning may still be used for monitoring stability and for distributed data analysis, respectively.
7. Open experimental data and tutorials
We publish a variety of CP and FP data sets and tutorials with the aim to introduce users to the functionality of PtyLab. Figure 10 depicts two such data sets. The top row shows a soft x-ray ($\lambda = 2.48$ nm) data set collected at a synchrotron (experimental details in [76]). The bottom row depicts a visible light ($\lambda = 625$ nm) FP data set of lung carcinoma. For both data sets we show from left to right a single frame of the raw data, the recovered quantitative phase image (QPI) of the object (resolution test target and lung carcinoma histology slide), and reconstructed probe/pupil for the case of CP (top) and FP (bottom). Hue and brightness depict the phase and amplitude, respectively, of the complex-valued reconstructed quantities. A variety of additional data sets are published alongside PtyLab, which are summarized in Table 1. Each of these data sets comes with an online tutorial explaining suitable data analysis approaches, including self-calibration and regularization.
8. Discussion and conclusion
PtyLab is a versatile ptychography software which we hope will aid researchers to explore the capabilities of CP and FP. Nevertheless, despite our excitement about this endeavor, we should mention some of its shortcomings: (1) Researchers with large-scale and high-throughput data analysis tasks (e.g. beamline scientists at synchrotrons) may be better off with one of the currently available high-performance ptychography packages mentioned in the introduction. However, we believe increased performance comes at the cost of flexibility in algorithm prototyping. (2) PtyLab currently does not support tomographic reconstruction with specimen rotation. It is to be noted that external CT toolboxes, such as Astra [138] or Tigre [139], can principally be used for ptychographic computed tomography once a sequence of 2D reconstructions at different angles is available. However, some ptychotomographic software embed specialized regularization techniques within the reconstruction routine [140], which are not available in standard CT packages. In contrast, ptychographic optical coherence tomography (POCT) [25] does not require angle diversity and 3D reconstructions can be obtained simply by performing a Fourier transform along the wavelength dimension in a multispectral reconstruction object stack. The latter is readily performed in PtyLab.
We have built PtyLab in an object-oriented and highly modular fashion. This enables the user to assemble reconstruction algorithms from elementary algorithmic building blocks (e.g. exit wave formation, propagation model, detector intensity and noise model, partial coherence, experimental setup parameters), which interact seamlessly and concurrently. The parameters of a forward model can be modified during a reconstruction session.
We have designed PtyLab based on the principle of reciprocity. An interesting implication of the conversion between CP and FP is performance, as it provides the freedom to choose whether the computational complexity of the Fourier transform operation, used in typical inversion algorithms, scales logarithmically with the number of pixels in the detector or with the number of scan positions in a given data cube - numbers which can be orders of magnitude apart so that even on a logarithmic scale, practical speed ups can be achieved. However, in order to take full advantage of reciprocity, interpolation techniques to non-equidistantly sampled geometries are required, which will be explored in future work.
In summary, we have presented PtyLab, a cross-platform, open source inverse modeling toolbox for CP and FP. We believe PtyLab’s major strengths lie in (1) the uniform framework for CP and FP enabling cross-pollination between the two domains, (2) the availability in three widely used programming languages (Matlab, Python, and Julia), making it easy for researchers with different programming backgrounds to exchange and benchmark code snippets and data analyses, and (3) its versatile code architecture suited both for beginners and experts interested in rapid ptychographic algorithm prototyping. In addition, a plethora of self-calibration features (e.g. aPIE, zPIE) and algorithmic novelties (e.g. conversion between CP and FP, POCT, CP with external reference beam, sPIE) are available that to our knowledge have previously not been featured in open access ptychography code. Various functions for scan grid generation help the user to optimize data acquisition and postprocessing. For further information the reader is referred to the GitHub website with its accompanying tutorials as well as the open data provided along with it [38].
Appendix
Equivalence of CP and FP
In this appendix, we provide a formal proof that the same data cube can be regarded as a CP or an FP data set, implying the ability to convert between the two. Without loss of generality, we assume we are given a far-field CP data set
Next, consider a fixed observation pixel ($\boldsymbol {q}_{0}$) in the data cube in Eq. (67)
FP solves the problem formulation in Eq. (72) for a sequence of illumination directions. Thus we may consider the same data cube to be either a CP or an FP inverse problem. From the CP perspective we reconstruct $P$ and $O$, while from the FP perspective we reconstruct $\tilde {P}$ and $\tilde {O}$. Thus if we tackle a CP data set from the FP perspective, we simply inverse Fourier transform the reconstructed object spectrum to retrieve the object that we would have reconstructed had we directly chosen a CP solver. A similar statement holds for the correspondence between probe and pupil.
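The reinterpretation of the data cube amounts to a rearrangement of its axes, as the following numpy sketch shows; shapes and the 3x3 scan grid are illustrative:

```python
import numpy as np

rng = np.random.default_rng(7)
J, N = 9, 16  # scan positions, detector pixels per dimension
cube = rng.uniform(size=(J, N, N))  # CP data cube: one frame per scan position

# CP view: frame j is a full diffraction pattern I_j(q)
cp_frame = cube[0]

# FP view: fixing a detector pixel q0 and sweeping over j yields one "image"
# per pixel, indexed by scan position (here rearranged on a 3x3 grid)
q0 = (N // 2, N // 2)
fp_image = cube[:, q0[0], q0[1]].reshape(3, 3)

assert cp_frame.shape == (N, N)
assert fp_image.shape == (3, 3)
# the same measured value appears in both views
assert fp_image[0, 0] == cube[0, q0[0], q0[1]]
```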
Funding
Schweizerischer Nationalfonds zur Förderung der Wissenschaftlichen Forschung (200021_19689); European Research Council (ERC-CoG 864016); Nederlandse Organisatie voor Wetenschappelijk Onderzoek (LINX Perspectief Programme).
Acknowledgements
PtyLab in Python made use of the following packages: numpy [141], matplotlib [142], h5py [143], scipy [144], scikit-image [145], and tqdm [146].
PtyLab.jl was implemented in Julia [147] and made use of the following packages: CUDA.jl [148,149], FFTW.jl [150,151] and LLVM.jl [152].
Disclosures
The authors declare no conflicts of interest.
Data availability
The Python, Matlab, and Julia versions of PtyLab are available online [38]. Data underlying the results presented in this paper are available in the tutorials of [38].
References
1. H. M. L. Faulkner and J. M. Rodenburg, “Movable Aperture Lensless Transmission Microscopy : A Novel Phase Retrieval Algorithm,” Phys. Rev. Lett. 93(2), 023903 (2004). [CrossRef]
2. J. Rodenburg and A. Maiden, “Ptychography,” in Springer Handbook of Microscopy, (Springer, 2019), p. 2.
3. H. H. Pattee, “The scanning x-ray microscope,” J. Opt. Soc. Am. 43(1), 61–62 (1953). [CrossRef]
4. P. Horowitz and J. A. Howell, “A scanning x-ray microscope using synchrotron radiation,” Science 178(4061), 608–611 (1972). [CrossRef]
5. H. Rarback, D. Shu, S. C. Feng, H. Ade, J. Kirz, I. McNulty, D. P. Kern, T. H. P. Chang, Y. Vladimirsky, N. Iskander, D. Attwood, K. McQuaid, and S. Rothman, “Scanning x-ray microscope with 75-nm resolution,” Rev. Sci. Instrum. 59(1), 52–59 (1988). [CrossRef]
6. C. Jacobsen, S. Williams, E. Anderson, M. T. Browne, C. J. Buckley, D. Kern, J. Kirz, M. Rivers, and X. Zhang, “Diffraction-limited imaging in a scanning transmission x-ray microscope,” Opt. Commun. 86(3-4), 351–364 (1991). [CrossRef]
7. J. M. Rodenburg, A. C. Hurst, A. G. Cullis, B. R. Dobson, F. Pfeiffer, O. Bunk, C. David, K. Jefimovs, and I. Johnson, “Hard-X-ray lensless imaging of extended objects,” Phys. Rev. Lett. 98(3), 034801 (2007). [CrossRef]
8. P. Thibault, M. Dierolf, A. Menzel, O. Bunk, C. David, and F. Pfeiffer, “High-Resolution Scanning X-ray Diffraction Microscopy,” Science 321(5887), 379–382 (2008). [CrossRef]
9. P. Thibault, M. Dierolf, C. M. Kewish, A. Menzel, O. Bunk, and F. Pfeiffer, “Contrast mechanisms in scanning transmission x-ray microscopy,” Phys. Rev. A 80(4), 043813 (2009). [CrossRef]
10. F. Pfeiffer, “X-ray ptychography,” Nat. Photonics 12(1), 9–17 (2018). [CrossRef]
11. T. Wang, S. Jiang, P. Song, R. Wang, L. Yang, T. Zhang, and G. Zheng, “Optical ptychography for biomedical imaging: recent progress and future directions,” Biomed. Opt. Express 14(2), 489–532 (2023). [CrossRef]
12. F. Hüe, J. M. Rodenburg, A. M. Maiden, F. Sweeney, and P. A. Midgley, “Wave-front phase retrieval in transmission electron microscopy via ptychography,” Phys. Rev. B 82(12), 121415 (2010). [CrossRef]
13. C. T. Putkunz, A. J. D’Alfonso, A. J. Morgan, M. Weyland, C. Dwyer, L. Bourgeois, J. Etheridge, A. Roberts, R. E. Scholten, K. A. Nugent, and L. J. Allen, “Atom-scale ptychographic electron diffractive imaging of boron nitride cones,” Phys. Rev. Lett. 108(7), 073901 (2012). [CrossRef]
14. M. J. Humphry, B. Kraus, A. C. Hurst, A. M. Maiden, and J. M. Rodenburg, “Ptychographic electron microscopy using high-angle dark-field scattering for sub-nanometre resolution imaging,” Nat. Commun. 3(1), 730 (2012). [CrossRef]
15. Y. Jiang, Z. Chen, Y. Han, P. Deb, H. Gao, S. Xie, P. Purohit, M. W. Tate, J. Park, S. M. Gruner, V. Elser, and D. A. Muller, “Electron ptychography of 2D materials to deep sub-ångström resolution,” Nature 559(7714), 343–349 (2018). [CrossRef]
16. Z. Chen, Y. Jiang, Y.-T. Shao, M. E. Holtz, M. Odstrčil, M. Guizar-Sicairos, I. Hanke, S. Ganschow, D. G. Schlom, and D. A. Muller, “Electron ptychography achieves atomic-resolution limits set by lattice vibrations,” Science 372(6544), 826–831 (2021). [CrossRef]
17. M. D. Seaberg, B. Zhang, D. F. Gardner, E. R. Shanblatt, M. M. Murnane, H. C. Kapteyn, and D. E. Adams, “Tabletop nanometer extreme ultraviolet imaging in an extended reflection mode using coherent Fresnel ptychography,” Optica 1(1), 39 (2014). [CrossRef]
18. D. F. Gardner, M. Tanksalvala, E. R. Shanblatt, X. Zhang, B. R. Galloway, C. L. Porter, R. Karl Jr, C. Bevis, D. E. Adams, H. C. Kapteyn, M. M. Murnane, and G. F. Mancini, “Subwavelength coherent imaging of periodic samples using a 13.5 nm tabletop high-harmonic light source,” Nat. Photonics 11(4), 259–263 (2017). [CrossRef]
19. L. Loetgering, S. Witte, and J. Rothhardt, “Advances in laboratory-scale ptychography using high harmonic sources,” Opt. Express 30(3), 4133–4164 (2022). [CrossRef]
20. M. Tanksalvala, C. L. Porter, Y. Esashi, et al., “Nondestructive, high-resolution, chemically specific 3D nanostructure characterization using phase-sensitive EUV imaging reflectometry,” Sci. Adv. 7(5), 9667–9694 (2021). [CrossRef]
21. W. Eschen, L. Loetgering, V. Schuster, R. Klas, A. Kirsche, L. Berthold, M. Steinert, T. Pertsch, H. Gross, and M. Krause, “Material-specific high-resolution table-top extreme ultraviolet microscopy,” Light: Sci. Appl. 11(1), 117 (2022). [CrossRef]
22. C. Liu, W. Eschen, L. Loetgering, D. S. Penagos Molina, R. Klas, A. Iliou, M. Steinert, S. Herkersdorf, A. Kirsche, T. Pertsch, F. Hillmann, J. Limpert, and J. Rothhardt, “Visualizing the ultra-structure of microorganisms using table-top extreme ultraviolet imaging,” PhotoniX 4(1), 1–15 (2023). [CrossRef]
23. A. M. Maiden, M. J. Humphry, F. Zhang, and J. M. Rodenburg, “Superresolution imaging via ptychography,” J. Opt. Soc. Am. A 28(4), 604–612 (2011). [CrossRef]
24. J. Marrison, L. Räty, P. Marriott, and P. O’Toole, “Ptychography–a label free, high-contrast imaging technique for live cells using quantitative phase information,” Sci. Rep. 3(1), 2369 (2013). [CrossRef]
25. M. Du, L. Loetgering, K. S. E. Eikema, and S. Witte, “Ptychographic optical coherence tomography,” Opt. Lett. 46(6), 1337 (2021). [CrossRef]
26. L. Valzania, T. Feurer, P. Zolliker, and E. Hack, “Terahertz ptychography,” Opt. Lett. 43(3), 543 (2018). [CrossRef]
27. G. Zheng, R. Horstmeyer, and C. Yang, “Wide-field, high-resolution Fourier ptychographic microscopy,” Nat. Photonics 7(9), 739–745 (2013). [CrossRef]
28. X. Ou, G. Zheng, and C. Yang, “Embedded pupil function recovery for Fourier ptychographic microscopy,” Opt. Express 22(5), 4960 (2014). [CrossRef]
29. P. C. Konda, L. Loetgering, K. C. Zhou, S. Xu, A. R. Harvey, and R. Horstmeyer, “Fourier ptychography: current applications and future promises,” Opt. Express 28(7), 9603 (2020). [CrossRef]
30. G. Zheng, C. Shen, S. Jiang, P. Song, and C. Yang, “Concept, implementations and applications of Fourier ptychography,” Nat. Rev. Phys. 3(3), 207–223 (2021). [CrossRef]
31. S. Sen, I. Ahmed, B. Aljubran, A. A. Bernussi, and L. G. De Peralta, “Fourier ptychographic microscopy using an infrared-emitting hemispherical digital condenser,” Appl. Opt. 55(23), 6421–6427 (2016). [CrossRef]
32. K. Wakonig, A. Diaz, A. Bonnin, M. Stampanoni, A. Bergamaschi, J. Ihli, M. Guizar-Sicairos, and A. Menzel, “X-ray Fourier ptychography,” Sci. Adv. 5(2), eaav0282 (2019). [CrossRef]
33. J. M. Rodenburg and R. H. T. Bates, “The theory of super-resolution electron microscopy via Wigner-distribution deconvolution,” Philos. Trans. R. Soc. London, Ser. A 339, 521–553 (1992).
34. H. N. Chapman, “Phase-retrieval X-ray microscopy by Wigner-distribution deconvolution,” Ultramicroscopy 66(3-4), 153–172 (1996). [CrossRef]
35. P. Li, T. B. Edo, and J. M. Rodenburg, “Ptychographic inversion via Wigner distribution deconvolution: noise suppression and probe design,” Ultramicroscopy 147, 106–113 (2014). [CrossRef]
36. R. Horstmeyer and C. Yang, “A phase space model of Fourier ptychographic microscopy,” Opt. Express 22(1), 338–358 (2014). [CrossRef]
37. J. C. da Silva and A. Menzel, “Elementary signals in ptychography,” Opt. Express 23(26), 33812 (2015). [CrossRef]
38. “PtyLab,” https://github.com/PtyLab.
39. L. Loetgering, M. Du, K. S. E. Eikema, and S. Witte, “zPIE: an autofocusing algorithm for ptychography,” Opt. Lett. 45(7), 2030 (2020). [CrossRef]
40. A. de Beurs, L. Loetgering, M. Herczog, M. Du, K. S. E. Eikema, and S. Witte, “aPIE: an angle calibration algorithm for reflection ptychography,” Opt. Lett. 47(8), 1949–1952 (2022). [CrossRef]
41. L. Loetgering, X. Liu, A. C. C. De Beurs, M. Du, G. Kuijper, K. S. E. Eikema, and S. Witte, “Tailoring spatial entropy in extreme ultraviolet focused beams for multispectral ptychography,” Optica 8(2), 130 (2021). [CrossRef]
42. L. Loetgering, M. Rose, D. Treffer, I. A. Vartanyants, A. Rosenhahn, and T. Wilhein, “Data compression strategies for ptychographic diffraction imaging,” Adv. Opt. Technol. 6(6), 475–483 (2017). [CrossRef]
43. A. M. Maiden and J. M. Rodenburg, “An improved ptychographical phase retrieval algorithm for diffractive imaging,” Ultramicroscopy 109(10), 1256–1262 (2009). [CrossRef]
44. A. M. Maiden, M. J. Humphry, and J. M. Rodenburg, “Ptychographic transmission microscopy in three dimensions using a multi-slice approach,” J. Opt. Soc. Am. A 29(8), 1606 (2012). [CrossRef]
45. P. Thibault and A. Menzel, “Reconstructing state mixtures from diffraction measurements,” Nature 494(7435), 68–71 (2013). [CrossRef]
46. D. J. Batey, D. Claus, and J. M. Rodenburg, “Information multiplexing in ptychography,” Ultramicroscopy 138, 13–21 (2014). [CrossRef]
47. A. Maiden, D. Johnson, and P. Li, “Further improvements to the ptychographical iterative engine,” Optica 4(7), 736 (2017). [CrossRef]
48. P. Thibault and M. Guizar-Sicairos, “Maximum-likelihood refinement for coherent diffractive imaging,” New J. Phys. 14(6), 063004 (2012). [CrossRef]
49. M. Stockmar, I. Zanette, M. Dierolf, B. Enders, R. Clare, F. Pfeiffer, and P. Thibault, “X-ray near-field ptychography for optically thick specimens,” Phys. Rev. Appl. 3, 014005 (2015). [CrossRef]
50. F. Zhang, I. Peterson, J. Vila-Comamala, A. Diaz, F. Berenguer, R. Bean, B. Chen, A. Menzel, I. K. Robinson, and J. M. Rodenburg, “Translation position determination in ptychographic coherent diffraction imaging,” Opt. Express 21(11), 13592–606 (2013). [CrossRef]
51. M. Odstrcil, P. Baksh, S. A. Boden, R. Card, J. E. Chad, J. G. Frey, and W. S. Brocklesby, “Ptychographic coherent diffractive imaging with orthogonal probe relaxation,” Opt. Express 24(8), 8360–8369 (2016). [CrossRef]
52. Y. S. G. Nashed, D. J. Vine, T. Peterka, J. Deng, R. Ross, and C. Jacobsen, “Parallel ptychographic reconstruction,” Opt. Express 22(26), 32082–32097 (2014). [CrossRef]
53. S. Marchesini, H. Krishnan, B. J. Daurer, D. A. Shapiro, T. Perciano, J. A. Sethian, and F. R. N. C. Maia, “SHARP: a distributed GPU-based ptychographic solver,” J. Appl. Crystallogr. 49(4), 1245–1252 (2016). [CrossRef]
54. B. Enders and P. Thibault, “A computational framework for ptychographic reconstructions,” Proc. R. Soc. A 472(2196), 20160640 (2016). [CrossRef]
55. M. Odstrcil, A. Menzel, and M. Guizar-Sicairos, “Iterative least-squares solver for generalized maximum-likelihood ptychography,” Opt. Express 26(3), 3108 (2018). [CrossRef]
56. Z. Dong, Y.-L. L. Fang, X. Huang, H. Yan, S. Ha, W. Xu, Y. S. Chu, S. I. Campbell, and M. Lin, “High-performance multi-mode ptychography reconstruction on distributed GPUs,” in 2018 New York Scientific Data Summit (NYSDS), (2018), pp. 1–5.
57. K. Wakonig, H.-C. Stadler, M. Odstrčil, E. H. R. Tsai, A. Diaz, M. Holler, I. Usov, J. Raabe, A. Menzel, and M. Guizar-Sicairos, “PtychoShelves, a versatile high-level framework for high-performance analysis of ptychographic data,” J. Appl. Crystallogr. 53(2), 574–586 (2020). [CrossRef]
58. V. Favre-Nicolin, G. Girard, S. Leake, J. Carnis, Y. Chushkin, J. Kieffer, P. Paleo, and M.-I. Richard, “PyNX: high-performance computing toolkit for coherent X-ray imaging based on operators,” J. Appl. Crystallogr. 53(5), 1404–1413 (2020). [CrossRef]
59. X. Yu, V. Nikitin, D. J. Ching, S. Aslan, D. Gursoy, and T. Bicer, “Scalable and accurate multi-GPU based image reconstruction of large-scale ptychography data,” arXiv, 2106.07575 (2021). [CrossRef]
60. S. Kandel, S. Maddali, M. Allain, S. O. Hruszkewycz, C. Jacobsen, and Y. S. G. Nashed, “Using automatic differentiation as a general framework for ptychographic reconstruction,” Opt. Express 27(13), 18653–18672 (2019). [CrossRef]
61. J. Seifert, D. Bouchet, L. Loetgering, and A. P. Mosk, “Efficient and flexible approach to ptychography using an optimization framework based on automatic differentiation,” OSA Continuum 4(1), 121–128 (2021). [CrossRef]
62. K. Kreutz-Delgado, “The complex gradient operator and the CR-calculus,” arXiv, 0906.4835 (2009). [CrossRef]
63. D. H. Brandwood, “A complex gradient operator and its application in adaptive array theory,” IEE Proc. H Microw. Opt. Antennas 130(1), 11–16 (1983). [CrossRef]
64. M. J. Cherukara, T. Zhou, Y. Nashed, P. Enfedaque, A. Hexemer, R. J. Harder, and M. V. Holt, “AI-enabled high-resolution scanning coherent diffraction imaging,” Appl. Phys. Lett. 117(4), 044103 (2020). [CrossRef]
65. S. Aslan, Z. Liu, V. Nikitin, T. Bicer, S. Leyffer, and D. Gürsoy, “Joint ptycho-tomography with deep generative priors,” Mach. Learn.: Sci. Technol. 2, 045017 (2021). [CrossRef]
66. R. Harder, “Deep neural networks in real-time coherent diffraction imaging,” IUCrJ 8(1), 1–3 (2021). [CrossRef]
67. M. Stockmar, P. Cloetens, I. Zanette, B. Enders, M. Dierolf, F. Pfeiffer, and P. Thibault, “Near-field ptychography: phase retrieval for inline holography using a structured illumination,” Sci. Rep. 3(1), 1927 (2013). [CrossRef]
68. A. M. Maiden, M. J. Humphry, M. C. Sarahan, B. Kraus, and J. M. Rodenburg, “An annealing algorithm to correct positioning errors in ptychography,” Ultramicroscopy 120, 64–72 (2012). [CrossRef]
69. L. Tian, X. Li, K. Ramchandran, and L. Waller, “Multiplexed coded illumination for Fourier Ptychography with an LED array microscope,” Biomed. Opt. Express 5(7), 2376–2389 (2014). [CrossRef]
70. G. Zheng, Fourier ptychographic imaging: a MATLAB tutorial (Morgan and Claypool Publishers, 2016).
71. L. Bian, G. Zheng, K. Guo, J. Suo, C. Yang, F. Chen, and Q. Dai, “Motion-corrected Fourier ptychography,” Biomed. Opt. Express 7(11), 4543–4553 (2016). [CrossRef]
72. C. Zuo, J. Sun, and Q. Chen, “Adaptive step-size strategy for noise-robust Fourier ptychographic microscopy,” Opt. Express 24(18), 20724–20744 (2016). [CrossRef]
73. T. Aidukas, R. Eckert, A. R. Harvey, L. Waller, and P. C. Konda, “Low-cost, sub-micron resolution, wide-field computational microscopy using opensource hardware,” Sci. Rep. 9(1), 7457 (2019). [CrossRef]
74. K. C. Zhou and R. Horstmeyer, “Diffraction tomography with a deep image prior,” Opt. Express 28(9), 12872–12896 (2020). [CrossRef]
75. M. Rogalski, P. Zdańkowski, and M. Trusiak, “FPM app: an open-source MATLAB application for simple and intuitive Fourier ptychographic reconstruction,” Bioinformatics 37(20), 3695–3696 (2021). [CrossRef]
76. L. Loetgering, M. Baluktsian, K. Keskinbora, R. Horstmeyer, T. Wilhein, G. Schütz, K. S. Eikema, and S. Witte, “Generation and characterization of focused helical x-ray beams,” Sci. Adv. 6(7), eaax8836 (2020). [CrossRef]
77. M. Born and E. Wolf, Principles of Optics (Cambridge University Press, 1999), 7th ed.
78. M. Folk, G. Heber, Q. Koziol, E. Pourmal, and D. Robinson, “An overview of the HDF5 technology suite and its applications,” in Proceedings of the EDBT/ICDT 2011 Workshop on Array Databases, (2011), pp. 36–47.
79. R. Eckert, Z. F. Phillips, and L. Waller, “Efficient illumination angle self-calibration in Fourier ptychography,” Appl. Opt. 57(19), 5434–5442 (2018). [CrossRef]
80. S. L. Brunton and J. N. Kutz, Data-driven science and engineering: Machine learning, dynamical systems, and control (Cambridge University Press, 2022).
81. L.-H. Yeh, J. Dong, J. Zhong, L. Tian, M. Chen, G. Tang, M. Soltanolkotabi, and L. Waller, “Experimental robustness of Fourier ptychography phase retrieval algorithms,” Opt. Express 23(26), 33214 (2015). [CrossRef]
82. R. A. Dilanian, G. J. Williams, L. W. Whitehead, D. J. Vine, A. G. Peele, E. Balaur, I. McNulty, H. M. Quiney, and K. A. Nugent, “Coherent diffractive imaging: a new statistically regularized amplitude constraint,” New J. Phys. 12(9), 093042 (2010). [CrossRef]
83. P. Godard, M. Allain, V. Chamard, and J. Rodenburg, “Noise models for low counting rate coherent diffraction imaging,” Opt. Express 20(23), 25914 (2012). [CrossRef]
84. H. Yan, “Ptychographic phase retrieval by proximal algorithms,” New J. Phys. 22(2), 023035 (2020). [CrossRef]
85. P. M. Pelz, W. X. Qiu, R. Bücker, G. Kassier, and R. J. Miller, “Low-dose cryo electron ptychography via non-convex Bayesian optimization,” Sci. Rep. 7(1), 9883 (2017). [CrossRef]
86. M. Guizar-Sicairos and J. R. Fienup, “Phase retrieval with transverse translation diversity: a nonlinear optimization approach,” Opt. Express 16(10), 7264–7278 (2008). [CrossRef]
87. S. Marchesini, A. Schirotzek, C. Yang, H. T. Wu, and F. Maia, “Augmented projections for ptychographic imaging,” Inverse Problems 29(11), 115009 (2013). [CrossRef]
88. R. Horstmeyer, R. Chen, X. Ou, B. Ames, J. Tropp, and C. Yang, “Solving ptychography with a convex relaxation,” New J. Phys. 17(5), 053044 (2015). [CrossRef]
89. G. Gbur, Mathematical Methods for Optical Physics and Engineering (Cambridge University Press, Cambridge, 2011), 1st ed.
90. B. Frieden, Probability, Statistical Optics, and Data Testing (Springer, New York, 1991), 2nd ed.
91. L. I. Rudin, S. Osher, and E. Fatemi, “Nonlinear total variation based noise removal algorithms,” Physica D Nonlinear Phenomena 60(1-4), 259–268 (1992). [CrossRef]
92. M. Benyamin, J. Calder, G. Sundaramoorthi, and A. Yezzi, “Accelerated variational PDEs for efficient solution of regularized inversion problems,” J. Math. Imaging Vis. 62(1), 10–36 (2020). [CrossRef]
93. E. J. Candès, X. Li, and M. Soltanolkotabi, “Phase retrieval via Wirtinger flow: theory and algorithms,” IEEE Trans. Inf. Theory 61(4), 1985–2007 (2015). [CrossRef]
94. R. Xu, M. Soltanolkotabi, J. P. Haldar, W. Unglaub, J. Zusman, A. F. J. Levi, and R. M. Leahy, “Accelerated Wirtinger Flow: A fast algorithm for ptychography,” arXiv, 1806.05546 (2018). [CrossRef]
95. S. Boyd and L. Vandenberghe, Introduction to Applied Linear Algebra (Cambridge University Press, 2019).
96. R. Bracewell, Fourier Analysis and Imaging, 1st ed. (Springer, 2003).
97. A. Chakrabarti and T. Zickler, “Image Restoration with Signal-dependent Camera Noise,” (2012).
98. E. Chouzenoux, A. Jezierska, J. C. Pesquet, and H. Talbot, “A convex approach for image restoration with exact Poisson-Gaussian likelihood,” SIAM J. Imaging Sci. 8(4), 2662–2682 (2015). [CrossRef]
99. H. Ikoma, M. Broxton, T. Kudo, and G. Wetzstein, “A convex 3D deconvolution algorithm for low photon count fluorescence imaging,” Sci. Rep. 8(1), 11489 (2018). [CrossRef]
100. M. Mäkitalo and A. Foi, “Optimal inversion of the Anscombe transformation in low-count Poisson image denoising,” IEEE Trans. Image Process. 20(1), 99–109 (2011). [CrossRef]
101. F. Murtagh, J.-L. Starck, and A. Bijaoui, “Image restoration with noise suppression using a multiresolution support,” Astron. Astrophys. Suppl. Ser. 112, 179 (1995).
102. L. Mandel, “Fluctuations of photon beams: the distribution of the photo-electrons,” Proc. Phys. Soc. 74(3), 233–243 (1959). [CrossRef]
103. J. W. Goodman, Statistical Optics (Wiley, New York, 2015), 2nd ed.
104. F. Soulez, É. Thiébaut, A. Schutz, A. Ferrari, F. Courbin, and M. Unser, “Proximity operators for phase retrieval,” Appl. Opt. 55(26), 7412 (2016). [CrossRef]
105. J. M. Rodenburg and H. M. L. Faulkner, “A phase retrieval algorithm for shifting illumination,” Appl. Phys. Lett. 85(20), 4795–4797 (2004). [CrossRef]
106. J. Qian, C. Yang, A. Schirotzek, F. Maia, and S. Marchesini, “Efficient algorithms for ptychographic phase retrieval,” Inverse Problems and Applications, Contemp. Math. 615, 261–280 (2014).
107. L. Loetgering, M. Rose, K. Keskinbora, M. Baluktsian, G. Dogan, U. Sanli, I. Bykova, M. Weigand, G. Schütz, and T. Wilhein, “Correction of axial position uncertainty and systematic detector errors in ptychographic diffraction imaging,” Opt. Eng. 57(08), 1 (2018). [CrossRef]
108. R. Röhrich, A. F. Koenderink, S. Witte, and L. Loetgering, “Spatial coherence control and analysis via micromirror-based mixed-state ptychography,” New J. Phys. 23(5), 053016 (2021). [CrossRef]
109. P. Li, T. Edo, D. Batey, J. Rodenburg, and A. Maiden, “Breaking ambiguities in mixed state ptychography,” Opt. Express 24(8), 9038 (2016). [CrossRef]
110. S. Dong, R. Shiradkar, P. Nanda, and G. Zheng, “Spectral multiplexing and coherent-state decomposition in Fourier ptychographic imaging,” Biomed. Opt. Express 5(6), 1757–1767 (2014). [CrossRef]
111. B. Zhang, D. F. Gardner, M. H. Seaberg, E. R. Shanblatt, C. L. Porter, R. Karl, C. A. Mancuso, H. C. Kapteyn, M. M. Murnane, and D. E. Adams, “Ptychographic hyperspectral spectromicroscopy with an extreme ultraviolet high harmonic comb,” Opt. Express 24(16), 18745 (2016). [CrossRef]
112. P. Song, R. Wang, J. Zhu, T. Wang, Z. Bian, Z. Zhang, K. Hoshino, M. Murphy, S. Jiang, and C. Guo, “Super-resolved multispectral lensless microscopy via angle-tilted, wavelength-multiplexed ptychographic modulation,” Opt. Lett. 45(13), 3486–3489 (2020). [CrossRef]
113. D. Goldberger, D. Schmidt, J. Barolak, B. Ivanic, C. G. Durfee, and D. E. Adams, “Spatiospectral characterization of ultrafast pulse-beams by multiplexed broadband ptychography,” Opt. Express 29(20), 32474–32490 (2021). [CrossRef]
114. Y. Yao, Y. Jiang, J. Klug, Y. Nashed, C. Roehrig, C. Preissner, F. Marin, M. Wojcik, O. Cossairt, and Z. Cai, “Broadband X-ray ptychography using multi-wavelength algorithm,” J. Synchrotron Radiat. 28(1), 309–317 (2021). [CrossRef]
115. M. Du, X. Liu, A. Pelekanidis, F. Zhang, L. Loetgering, P. Konold, C. L. Porter, P. Smorenburg, K. S. Eikema, and S. Witte, “High-resolution wavefront sensing and aberration analysis of multi-spectral extreme ultraviolet beams,” Optica 10(2), 255–263 (2023). [CrossRef]
116. J. M. Cowley and A. F. Moodie, “The scattering of electrons by atoms and crystals. I. A new theoretical approach,” Acta Crystallogr. 10(10), 609–619 (1957). [CrossRef]
117. T. M. Godden, R. Suman, M. J. Humphry, J. M. Rodenburg, and A. M. Maiden, “Ptychographic microscope for three-dimensional imaging,” Opt. Express 22(10), 12513 (2014). [CrossRef]
118. E. H. R. Tsai, I. Usov, A. Diaz, A. Menzel, and M. Guizar-Sicairos, “X-ray ptychography with extended depth of field,” Opt. Express 24(25), 29089 (2016). [CrossRef]
119. L. Tian and L. Waller, “3D intensity and phase imaging from light field measurements in an LED array microscope,” Optica 2(2), 104 (2015). [CrossRef]
120. T. Aidukas, “Next generation Fourier ptychographic microscopy: computational and experimental techniques,” Ph.D. thesis, University of Glasgow (2021).
121. W. Xu, H. Lin, H. Wang, and F. Zhang, “Super-resolution near-field ptychography,” Opt. Express 28(4), 5164–5178 (2020). [CrossRef]
122. D. J. Batey, T. B. Edo, C. Rau, U. Wagner, Z. D. Pešić, T. A. Waigh, and J. M. Rodenburg, “Reciprocal-space up-sampling from real-space oversampling in x-ray ptychography,” Phys. Rev. A 89(4), 043812 (2014). [CrossRef]
123. M. Beckers, T. Senkbeil, T. Gorniak, K. Giewekemeyer, T. Salditt, and A. Rosenhahn, “Drift correction in ptychographic diffractive imaging,” Ultramicroscopy 126, 44–47 (2013). [CrossRef]
124. P. Hessing, B. Pfau, E. Guehrs, M. Schneider, L. Shemilt, J. Geilhufe, and S. Eisebitt, “Holography-guided ptychography with soft X-rays,” Opt. Express 24(2), 1840 (2016). [CrossRef]
125. K. Patorski, “Fraunhofer diffraction patterns of tilted planar objects,” Opt. Acta 30(5), 673–679 (1983). [CrossRef]
126. E. S. R. Fonseca, P. T. Fiadeiro, M. Pereira, and A. Pinheiro, “Comparative analysis of autofocus functions in digital in-line phase-shifting holography,” Appl. Opt. 55(27), 7663–7674 (2016). [CrossRef]
127. H. Zhang, Z. Bian, S. Jiang, J. Liu, P. Song, and G. Zheng, “Field-portable quantitative lensless microscopy based on translated speckle illumination and sub-sampled ptychographic phase retrieval,” Opt. Lett. 44(8), 1976–1979 (2019). [CrossRef]
128. P. Thibault, M. Dierolf, O. Bunk, A. Menzel, and F. Pfeiffer, “Probe retrieval in ptychographic coherent diffractive imaging,” Ultramicroscopy 109(4), 338–343 (2009). [CrossRef]
129. X. Huang, H. Yan, R. Harder, Y. Hwu, I. K. Robinson, and Y. S. Chu, “Optimization of overlap uniformness for ptychography,” Opt. Express 22(10), 12634 (2014). [CrossRef]
130. G. A. Croes, “A Method for Solving Traveling-Salesman Problems,” Oper. Res. 6, 791–812 (1958). [CrossRef]
131. M. Guizar-Sicairos, I. Johnson, A. Diaz, M. Holler, P. Karvinen, H.-C. Stadler, R. Dinapoli, O. Bunk, and A. Menzel, “High-throughput ptychography using Eiger-scanning X-ray nano-imaging of extended regions,” Opt. Express 22(12), 14859 (2014). [CrossRef]
132. G. Cleuziou, “An extended version of the k-means method for overlapping clustering,” in 2008 19th International Conference on Pattern Recognition, (IEEE, 2008), pp. 1–4.
133. J. J. Whang, I. S. Dhillon, and D. F. Gleich, “Non-exhaustive, overlapping k-means,” in Proceedings of the 2015 SIAM international conference on data mining, (SIAM, 2015), pp. 936–944.
134. T. Aidukas, P. C. Konda, J. M. Taylor, and A. R. Harvey, “Multi-camera Fourier ptychographic microscopy,” in Computational Optical Sensing and Imaging, (Optical Society of America, 2019), paper CW3A.4.
135. L. Loetgering, T. Aidukas, M. Du, D. B. Flaes, D. Penagos, M. Rose, A. Pelekanidis, A. De Beurs, W. Eschen, J. Hess, et al., “ptyLab: a cross-platform inverse modeling toolbox for conventional and Fourier ptychography,” in Computational Optical Sensing and Imaging, (Optical Society of America, 2021), paper CW6B.3.
136. T. Aidukas, L. Loetgering, and A. R. Harvey, “Addressing phase-curvature in Fourier ptychography,” Opt. Express 30(13), 22421–22434 (2022). [CrossRef]
137. T. Aidukas, P. C. Konda, and A. R. Harvey, “High-speed multi-objective Fourier ptychographic microscopy,” Opt. Express 30(16), 29189–29205 (2022). [CrossRef]
138. W. Van Aarle, W. J. Palenstijn, J. Cant, E. Janssens, F. Bleichrodt, A. Dabravolski, J. De Beenhouwer, K. J. Batenburg, and J. Sijbers, “Fast and flexible X-ray tomography using the ASTRA toolbox,” Opt. Express 24(22), 25129–25147 (2016). [CrossRef]
139. A. Biguri, R. Lindroos, R. Bryll, H. Towsyfyan, H. Deyhle, I. Khalil Harrane, R. Boardman, M. Mavrogordato, M. Dosanjh, and S. Hancock, “Arbitrarily large tomography with iterative algorithms on multiple GPUs using the TIGRE toolbox,” J. Parallel Distrib. Comput. 146, 52–63 (2020). [CrossRef]
140. M. Odstrcil, M. Holler, J. Raabe, A. Sepe, X. Sheng, S. Vignolini, C. G. Schroer, and M. Guizar-Sicairos, “Ab initio nonrigid X-ray nanotomography,” Nat. Commun. 10(1), 2600 (2019). [CrossRef]
141. T. E. Oliphant, A guide to NumPy, vol. 1 (Trelgol Publishing USA, 2006).
142. J. D. Hunter, “Matplotlib: A 2D graphics environment,” Comput. Sci. Eng. 9(3), 90–95 (2007). [CrossRef]
143. “h5py,” https://www.h5py.org/.
144. P. Virtanen, R. Gommers, T. E. Oliphant, M. Haberland, T. Reddy, D. Cournapeau, E. Burovski, P. Peterson, W. Weckesser, and J. Bright, “SciPy 1.0: fundamental algorithms for scientific computing in Python,” Nat. Methods 17(3), 261–272 (2020). [CrossRef]
145. S. van der Walt, J. L. Schönberger, J. Nunez-Iglesias, F. Boulogne, J. D. Warner, N. Yager, E. Gouillart, and T. Yu, “scikit-image: image processing in Python,” PeerJ 2, e453 (2014). [CrossRef]
146. C. O. da Costa-Luis, “tqdm: a fast, extensible progress meter for Python and CLI,” J. Open Source Softw. 4(37), 1277 (2019). [CrossRef]
147. J. Bezanson, A. Edelman, S. Karpinski, and V. B. Shah, “Julia: A fresh approach to numerical computing,” SIAM Rev. 59(1), 65–98 (2017). [CrossRef]
148. T. Besard, C. Foket, and B. De Sutter, “Effective Extensible Programming: Unleashing Julia on GPUs,” (2018).
149. “Cuda.jl,” https://github.com/JuliaGPU/CUDA.jl.
150. M. Frigo and S. G. Johnson, “FFTW: An adaptive software architecture for the FFT,” in Proceedings of the 1998 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP’98 (Cat. No. 98CH36181), vol. 3 (IEEE, 1998), pp. 1381–1384.
151. “FFTW.jl,” https://github.com/JuliaMath/FFTW.jl.
152. “LLVM.jl,” https://github.com/maleadt/LLVM.jl.