
Amp-vortex edge-camera: a lensless multi-modality imaging system with edge enhancement

Open Access

Abstract

We demonstrate a lensless imaging system with edge-enhanced imaging, constructed with a Fresnel zone aperture (FZA) mask placed 3 mm away from a CMOS sensor. We propose vortex back-propagation (vortex-BP) and amplitude vortex-BP algorithms for the FZA-based lensless imaging system to remove noise and achieve fast reconstruction with high-contrast edge enhancement. Directionally controlled anisotropic edge enhancement can be achieved with our proposed superimposed vortex-BP algorithm. With different reconstruction algorithms, the proposed amp-vortex edge-camera can achieve 2D bright-field imaging, isotropic, and directionally controllable anisotropic edge-enhanced imaging under incoherent illumination, from a single-shot captured hologram. The effect of edge detection is the same as that of optical edge detection, i.e., a re-distribution of light energy. Noise-free in-focus edge detection can be achieved by back-propagation without a de-noising algorithm, which is an advantage over other lensless imaging technologies. The technique is expected to find wide use in autonomous driving, artificial intelligence recognition in consumer electronics, etc.

© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Edge information represents the boundary and the differential of an object or image; edge detection is therefore a fundamental analytical tool in image processing and optical computing. It greatly reduces redundant information while preserving the important features of the object, so it plays an important role in various fields, especially optical computing [1], intelligent recognition [2–4], biological imaging [5–8], astronomical observation [9,10], and fingerprint recognition [11].

Digital image processing and optical processing methods are often used to achieve edge enhancement. Digital image processing techniques are limited to the image intensity information. For an image degraded by compression or of low resolution, such as a lensless measurement, it is difficult to obtain accurate and clear edge information by digital image processing. Optical processing can perform edge detection in parallel and without energy loss.

Many optical systems have been developed to achieve edge enhancement, including differential interference contrast (DIC) [12], the spin Hall effect [13,14], and surface plasmon devices [15,16]. These methods can work over a wide bandwidth with incoherent illumination, but they require additional optical elements for imaging or plasmon coupling, or polarization optical systems.

After Davis et al. proposed the theory of edge enhancement with the Hilbert transform, the spiral phase plate (SPP) [17], also called a vortex phase plate, became a popular tool to generate vortex beams and realize the radial Hilbert transform. The vortex phase plate extends the one-dimensional Hilbert transform to radial space and ensures that the phase difference in any radial direction is π, so isotropic edge enhancement can be obtained. Recently, various vortex filters have been proposed to improve the contrast or obtain selective edge enhancement, such as the Laguerre-Gaussian spatial filter (LGSF) [18], Bessel-like amplitude filter (BLSF) [19], linear phase-shifting spiral plate [20], Airy spiral phase filter [21], and Sinc filter [22]. However, these vortex optical methods usually depend on the typical 4-f imaging system, with the vortex plate placed at the Fourier plane, and are restricted by strict lens imaging conditions [17].

Because of their high design freedom, metasurfaces play an important role in compact edge detection optical systems. A single flat nanophotonic element or metamaterial has been inserted into a traditional imaging system to achieve free-space edge detection [23], but the system is still quite bulky and hard to operate over broadband. A hyperbolic phase can be added to a spiral phase to form a metasurface phase profile and obtain a compact edge detection kernel without a bulky 4-f system [24]. The impulse response of a metasurface has also been directly designed for compact edge detection imaging [25]. This has even been extended to multi-modality imaging, by employing specially designed metalenses and different spin states of the incident light [26], or by designing a switchable metasurface [25]. Although such imaging systems can be compact and dielectric metasurfaces can be designed for broadband illumination, multi-modality imaging needs a specially designed mechanical switcher, and anisotropic edge detection needs specially designed nano-optic elements. To date, there is no reported method that combines broadband incoherent illumination, transmission geometry, and multi-modality operation with common imaging, edge detection, and directionally controllable anisotropic edge detection.

The development of computational imaging [2,27–31] is driving the miniaturization of cameras and shows the potential of multi-modality imaging. Innovative imaging systems are designed to simplify the camera and expand its function by adding computational power to image processing [32,33]. In the past few years, lensless imaging has become a research hotspot because it is lightweight and easy to implement [34,35]. Recently, mask-based lensless imaging systems have been demonstrated to achieve light weight by replacing the lens in the camera with an encoded mask [36]. Such a system obtains an image through computational reconstruction, which can improve the field of view and reduce the cost of the camera. Various mask cameras have been proposed, such as FlatCam [37,38], the Fresnel zone aperture (FZA) camera [39–42], diffractive-grating cameras [43], PhlatCam [36,44], and DiffuserCam [33,45]. Various optimization algorithms have been adopted to solve the inverse problem [41,46–48], but they take a long time to converge and are difficult to use for real-time imaging. Although lensless imaging has great potential, it is still difficult to obtain the edge information of the image by digital image processing, due to either the severe noise of the low-resolution reconstructed image or the long reconstruction time.

Because of the relationship between a FZA and a point-source hologram, the image pattern formed by a FZA-camera is the same as an in-line hologram, even under incoherent illumination [39]. The twin-image noise in FZA-camera images is dispersed with low energy, so it can potentially be suppressed when a Hilbert transform is performed by a vortex filter. In this paper, we implement a computational vortex phase filter in the reconstruction of FZA-camera images to obtain edge enhancement. Noise preserved in the edge information may cause a heterogeneous energy distribution of the edges. Previous work with amplitude vortex phase filters inspires our approach to removing noise from the edge detection results.

In this paper, we propose a multi-modality lensless imaging system, named the amp-vortex edge-camera, that can achieve 2D bright-field imaging, isotropic, and directionally controllable anisotropic edge-enhanced imaging with incoherent illumination from a single-shot captured hologram. The proposed camera is composed of a FZA mask and an image sensor, as shown in Fig. 1(a). A FZA camera with compressive sensing (CS) has been proven for imaging [39] (Fig. 1(b)), but it is difficult to reconstruct in real time. Many works on the FZA camera suggest its great potential in re-focusing and super-resolved imaging [41,42,49,50]. We propose to add a vortex phase filter into back-propagation (BP), named vortex-BP, to obtain edge-enhanced images (Fig. 1(c)). To further remove the twin-image noise, we propose to use an amplitude vortex filter with BP, named amplitude vortex-BP (Fig. 1(d)). It proves efficient at suppressing the twin image and other noise in edge enhancement reconstruction, yielding homogeneous edge patterns. Meanwhile, we propose an amplitude superposed vortex filter for directional edge enhancement, which rapidly reconstructs directionally selective edge enhancement with superimposed vortex filters (Fig. 1(e)). The amp-vortex edge-camera can achieve both common imaging and edge-enhanced imaging with incoherent illumination. All of these reconstruction algorithms dispense with iteration and are therefore suitable for real-time reconstruction.


Fig. 1. Flowchart showing the imaging procedure of amp-vortex edge-camera. a. Schematic diagram of the proposed amp-vortex edge-camera. It can achieve multi-modality imaging with b. common imaging by compressive sensing algorithm, c. edge enhancement imaging by using vortex-BP algorithm, d. isotropic edge-enhanced imaging by using amp-vortex BP, and e. directional controllable anisotropic imaging by using superimposed vortex-BP, with a single-shot hologram.


The amp-vortex edge-camera proposed in this paper is of great significance for both optical edge detection and lensless imaging. First, the effect of edge detection by the amp-vortex edge-camera is the same as that of optical edge detection, i.e., a re-distribution of light energy, and is not affected by the original image. The amp-vortex edge-camera can achieve multiple modalities, including bright-field imaging, isotropic edge detection, and directionally controllable anisotropic edge detection, from a single-shot hologram with incoherent illumination. Second, the vortex-BP, amplitude vortex-BP, and superimposed vortex-BP reconstruction algorithms proposed in this paper eliminate the noise present in BP and achieve noise-free reconstruction. They need no extra noise-elimination algorithm, so the edge detection process is more efficient than bright-field imaging. These advantages may open new doors for lensless cameras in areas such as autonomous driving, artificial intelligence recognition in consumer electronics, and VR/AR.

2. Methods

2.1 Vortex-BP reconstruction

The expression of a vortex phase filter in the frequency domain can be expressed as:

$${S_1}(r,\theta ) = \exp (il\theta )circ\left( {\frac{r}{R}} \right)$$
where $(r,\theta )$ are the polar coordinates in the frequency domain, R is the radius of the filter aperture, and l is the topological factor of the vortex filter. Changing the topological factor l breaks the radial symmetry of the filter and thus changes the edge enhancement reconstruction result.
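The filter of Eq. (1) is straightforward to build numerically. Below is a minimal NumPy sketch; the grid size, normalized frequency axis, and function name are our own illustrative choices, not from the paper:

```python
import numpy as np

def vortex_filter(n, l=1, R=0.4):
    """S1(r, theta) = exp(i*l*theta) * circ(r/R), sampled on an n x n
    frequency grid with normalized frequencies in [-0.5, 0.5)."""
    f = np.fft.fftshift(np.fft.fftfreq(n))      # centered frequency axis
    FX, FY = np.meshgrid(f, f)
    r = np.hypot(FX, FY)                        # radial frequency
    theta = np.arctan2(FY, FX)                  # azimuthal angle
    return np.exp(1j * l * theta) * (r <= R)    # circ() as a hard aperture

S1 = vortex_filter(256, l=1, R=0.4)
```

Inside the aperture the filter is pure phase (unit modulus), so it redistributes spectral energy without attenuating it.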

The modulation by a vortex phase filter ${S_1}(r,\theta )$ in the frequency domain is given by:

$$\mathrm{{\cal F}}{O_{edge}} = {S_1}\mathrm{{\cal F}}O(x,y)$$
where $\mathrm{{\cal F}}$ is the Fourier transform operator and $O(x,y)$ is the image restored on the sensor plane. The point spread function (PSF) of the vortex phase filter in the spatial domain is the inverse Fourier transform of ${S_1}(r,\theta )$:
$${s_1}(\rho ,\varphi ) = - i\exp (i\varphi )\frac{{\pi R}}{{2\rho }}[{{J_1}(x){H_0}(x) - {J_0}(x){H_1}(x)} ]$$
where $(\rho ,\varphi )$ represent the polar coordinates on the object plane, ${J_0}$ and ${J_1}$ are the Bessel functions of zero and first order, ${H_0}$ and ${H_1}$ are the Struve functions of zero and first order, respectively, and $x = \frac{{2\pi R\rho }}{{\lambda f}}$.

In the FZA lensless system, the captured hologram is the superposition of all the FZA shadows. Thus, the basic forward model can be expressed as the convolution of the magnified object with the FZA shadow. Disregarding the constant term, a Gabor zone plate (GZP) can be decomposed as the superposition of a pair of conjugate phase masks, i.e., mask $h(x,y)$ and mask ${h^\ast }(x,y)$, where $h(x,y) = \exp [{i\pi ({x^2} + {y^2})/r_1^2} ]$. The lensless imaging process can be formulated as:

$$\begin{aligned} {I_{holo}}(x,y) &= O(x,y) \ast {T_G}(x,y) + e(x,y)\\& = \frac{1}{2}O(x,y) + \frac{1}{4}O(x,y) \ast h(x,y) + \frac{1}{4}O(x,y) \ast {h^\ast }(x,y) + e(x,y)\\& = C + \frac{1}{4}U(x,y) + \frac{1}{4}{U^\ast }(x,y) + e(x,y) \end{aligned}$$
where ${\ast} $ denotes convolution, $O(x,y)$ is the scaled object on the sensor plane, and $e(x,y)$ is a random noise term. $C = \frac{1}{2}O(x,y)$ is a constant background term, which can be removed by filtering out the direct-current component. When the virtual wavelength $\lambda $ and virtual object distance d satisfy $r_1^2 = \lambda d$, $h(x,y)$ is consistent with the Fresnel kernel in traditional diffraction. $U(x,y)$ can be regarded as the diffracted wavefront of Fresnel propagation at $\lambda $ and d, and ${U^\ast }(x,y)$ is its conjugate wave, which yields a defocused image, the so-called twin image. The forward model can be rewritten as:
$${I_{holo}} = \frac{1}{4}{\mathrm{{\cal F}}^{ - 1}}H\mathrm{{\cal F}}O + \frac{1}{4}{\mathrm{{\cal F}}^{ - 1}}{H^\ast }\mathrm{{\cal F}}O + e$$

$\mathrm{{\cal F}}$ and ${\mathrm{{\cal F}}^{ - 1}}$ are the Fourier transform operator and the inverse Fourier transform operator, respectively, and H denotes the transfer function. The vortex filter works in the frequency domain, so the hologram is converted to the frequency domain and multiplied by the vortex filter:

$$\begin{aligned} {S_1}\mathrm{{\cal F}}{I_{holo}} &= \frac{1}{4}{S_1}H\mathrm{{\cal F}}O + \frac{1}{4}{S_1}{H^\ast }\mathrm{{\cal F}}O + E\\& = \frac{1}{4}H\mathrm{{\cal F}}{O_{edge}} + \frac{1}{4}{H^\ast }\mathrm{{\cal F}}{O_{edge}} + E\\& = \frac{1}{2}{\textrm{Re}} \{H \}\mathrm{{\cal F}}{O_{edge}} + E \end{aligned}$$

${O_{edge}}$ is the edge-enhanced version of the original image O, and E is the error term in the frequency domain. Since ${I_{holo}}$ is real-valued and H is centrally symmetric, we obtain the following solution for the edge reconstruction:

$${O_{edge}} = 2{\mathrm{{\cal F}}^{ - 1}}{S_1}{\textrm{Re}} \left\{ {\frac{{\mathrm{{\cal F}}{I_{holo}}}}{H}} \right\} + e$$

The final edge enhancement result is the intensity ${|{{O_{edge}}} |^2}$. Usually, the vortex filter is applied in a 4-f imaging system involving two lenses at strictly fixed relative positions. In our case, it is converted into a virtual computational filter in BP.
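The whole pipeline, Eq. (4) forward and Eq. (7) backward, amounts to a few FFTs. The sketch below simulates a GZP hologram and applies vortex-BP; the pixel-unit Fresnel transfer function, the aperture radius, and the zeroing of the undefined DC azimuth are our assumptions, since the paper gives only the operator form:

```python
import numpy as np

def gzp(n, r1):
    """Gabor zone plate T_G = 1/2 + 1/2*cos(pi*r^2/r1^2), r1 in pixels."""
    x = np.arange(n) - n // 2
    X, Y = np.meshgrid(x, x)
    return 0.5 + 0.5 * np.cos(np.pi * (X**2 + Y**2) / r1**2)

def capture(obj, r1):
    """Noise-free forward model of Eq. (4): I_holo = O * T_G (circular convolution)."""
    kernel = np.fft.ifftshift(gzp(obj.shape[0], r1))   # zone-plate center to (0, 0)
    return np.real(np.fft.ifft2(np.fft.fft2(obj) * np.fft.fft2(kernel)))

def vortex_bp(holo, r1, l=1, R=0.4):
    """Vortex-BP of Eq. (7): O_edge = 2 * F^{-1} S1 Re{ F I_holo / H }.
    H is the Fresnel transfer function with r1^2 = lambda*d in pixel units;
    |H| = 1, so the division is numerically stable."""
    n = holo.shape[0]
    f = np.fft.fftfreq(n)                  # unshifted grid matching fft2 layout
    FX, FY = np.meshgrid(f, f)
    H = np.exp(-1j * np.pi * r1**2 * (FX**2 + FY**2))
    rho = np.hypot(FX, FY)
    S1 = np.exp(1j * l * np.arctan2(FY, FX)) * (rho <= R)
    S1[0, 0] = 0.0                         # azimuth undefined at DC; zeroing it removes the flat background
    o_edge = np.fft.ifft2(2 * S1 * np.real(np.fft.fft2(holo) / H))
    return np.abs(o_edge) ** 2
```

A flat scene reconstructs to (numerically) zero because the vortex filter blocks the DC term, which is exactly the edge-detection behavior described above.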

Although vortex-BP can achieve edge enhancement in the FZA lensless camera, the reconstruction is disturbed by the twin image. To solve this problem, amplitude terms should be integrated into the edge enhancement reconstruction to obtain clean edge information. For natural images, the inherent defocused twin image in lensless reconstruction is a diffuse pattern, so it can be classified as low-frequency noise. Amplitude vortex filters, which perform complex-amplitude modulation, therefore inspire our approach.

2.2 Isotropic edge enhancement by amplitude vortex-BP

Amplitude vortex phase filters are bandpass filters in amplitude that can filter out low-frequency and some high-frequency noise. In lensless imaging, the noise at zero and low frequencies is mainly the twin image, while at high frequencies it is mainly background noise. Therefore, we adopt four amplitude vortex filters, Laguerre-Gaussian (LG), Bessel, Airy, and Sinc, to enhance edge reconstruction performance. The transmission function of the LG filter is:

$$\begin{aligned} {S_{LG}}(r,\theta ) &= \left( {\frac{r}{{{\omega_1}}}} \right)\exp \left[ { - {{\left( {\frac{r}{{{\omega_1}}}} \right)}^2}} \right]circ\left( {\frac{r}{R}} \right)\exp (il\theta )\\& = {A_{LG}}\exp (il\theta )circ\left( {\frac{r}{R}} \right) \end{aligned}$$
where ${A_{LG}} = \left( {\frac{r}{{{\omega_1}}}} \right)\exp \left[ { - {{\left( {\frac{r}{{{\omega_1}}}} \right)}^2}} \right]$ is the LG amplitude term and ${\omega _1}$ is the parameter that controls the position of the maximum amplitude. In the same way, each amplitude vortex phase filter has the form of an amplitude term multiplying the vortex filter term, $S(r,\theta ) = A\exp (il\theta )circ\left( {\frac{r}{R}} \right)$. Thus, the amplitude term of the Bessel filter is ${A_{Bessel}} = \frac{{{J_2}(\alpha r)}}{r}$, where ${J_2}(\alpha r)$ is the Bessel function of second order; the amplitude term of the Airy filter is ${A_{Airy}} = Ai\left( {\frac{{{r_0} - r}}{{{\omega_0}}}} \right)\exp \left( {\frac{{{r_0} - r}}{{{\omega_0}}}} \right)$, where $Ai()$ is the Airy function, ${\omega _0}$ is the adjustment parameter, and ${r_0}$ is the main-ring radius of the Airy function; and the amplitude term of the Sinc filter is ${A_{Sinc}} = \mathrm{sinc}(\alpha r)\sin (\alpha \pi r)$, where $\mathrm{sinc}()$ is the sinc function and $\alpha$ is the parameter. Since the size of the bandpass vortex filter is adjusted by these parameters, they are optimized to guarantee accurate edge enhancement.
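The four amplitude terms can be sketched as follows. The use of `scipy.special` for the Bessel and Airy functions and the removable-singularity handling of $J_2(\alpha r)/r$ at $r = 0$ are our implementation choices:

```python
import numpy as np
from scipy.special import jv, airy

def lg_amp(r, w1):
    """Laguerre-Gaussian term A_LG = (r/w1) * exp(-(r/w1)^2)."""
    return (r / w1) * np.exp(-(r / w1) ** 2)

def bessel_amp(r, alpha):
    """Bessel term A_Bessel = J2(alpha*r) / r, with the removable zero at r = 0."""
    r = np.asarray(r, dtype=float)
    out = np.zeros_like(r)
    np.divide(jv(2, alpha * r), r, out=out, where=r > 0)
    return out

def airy_amp(r, r0, w0):
    """Airy term A_Airy = Ai((r0 - r)/w0) * exp((r0 - r)/w0)."""
    t = (r0 - r) / w0
    return airy(t)[0] * np.exp(t)        # airy() returns (Ai, Ai', Bi, Bi')

def sinc_amp(r, alpha):
    """Sinc term A_Sinc = sinc(alpha*r) * sin(alpha*pi*r).
    Note: np.sinc(x) is the normalized sinc, sin(pi*x)/(pi*x)."""
    return np.sinc(alpha * r) * np.sin(alpha * np.pi * r)
```

Multiplying any of these radial profiles by $\exp(il\theta)\,circ(r/R)$ on the frequency grid yields the corresponding amplitude vortex filter.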

2.3 Anisotropic edge enhancement by superimposed vortex-BP

By superimposing two vortex phase filters with different phases in BP [51], anisotropic edge enhancement can be achieved with the amp-vortex edge-camera. The transmittance function of the superimposed vortex phase filter is given by:

$${S_2} = \exp (i{l_1}(\theta + \beta )) + c\exp ( - i{l_2}(\theta + \beta ))$$
where ${l_1}$ and ${l_2}$ are the two topological factors, $\beta $ is the initial phase, and c is the weighting factor, which determines the weight ratio of the positive and negative vortices and controls the edge enhancement power. This BP algorithm is named superimposed vortex-BP.
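Equation (9) is easy to sample on the same frequency grid as the single vortex filter. A sketch, with grid and aperture conventions that are again our own:

```python
import numpy as np

def superimposed_filter(n, l1=1, l2=1, beta=0.0, c=1.0, R=0.5):
    """Eq. (9): S2 = exp(i*l1*(theta+beta)) + c*exp(-i*l2*(theta+beta)),
    restricted to the aperture circ(r/R)."""
    f = np.fft.fftshift(np.fft.fftfreq(n))
    FX, FY = np.meshgrid(f, f)
    theta = np.arctan2(FY, FX)
    r = np.hypot(FX, FY)
    S2 = np.exp(1j * l1 * (theta + beta)) + c * np.exp(-1j * l2 * (theta + beta))
    return S2 * (r <= R)
```

With $c = 1$ and $l_1 = l_2 = 1$ the two conjugate vortices combine into the real mask $2\cos(\theta + \beta)$; an asymmetric weight $c \ne 1$ leaves a genuinely complex filter, which is what tunes the enhancement power.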

According to the PSF amplitude distribution (Fig. 2(a)), the vortex phase filter is radially symmetric and the main ring is continuously and uniformly distributed, thus the edge enhancement of the target object is isotropic. The superimposed vortex is composed of two opposite vortex phase filters [52], the edge enhancement with it can be anisotropic but still radially symmetric.


Fig. 2. PSF of (a) a single vortex phase filter, and (b) a superimposed vortex phase filter.


When $c = 1$ and ${l_1} = {l_2} = 1$, the transmission function in Eq. (9) reduces to:

$${S_2} = \exp (i(\theta + \beta )) + \exp ( - i(\theta + \beta )) = 2\cos (\theta + \beta )$$

By inverse Fourier transform, the PSF of the superimposed vortex filter is:

$${s_2}(\rho ,\varphi ) = - i\frac{{\pi R}}{\rho }[{{J_1}(x){H_0}(x) - {J_0}(x){H_1}(x)} ]\sin (\varphi + \beta )$$

The PSFs of the vortex filter and the superimposed vortex filter are simulated in Fig. 2 for comparison. The figures show that the vortex filter is isotropic, while the superimposed vortex is anisotropic because of the $\sin (\varphi + \beta )$ term. This results in directional edge enhancement, with the orientation controlled by the initial phase $\beta $.
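This anisotropy can be checked numerically. The sketch below builds the $c = 1$ filter and inspects its PSF; grid size, aperture, and the zeroing of the undefined DC azimuth are illustrative assumptions, and which axis is suppressed depends on the angle convention, so we only verify that the suppressed direction rotates with $\beta$:

```python
import numpy as np

def superimposed_psf(n, beta=0.0, R=0.4):
    """Numerical PSF of the c = 1 superimposed filter 2*cos(theta + beta)."""
    f = np.fft.fftfreq(n)                       # unshifted grid matching ifft2
    FX, FY = np.meshgrid(f, f)
    theta = np.arctan2(FY, FX)
    rho = np.hypot(FX, FY)
    S2 = 2.0 * np.cos(theta + beta) * (rho <= R)
    S2[0, 0] = 0.0                              # azimuth undefined at DC
    return np.fft.fftshift(np.fft.ifft2(S2))    # centered spatial PSF

psf0 = superimposed_psf(256, beta=0.0)
psf90 = superimposed_psf(256, beta=np.pi / 2)
```

The response vanishes along one axis and peaks along the orthogonal one; increasing $\beta$ by $\pi/2$ rotates the suppressed direction by 90°, which is the directional control exploited in the experiments.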

The amplitude vortex terms are applied and the corresponding filter functions are:

$${S_{LG2}} = \frac{r}{{{\omega _1}}}\exp \left[ { - {{\left( {\frac{r}{{{\omega_1}}}} \right)}^2}} \right][{c\exp (i\theta ) + \exp ( - i(\theta + \beta ))} ]$$
$${S_{Bessel2}} = \left[ {\frac{{{J_2}(\alpha r)}}{r}} \right][{c\exp (i\theta ) + \exp ( - i(\theta + \beta ))} ]$$
$${S_{Airy2}} = Ai(\frac{{{r_0} - r}}{{{\omega _0}}})[{c\exp (i\theta ) + \exp ( - i(\theta + \beta ))} ]$$
$${S_{Sinc2}} = \mathrm{sinc}(\alpha r)\sin (\alpha \pi r)[{c\exp (i\theta ) + \exp ( - i(\theta + \beta ))} ]$$

The amplitude vortex terms are independent of the azimuthal angle and symmetric about the center. Therefore, they do not affect the radial symmetry of the PSF and can be used in the superposition filter to enhance performance.

3. Experimental results

To evaluate the performance of the amp-vortex edge-camera, we construct a camera prototype using a QHY163M CMOS sensor with a 3.78 µm pixel pitch and an FZA mask fixed on the sensor with tape (Fig. 1(c)). The transmission function of the FZA mask is:

$${T_F}(r) = \frac{1}{2} + \frac{1}{2}{\mathop{\rm sgn}} \left[ {\cos \left( {\frac{{\pi {r^2}}}{{r_1^2}}} \right)} \right]$$
where ${r_1}$ is the radius of the innermost zone. The mask has a radius of 4.56 mm, and ${r_1}$ is chosen as 0.32 mm. It is fabricated on a soda-lime glass substrate with a thickness of 2 mm using a laser direct writing technique. The mask is shown in Fig. 1(a). For convenience of derivation, the continuous Gabor zone plate (GZP) is used in place of the binary FZA:
$${T_G}(r) = \frac{1}{2} + \frac{1}{2}\cos \left( {\frac{{\pi {r^2}}}{{r_1^2}}} \right)$$

The distance between the mask and the sensor is 3 mm. Binary and gray-level images with a size of 200 mm × 200 mm are displayed on a tablet screen (Matepad 11, HUAWEI) placed 300 mm away from the mask. The sensor collects the light rays from the tablet. The imaging model can be considered a case of pinhole imaging: a pinhole camera scales the scene by $M = {z_2}/{z_1} \approx 0.01$, where ${z_2}$ is the distance between the mask and the sensor and ${z_1}$ is the distance between the scene and the mask. In this case, the FZA constant ${r_1}$ at the sensor plane becomes ${r_i} = \frac{{{r_1}}}{{1 + M}}$.
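The binary mask of Eq. (16) and the prototype geometry can be reproduced numerically. In this sketch the grid size is illustrative; the pixel pitch, distances, and $r_1$ follow the values quoted in the text:

```python
import numpy as np

def fza_mask(n, pitch, r1):
    """Binary FZA of Eq. (16): T_F(r) = 1/2 + 1/2*sgn(cos(pi*r^2/r1^2)).
    pitch and r1 share the same length unit (here mm)."""
    x = (np.arange(n) - n / 2) * pitch
    X, Y = np.meshgrid(x, x)
    return 0.5 + 0.5 * np.sign(np.cos(np.pi * (X**2 + Y**2) / r1**2))

# Geometry from the text
z1, z2 = 300.0, 3.0          # mm: scene-to-mask and mask-to-sensor distances
M = z2 / z1                  # pinhole magnification, ~0.01
r1 = 0.32                    # mm, innermost zone radius
r_i = r1 / (1 + M)           # effective FZA constant at the sensor plane

mask = fza_mask(512, 0.00378, r1)   # 3.78 um pixel pitch expressed in mm
```

The scaled constant $r_i$ is what the reconstruction uses in place of $r_1$ when the object sits at a finite distance.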

3.1 Isotropic edge enhancement

The vortex phase filter and the amplitude vortex filters, including the LG, Bessel, Airy, and Sinc filters, are used to reconstruct edge-enhanced images from the recorded images. The recorded images are equivalent to in-line holograms [39], and the edge enhancement reconstruction results are shown in Fig. 3. An LCD monitor and an LED flashlight are used as the incoherent illumination sources. We test simple and complex images and an actual scene of a rubber duck. The images are displayed on the screen, while the rubber duck is illuminated by the flashlight. The monitor and the duck are placed 300 mm from the mask. The captured holograms are cropped to 2400 × 2400 pixels for subsequent processing. Image contrast is used to evaluate the edge enhancement effect; the average contrast over four different points in the reconstructed image is used, defined as:

$$C = \frac{{{{\overline I }_{\max }} - {{\overline I }_{\min }}}}{{{{\overline I }_{\max }} + {{\overline I }_{\min }}}}$$
where ${\overline I _{\max }}$ and ${\overline I _{\min }}$ represent the averages of the maximum and minimum pixel values. The results in Fig. 3 show isotropic edge patterns of the images and of the actual scene, with high contrast. The edge detection results by amp-vortex BP (labeled LG, Bessel, Airy, and Sinc in Fig. 3) have higher contrast values than the results by vortex-BP (labeled Spiral in Fig. 3). The energy of the edges of in-focus objects is larger than that of defocused objects, which is an obvious advantage in application areas such as autonomous driving and artificial intelligence recognition in consumer electronics.
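The metric of Eq. (18) is a Michelson-type contrast. A small sketch, where averaging over a list of manually selected patches is our reading of "four different points":

```python
import numpy as np

def average_contrast(patches):
    """C = (Imax - Imin) / (Imax + Imin), averaged over several image patches."""
    cs = [(p.max() - p.min()) / (p.max() + p.min()) for p in patches]
    return float(np.mean(cs))
```

A patch spanning the full dynamic range (minimum 0) scores 1, while a low-contrast patch scores near 0, so higher values indicate sharper, cleaner edges.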


Fig. 3. The edge enhancement reconstructions from the holograms by using vortex-BP (Spiral) and amp-vortex BP (LG, Bessel, Airy and Sinc). The contrast C and intersecting lines are shown in the results.


To compare our amp-vortex edge-camera with a conventional post-processing method and a lens camera, edge enhancement results of hand gestures are reconstructed in different ways. The hand is placed 500 mm from the mask and illuminated by a white LED lamp (Daheng, GCI-060411). The Canny operator is used as a reference. Because the four kinds of amplitude vortex reconstructions are all comparable in quality, we take the LG-BP results as an illustration; the reconstruction results are shown in Fig. 4.


Fig. 4. The imaging and edge enhancement results of a hand at 500 mm from the mask. (a1) Lens camera image and (a2) Canny operator; (b1) lensless camera image with CS and (b2) LG-BP; (c1) FZA lensless camera image with BP and (c2) Canny operator; (d1) lensless camera image with BP and (d2) amp-vortex edge-camera with LG-BP.


Compared to digital image processing, the main advantage of the vortex filter method is that there is no loss of image intensity. Figure 4(c1) is the BP result of the FZA lensless camera, and Fig. 4(c2) is the corresponding edge enhancement with BP and the Canny operator. CS has been demonstrated to remove the noise in the FZA camera; its result is shown in Fig. 4(b1). Here CS with 100 iterations is used as a reference, with the same LG filter applied for edge enhancement, shown in Fig. 4(b2). Figures 4(d1) and (d2) are the results of BP and LG-BP, respectively. Compared with (c2), the proposed amp-vortex edge-camera with the LG vortex performs much better in edge enhancement, while the imaging reconstructions are the same; this shows that the amplitude vortex filter suppresses noise in lensless reconstruction. Figure 4(d2) shows almost the same reconstruction quality as the amp-vortex result with CS in Fig. 4(b2), but its average computing time is 0.49 s, while the latter takes about 15 minutes.

Figures 4(a1) and (a2) show the picture taken by a lens camera (a cellphone, XiaoMi 9) and the edge extracted by the Canny operator, respectively. There is much defocus noise in Fig. 4(a2), the energy of which is even larger than that of the edge of the hand. In contrast, there is no such defocus noise in Fig. 4(d2) from the proposed amp-vortex edge-camera, because the PSFs of the lensless camera vary with the object distance [36,37,39]. The proposed amp-vortex edge-camera can effectively acquire distinct in-focus edge-enhanced results, making it a compact, efficient, and low-cost edge camera technology.

In Fig. 4(d2), the edge of the arm is classified as out of focus, so the enhancement corresponds to a low local contrast. To some extent, this can be controlled by adjusting the amplitude term parameter. We further perform a refocusing experiment in which two images of a duck are displayed in one scene at 100 mm and 400 mm from the mask, respectively. The results of edge enhancement at different distances, reconstructed with LG-BP, are shown in Fig. 5.


Fig. 5. Refocused edge enhancement experiment results with LG-BP. By changing the FZA constant in reconstruction, the focus plane can be selectively changed. (a) Object distances. (b) Plane at 100 mm. (c) Plane at 400 mm.


Figures 5(b) and (c) show the refocused results with LG-BP at distances of 100 mm and 400 mm, respectively, as illustrated in Fig. 5(a). Figure 5(b) is approximately focused at a distance of 100 mm, while Fig. 5(c) is focused on the duck on the left.

3.2 Anisotropic edge enhancement

A circle pattern is employed as the sample to illustrate the directional enhancement effects. In Fig. 6, we show that the direction of the edge enhancement can be controlled by adjusting the initial phase $\beta $ and the weighting factor c. The effect of the topological factor is discussed in Section 3.3.


Fig. 6. Anisotropic edge enhancement results of a circle with different $\beta $ and c by superimposed vortex-BP. $c = 1$ and values of $\beta $ are (a) 0, (b) $\pi /4$, (c) $\pi /2$, and (d) $3\pi /4$. $\beta = 0$ and values of c are (e) 0, (f) 0.3, (g) 0.7, (h) 1. (i) shows the intensity section distributions of (e)–(h) along the colored lines.


When c is set to 1 and $\beta $ is set to $0$, $\pi /4$, $\pi /2$, and $3\pi /4$, the reconstruction results are shown in Figs. 6(a), (b), (c), and (d), respectively. All results are reconstructed with the LG vortex filter by the proposed superimposed vortex-BP. This demonstrates that, when the initial phase $\beta $ is set to different angles, the edge enhancement can be steered along different directions.

When $\beta $ is set to 0 and the weighting factor c is changed from 0 to 1, the vertical edges are extracted with varying strength. The results are shown in Figs. 6(e), (f), (g), and (h), respectively, and the corresponding intensity section distributions of the same edge along the colored lines are shown in Fig. 6(i).

The results in Fig. 6 verify that the proposed superimposed vortex-BP can detect the edge information along different directions, by simply adjusting the initial phase $\beta $ or the weighting factor c.

3.3 Topological factor

The edge enhancement results for different l values, and the intensity along the colored lines in the results, are shown in Fig. 7. The result with $l = 1$, shown in Fig. 7(d), is taken as the reference. The intensity along the green line in Fig. 7(d) has only one peak, shown as a green line in Figs. 7(e) and (j); the peak indicates the location of the edge. When the value of l differs from 1, the peak splits into two peaks. As l decreases from 1 to 0, the original peak gradually decreases and a second peak appears to its left; both peaks gradually shift to the right until a complete inversion of peak and valley appears at $l = 0$, where the valley indicates the location of the edge, shown as the red line in Fig. 7(e). When l changes from 1 to 2, a similar phenomenon is observed, but the peaks gradually shift to the left, and the inversion of peak and valley appears at $l = 2$. This phenomenon is beneficial in application scenarios where observation of the inside and outside of an edge is crucial, e.g., fingerprint identification and observation of biological tissues.


Fig. 7. Edge enhancement experiment results of a circle with different l. Values of l are (a) 0, (b) 0.4, (c) 0.8, (d) 1, (f) 1, (g) 1.4, (h) 1.8, (i) 2. (e) and (j) are the normalized intensities along the colored lines in (a)-(d) and (f)-(i).


4. Conclusion

We propose a lensless multi-modality imaging system using a computational vortex phase filter, with the FZA mask acting as a Fresnel hologram encoder, named the amp-vortex edge-camera. Conventional and deep-learning reconstruction algorithms, such as CS [39], ADMM (alternating direction method of multipliers) [53], and DNN (deep neural network) [54], can achieve 2D imaging reconstruction. The vortex phase filter in the back-propagation paves the way for edge enhancement. Amplitude vortex filters guarantee noise suppression, which enhances the reconstruction contrast. Moreover, computationally superimposed vortex filters enable direction-selective edge enhancement. With different reconstruction algorithms, the proposed amp-vortex edge-camera can realize 2D imaging, isotropic, and directionally controllable anisotropic edge-enhanced imaging with incoherent illumination, from a single-shot captured hologram.

Because direct BP with different filters reconstructs high-quality edge-enhanced images without iteration, the method enables real-time processing. If the lensless edge enhancement algorithms, including amplitude vortex-BP and superimposed vortex-BP, are ported to a dedicated on-chip system, the proposed amp-vortex edge-camera can achieve faster and lower-power edge image reconstruction than lens imaging. Although computation is needed for reconstruction, noise-free reconstruction can be achieved by direct BP without a de-noising algorithm. The effect of edge detection is the same as that of optical edge detection, i.e., a re-distribution of light energy. Combined with a phase recovery algorithm, this technology can also be applied to edge enhancement of phase objects. It can be used in many scenarios, such as license plate recognition, vehicle counting, and industrial inspection.

Funding

National Key Research and Development Program of China (Grant No. 2021YFB2802004).

Acknowledgments

Thanks to Prof. Shuming Jiao of Pengcheng Laboratory for useful discussions on optical computing.

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. T. Badloe, S. Lee, and J. Rho, “Computation at the speed of light: Metamaterials for all-optical calculations and neural networks,” Adv. Photon. 4(06), 064002 (2022). [CrossRef]  

2. Y. Altmann, S. McLaughlin, M. J. Padgett, V. K. Goyal, A. O. Hero, and D. Faccio, “Quantum-inspired computational imaging,” Science 361(6403), eaat2298 (2018). [CrossRef]  

3. P. Fua and Y. G. Leclerc, “Model driven edge detection,” Machine Vis. Apps. 3(1), 45–56 (1990). [CrossRef]  

4. R.-G. Zhou, H. Yu, Y. Cheng, and F.-X. Li, “Quantum image edge extraction based on improved Prewitt operator,” Quantum Inf. Process. 18(9), 261 (2019). [CrossRef]  

5. W. Cao, Y. Zhou, C. L. P. Chen, and L. Xia, “Medical image encryption using edge maps,” Signal Processing 132, 96–109 (2017). [CrossRef]  

6. F. Orujov, R. Maskeliunas, R. Damasevicius, and W. Wei, “Fuzzy based image edge detection algorithm for blood vessel detection in retinal images,” Appl. Soft Comput. 94, 106452 (2020). [CrossRef]  

7. Z. Cui, C. Li, Z. Du, N. Chen, G. Wei, R. Chen, L. Yang, D. Shen, and W. Wang, “Structure-driven unsupervised domain adaptation for cross-modality cardiac segmentation,” IEEE Trans. Med. Imaging 40(12), 3604–3616 (2021). [CrossRef]  

8. H. Huang, L. Lin, R. Tong, H. Hu, Q. Zhang, Y. Iwamoto, X. Han, Y.-W. Chen, and J. Wu, “WNET: An end-to-end atlas-guided and boundary-enhanced network for medical image segmentation,” in 2020 IEEE 17th International Symposium on Biomedical Imaging (ISBI), (2020), 763–766.

9. M. J. Durbin, R. L. Beaton, J. J. Dalcanton, B. F. Williams, and M. L. Boyer, “MCR-TRGB: A multiwavelength-covariant, robust tip of the red giant branch measurement method*,” Astrophys J. 898(1), 57 (2020). [CrossRef]  

10. Y. Zhang, J. Jiang, G. Zhang, and Y. Lu, “Accurate and robust synchronous extraction algorithm for star centroid and nearby celestial body edge,” IEEE Access 7, 126742–126752 (2019). [CrossRef]  

11. Y. Xu, G. Lu, Y. Lu, and D. Zhang, “High resolution fingerprint recognition using pore and edge descriptors,” Pattern Recognit. Lett. 125, 773–779 (2019). [CrossRef]  

12. Y. Wang, Q. Yang, S. He, R. Wang, and H. Luo, “Computing metasurfaces enabled broad-band vectorial differential interference contrast microscopy,” ACS Photonics (2022).

13. T. Zhu, Y. Lou, Y. Zhou, J. Zhang, J. Huang, Y. Li, H. Luo, S. Wen, S. Zhu, Q. Gong, M. Qiu, and Z. Ruan, “Generalized spatial differentiation from the spin hall effect of light and its application in image processing of edge detection,” Phys. Rev. Applied 11(3), 034043 (2019). [CrossRef]  

14. T. Zhu, C. Guo, J. Huang, H. Wang, M. Orenstein, Z. Ruan, and S. Fan, “Topological optical differentiator,” Nat. Commun. 12(1), 680 (2021). [CrossRef]  

15. N. Yu, P. Genevet, M. A. Kats, F. Aieta, J.-P. Tetienne, F. Capasso, and Z. Gaburro, “Light propagation with phase discontinuities: Generalized laws of reflection and refraction,” Science 334(6054), 333–337 (2011). [CrossRef]  

16. Z. Wang, G. Hu, X. Wang, X. Ding, K. Zhang, H. Li, S. N. Burokur, Q. Wu, J. Liu, J. Tan, and C.-W. Qiu, “Single-layer spatial analog meta-processor for imaging processing,” Nat. Commun. 13(1), 2188 (2022). [CrossRef]  

17. J. A. Davis, D. E. McNamara, D. M. Cottrell, and J. Campos, “Image processing with the radial Hilbert transform: Theory and experiments,” Opt. Lett. 25(2), 99–101 (2000). [CrossRef]  

18. C.-S. Guo, Y.-J. Han, J.-B. Xu, and J. Ding, “Radial Hilbert transform with Laguerre-Gaussian spatial filters,” Opt. Lett. 31(10), 1394–1396 (2006). [CrossRef]  

19. S. B. Wei, S. W. Zhu, and X. C. Yuan, “Image edge enhancement in optical microscopy with a Bessel-like amplitude modulated spiral phase filter,” J. Opt. 13(10), 105704 (2011). [CrossRef]  

20. M. K. Sharma, J. Joseph, and P. Senthilkumaran, “Selective edge enhancement using shifted anisotropic vortex filter,” J. Opt. (India) 42(1), 1–7 (2013). [CrossRef]  

21. Y. Zhou, S. Feng, S. Nie, J. Ma, and C. Yuan, “Image edge enhancement using Airy spiral phase filter,” Opt. Express 24(22), 25258–25268 (2016). [CrossRef]  

22. G. Zhenhong, W. Yongjun, H. Lu, L. Chao, and X. Xiangjun, “Image edge enhancement technique using a novel optical vortex filtering,” in 2021 19th International Conference on Optical Communications and Networks (ICOCN), (2021), 1–3.

23. Y. Zhou, H. Zheng, I. I. Kravchenko, and J. Valentine, “Flat optics for image differentiation,” Nat. Photonics 14(5), 316–323 (2020). [CrossRef]  

24. Y. Kim, G.-Y. Lee, J. Sung, J. Jang, and B. Lee, “Spiral metalens for phase contrast imaging,” Adv. Funct. Mater. 32(5), 2106050 (2022). [CrossRef]  

25. X. Zhang, Y. Zhou, H. Zheng, A. E. Linares, F. C. Ugwu, D. Li, H.-B. Sun, B. Bai, and J. G. Valentine, “Reconfigurable metasurface for image processing,” Nano Lett. 21(20), 8715–8722 (2021). [CrossRef]  

26. P. Huo, C. Zhang, W. Zhu, M. Liu, S. Zhang, S. Zhang, L. Chen, H. J. Lezec, A. Agrawal, Y. Lu, and T. Xu, “Photonic spin-multiplexing metasurface for switchable spiral phase contrast imaging,” Nano Lett. 20(4), 2791–2798 (2020). [CrossRef]  

27. X. Shao, F. Liu, W. Li, L. Yang, S. Yang, and J. Liu, “Latest progress in computational imaging technology and application,” Laser Optoelectron. Prog. 57(2), 020001 (2020). [CrossRef]  

28. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: Sensing, display, and applications invited,” Appl. Opt. 52(4), 546–560 (2013). [CrossRef]  

29. D. J. Brady, M. E. Gehm, R. A. Stack, D. L. Marks, D. S. Kittle, D. R. Golish, E. M. Vera, and S. D. Feller, “Multiscale gigapixel photography,” Nature 486(7403), 386–389 (2012). [CrossRef]  

30. C. Saunders, J. Murray-Bruce, and V. K. Goyal, “Computational periscopy with an ordinary digital camera,” Nature 565(7740), 472–475 (2019). [CrossRef]  

31. Y. Tao, X. Wang, G. Yan, and F. Yang, “Computational ghost imaging method based on Tikhonov regularization,” Laser Optoelectron. Prog. 57(2), 021016 (2020). [CrossRef]  

32. Z. Yu and X. Sun, “Acousto-optic modulation of photonic bound state in the continuum,” Light: Sci. Appl. 9(1), 1 (2020). [CrossRef]  

33. K. Monakhova, K. Yanny, N. Aggarwal, and L. Waller, “Spectral diffusercam: Lensless snapshot hyperspectral imaging with a spectral filter array,” Optica 7(10), 1298–1307 (2020). [CrossRef]  

34. V. Boominathan, J. T. Robinson, L. Waller, and A. Veeraraghavan, “Recent advances in lensless imaging,” Optica 9(1), 1–16 (2022). [CrossRef]  

35. V. Boominathan, J. K. Adams, M. S. Asif, B. W. Avants, J. T. Robinson, R. G. Baraniuk, A. C. Sankaranarayanan, and A. Veeraraghavan, “Lensless imaging a computational renaissance,” IEEE Signal Process. Mag. 33(5), 23–35 (2016). [CrossRef]  

36. V. Boominathan, J. K. Adams, J. T. Robinson, and A. Veeraraghavan, “Phlatcam: Designed phase-mask based thin lensless camera,” IEEE Trans. Pattern Anal. Mach. Intell. 42(7), 1618–1629 (2020). [CrossRef]  

37. M. S. Asif, A. Ayremlou, A. Sankaranarayanan, A. Veeraraghavan, and R. G. Baraniuk, “Flatcam: Thin, lensless cameras using coded aperture and computation,” IEEE Trans. Comput. Imaging 3(3), 384–397 (2017). [CrossRef]  

38. J. K. Adams, V. Boominathan, B. W. Avants, D. G. Vercosa, F. Ye, R. G. Baraniuk, J. T. Robinson, and A. Veeraraghavan, “Single-frame 3d fluorescence microscopy with ultraminiature lensless FlatScope,” Sci. Adv. 3(12), e1701548 (2017). [CrossRef]  

39. J. Wu, H. Zhang, W. Zhang, G. Jin, L. Cao, and G. Barbastathis, “Single-shot lensless imaging with Fresnel zone aperture and incoherent illumination,” Light: Sci. Appl. 9(1), 53 (2020). [CrossRef]  

40. T. Shimano, Y. Nakamura, K. Tajima, M. Sao, and T. Hoshizawa, “Lensless light-field imaging with Fresnel zone aperture: Quasi-coherent coding,” Appl. Opt. 57(11), 2841–2850 (2018). [CrossRef]  

41. J. Wu, L. Cao, and G. Barbastathis, “DNN-FZA camera: A deep learning approach toward broadband FZA lensless imaging,” Opt. Lett. 46(1), 130–133 (2021). [CrossRef]  

42. Y. Ma, J. Wu, S. Chen, and L. Cao, “Explicit-restriction convolutional framework for lensless imaging,” Opt. Express 30(9), 15266–15278 (2022). [CrossRef]  

43. D. G. Stork and P. R. Gill, “Optical, mathematical, and computational foundations of lensless ultra-miniature diffractive imagers and sensors,” Int. J. Adv. Syst. Meas. 7(3), 201–208 (2014).

44. J. K. Adams, D. Yan, J. Wu, V. Boominathan, S. Gao, A. V. Rodriguez, S. Kim, J. Carns, R. Richards-Kortum, C. Kemere, A. Veeraraghavan, and J. T. Robinson, “In vivo lensless microscopy via a phase mask generating diffraction patterns with high-contrast contours,” Nat. Biomed. Eng 6(5), 617–628 (2022). [CrossRef]  

45. N. Antipa, G. Kuo, R. Heckel, B. Mildenhall, E. Bostan, R. Ng, and L. Waller, “Diffusercam: Lensless single-exposure 3d imaging,” Optica 5(1), 1–9 (2018). [CrossRef]  

46. W. Shi, Z. Huang, H. Huang, C. Hu, M. Chen, S. Yang, and H. Chen, “LOEN: Lensless opto-electronic neural network empowered machine vision,” Light: Sci. Appl. 11(1), 121 (2022). [CrossRef]  

47. S. S. Khan, V. Sundar, V. Boominathan, A. Veeraraghavan, and K. Mitra, “Flatnet: Towards photorealistic scene reconstruction from lensless measurements,” IEEE Trans. Pattern Anal. Mach. Intell. 44(4), 1 (2020). [CrossRef]  

48. K. Monakhova, J. Yurtsever, G. Kuo, N. Antipa, K. Yanny, and L. Waller, “Learned reconstructions for practical mask-based lensless imaging,” Opt. Express 27(20), 28075–28090 (2019). [CrossRef]  

49. T. Nakamura, T. Watanabe, S. Igarashi, X. Chen, K. Tajima, K. Yamaguchi, T. Shimano, and M. Yamaguchi, “Superresolved image reconstruction in FZA lensless camera by color-channel synthesis,” Opt. Express 28(26), 39137–39155 (2020). [CrossRef]  

50. X. Chen, X. X. Pan, T. Nakamura, S. Takeyama, T. Shimano, K. Tajima, and M. Yamaguchi, “Wave-optics-based image synthesis for super resolution reconstruction of a FZA lensless camera,” Opt. Express 31(8), 12739–12755 (2023). [CrossRef]  

51. Z. Gu, D. Yin, S. Nie, S. Feng, F. Xing, J. Ma, and C. Yuan, “High-contrast anisotropic edge enhancement free of shadow effect,” Appl. Opt. 58(34), G351–G357 (2019). [CrossRef]  

52. M. K. Sharma, J. Joseph, and P. Senthilkumaran, “Directional edge enhancement using superposed vortex filter,” Opt. Laser Technol. 57, 230–235 (2014). [CrossRef]  

53. S. Boyd, N. Parikh, E. Chu, B. Peleato, and J. Eckstein, “Distributed optimization and statistical learning via the alternating direction method of multipliers,” Found. Trends Mach. Learn. 3(1), 1–122 (2010). [CrossRef]  

54. G. Barbastathis, A. Ozcan, and G. Situ, “On the use of deep learning for computational imaging,” Optica 6(8), 921–943 (2019). [CrossRef]  


Figures (7)

Fig. 1. Flowchart showing the imaging procedure of the amp-vortex edge-camera. a. Schematic diagram of the proposed amp-vortex edge-camera. From a single-shot hologram, it achieves multi-modality imaging: b. conventional imaging by the compressive sensing algorithm, c. edge-enhanced imaging by the vortex-BP algorithm, d. isotropic edge-enhanced imaging by amp-vortex BP, and e. directionally controllable anisotropic edge-enhanced imaging by superimposed vortex-BP.
Fig. 2. PSF of (a) a single vortex phase filter and (b) a superimposed vortex phase filter.
Fig. 3. Edge-enhancement reconstructions from the holograms using vortex-BP (spiral) and amp-vortex BP (LG, Bessel, Airy, and sinc). The contrast C and intersecting lines are shown in the results.
Fig. 4. Imaging and edge-enhancement results of a hand at a distance of 500 mm from the mask. (a1) Lens camera image and (a2) Canny operator; (b1) lensless camera image with CS and (b2) LG BP; (c1) FZA lensless camera image with BP and (c2) Canny operator; (d1) lensless camera image with BP and (d2) amp-vortex edge-camera with LG BP.
Fig. 5. Refocused edge-enhancement experimental results with LG BP. By changing the FZA constant in reconstruction, the focal plane can be selected. (a) Object distances. (b) Plane at 100 mm. (c) Plane at 400 mm.
Fig. 6. Anisotropic edge-enhancement results of a circle with different $\beta$ and c by superimposed vortex-BP. $c = 0$ and values of $\beta$ are (a) 0, (b) $\pi/4$, (c) $\pi/2$, and (d) $3\pi/4$. $\beta = 0$ and values of c are (e) 0, (f) 0.3, (g) 0.7, and (h) 1; (i) shows the intensity section distributions of (a2)-(d2) along the colored lines.
Fig. 7. Edge-enhancement experimental results of a circle with different l. Values of l are (a) 0, (b) 0.4, (c) 0.8, (d) 1, (f) 1, (g) 1.4, (h) 1.8, (i) 2. (e) and (j) are the normalized intensity along the colored lines in (a)-(d) and (f)-(i).

Equations (18)

Equations on this page are rendered with MathJax.

$$S_1(r,\theta) = \exp(il\theta)\,\mathrm{circ}\!\left(\frac{r}{R}\right)$$

$$\mathcal{F}O_{edge} = S_1 \cdot \mathcal{F}O(x,y)$$

$$s_1(\rho,\varphi) = \frac{i\exp(i\varphi)\,\pi R}{2\rho}\left[J_1(x)H_0(x) - J_0(x)H_1(x)\right]$$

$$I_{holo}(x,y) = O(x,y)\otimes T_G(x,y) + e(x,y) = \frac{1}{2}O(x,y) + \frac{1}{4}O(x,y)\otimes h(x,y) + \frac{1}{4}O(x,y)\otimes h^{*}(x,y) + e(x,y) = C + \frac{1}{4}U(x,y) + \frac{1}{4}U^{*}(x,y) + e(x,y)$$

$$I_{holo} = \frac{1}{2}\mathcal{F}^{-1}H\mathcal{F}O + \frac{1}{2}\mathcal{F}^{-1}H^{*}\mathcal{F}O + e$$

$$S_1\mathcal{F}I_{holo} = \frac{1}{4}S_1H\mathcal{F}O + \frac{1}{4}S_1H^{*}\mathcal{F}O + E = \frac{1}{4}H\mathcal{F}O_{edge} + \frac{1}{4}H^{*}\mathcal{F}O_{edge} + E = \frac{1}{2}\,\mathrm{Re}\{H\}\,\mathcal{F}O_{edge} + E$$

$$O_{edge} = 2\mathcal{F}^{-1}S_1\,\mathrm{Re}\!\left\{\frac{\mathcal{F}I_{holo}}{H}\right\} + e$$

$$S_{LG}(r,\theta) = \left(\frac{r}{\omega_1}\right)\exp\!\left[-\left(\frac{r}{\omega_1}\right)^{2}\right]\mathrm{circ}\!\left(\frac{r}{R}\right)\exp(il\theta) = A_{LG}\exp(il\theta)\,\mathrm{circ}\!\left(\frac{r}{R}\right)$$

$$S_2 = \exp(il_1(\theta+\beta)) + c\exp(-il_2(\theta+\beta))$$

$$S_2 = \exp(i(\theta+\beta)) + \exp(-i(\theta+\beta)) = 2\cos(\theta+\beta)$$

$$s_2(\rho,\varphi) = \frac{i\pi R}{\rho}\left[J_1(x)H_0(x) - J_0(x)H_1(x)\right]\sin(\varphi+\beta)$$

$$S_{LG2} = \frac{r}{\omega_1}\exp\!\left[-\left(\frac{r}{\omega_1}\right)^{2}\right]\left[c\exp(-i\theta) + \exp(i(\theta+\beta))\right]$$

$$S_{Bessel2} = \left[\frac{J_2(\alpha r)}{r}\right]\left[c\exp(-i\theta) + \exp(i(\theta+\beta))\right]$$

$$S_{Airy2} = \mathrm{Ai}\!\left(\frac{r_0 - r}{\omega_0}\right)\left[c\exp(-i\theta) + \exp(i(\theta+\beta))\right]$$

$$S_{Sinc2} = \mathrm{sinc}(\alpha r)\left[c\exp(-i\theta) + \exp(i(\theta+\beta))\right], \quad \mathrm{sinc}(\alpha r) = \frac{\sin(\alpha\pi r)}{\alpha\pi r}$$

$$T_F(r) = \frac{1}{2} + \frac{1}{2}\,\mathrm{sgn}\!\left[\cos\!\left(\frac{\pi r^{2}}{r_1^{2}}\right)\right]$$

$$T_G(r) = \frac{1}{2} + \frac{1}{2}\cos\!\left(\frac{\pi r^{2}}{r_1^{2}}\right)$$

$$C = \frac{\bar{I}_{\max} - \bar{I}_{\min}}{\bar{I}_{\max} + \bar{I}_{\min}}$$
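The binary FZA transmittance $T_F(r) = \frac{1}{2} + \frac{1}{2}\,\mathrm{sgn}[\cos(\pi r^2/r_1^2)]$ can be sampled directly on a pixel grid. The grid size and pixel-unit zone constant `r1_px` below are illustrative assumptions, not the fabricated mask's parameters:

```python
import numpy as np

def fza_binary(n, r1_px=20.0):
    """Sample the binary FZA transmittance
    T_F(r) = 1/2 + 1/2*sgn[cos(pi*r^2/r1^2)] on an n x n pixel grid.
    r1_px is an illustrative zone constant in pixels."""
    y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
    r2 = x.astype(float) ** 2 + y.astype(float) ** 2  # squared radius
    return 0.5 + 0.5 * np.sign(np.cos(np.pi * r2 / r1_px**2))

mask = fza_binary(256)  # open center zone, alternating rings outward
```

Replacing `np.sign(...)` by the bare cosine gives the Gaussian-zone variant $T_G(r)$ used in the hologram model.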