## Abstract

Temporal optics is an emerging field in which optical signals are treated analogously to objects in spatial optics. Indeed, temporal magnification, temporal Fourier transforms, and temporal signal processing have been demonstrated by adapting optical schemes from space to time. However, temporal imaging has so far focused on the equivalent of two-dimensional spatial imaging schemes, while ignoring the depth of the input. Here we adapt the concept of three-dimensional (3D) imaging to the time domain, providing a new dimension in temporal optics. We developed the concept of temporal 3D objects and demonstrated temporal depth imaging. First, we define signals with temporal depth as signals in which each point in time has a different dispersion value. Next, we demonstrate how to measure these signals with a moving time lens. Finally, we present a time lens array and use it to realize temporal depth imaging with a single measurement. Our temporal depth imaging concept will enable measurements of ultrafast nonperiodic phenomena, such as optical rogue waves and the evolution of ultrafast pulses in fiber lasers, with a temporal resolution that was not possible until now.

© 2017 Optical Society of America

## 1. INTRODUCTION

The need to retrieve the depth information of objects has driven the development of three-dimensional (3D) imaging systems and novel imaging devices. Different methods for 3D imaging and recording have been invented, including holography [1], light-field imaging and projection [2], parallax imaging [3], optical coherence tomography [4], and time-of-flight cameras [5]. Today, with the flourishing of 3D displays and virtual reality devices, different methods for 3D recording have been developed and implemented in smartphones and compact cameras [6]. In addition, with the spread of 3D printers, the need for replicating objects is spurring the development of different devices for obtaining depth information with high accuracy [7]. We developed the concept of non-flat temporal signals and adapted the depth imaging approach to the time domain.

The concept of temporal depth will open a new avenue in temporal optics, and temporal depth imaging systems will enable the investigation of the dynamics of ultrafast nonperiodic phenomena, such as optical rogue waves and ultrafast pulses in fiber lasers. Optical rogue waves in optical fibers result from the combined effects of nonlinearity and dispersion [8–10]. Thus, with a temporal depth imaging system, it is possible to investigate the evolution and dynamics of rogue waves with a temporal resolution that was not possible before. In addition, the evolution of ultrafast pulses in fiber lasers strongly depends on the dispersion in the cavity [11]. Our temporal depth imaging will enable the investigation of the dynamics and evolution of pulses in the cavity, which may improve future ultrafast fiber lasers.

There is a mathematical equivalence between light diffraction in space and pulse dispersion in time, which arises from the similarity between the equations describing these two phenomena. The diffraction of the field envelope $A(r,t)$ is described by

$$\frac{\partial A}{\partial z}=\frac{i}{2k}\left(\frac{{\partial}^{2}A}{\partial {x}^{2}}+\frac{{\partial}^{2}A}{\partial {y}^{2}}\right),$$

where $k$ is the propagation parameter. The dispersion of a pulse $A(z,t)$ as it propagates in a material is described by

$$\frac{\partial A}{\partial z}=-\frac{i{\beta}_{2}}{2}\frac{{\partial}^{2}A}{\partial {t}^{2}},$$

where ${\beta}_{2}$ is the group-velocity dispersion parameter of the material. This duality is the basis of temporal imaging with time lenses [12,13].

## 2. DEPTH IMAGING IN SPACE AND TIME
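To make the duality concrete, the following sketch propagates a Gaussian pulse under the dispersion equation and compares the resulting broadening with the analytic Gaussian result, the direct analog of Gaussian-beam diffraction. All parameter values are illustrative assumptions, not the experimental ones.

```python
import numpy as np

# Propagate a Gaussian pulse under dA/dz = -(i*beta2/2) d^2A/dt^2 by
# applying the exact linear dispersion operator in the frequency domain.
# All parameter values are illustrative assumptions.
beta2 = -21.7e-27          # s^2/m, typical anomalous dispersion of SMF
z = 1000.0                 # propagation distance [m]
t0 = 4e-12                 # Gaussian 1/e half-width of the field [s]

t = np.linspace(-200e-12, 200e-12, 4096)
dt = t[1] - t[0]
A0 = np.exp(-t**2 / (2 * t0**2))          # input field envelope

omega = 2 * np.pi * np.fft.fftfreq(t.size, dt)
Az = np.fft.ifft(np.fft.fft(A0) * np.exp(0.5j * beta2 * omega**2 * z))

# The intensity RMS width broadens as sqrt(1 + (beta2*z/t0^2)^2), the
# dispersive counterpart of Gaussian-beam diffraction in space.
rms = np.sqrt(np.sum(t**2 * np.abs(Az)**2) / np.sum(np.abs(Az)**2))
rms_theory = (t0 / np.sqrt(2)) * np.sqrt(1 + (beta2 * z / t0**2)**2)
print(rms, rms_theory)
```

The two printed widths agree, confirming that the single spectral step implements the dispersion equation exactly for this linear problem.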

Any depth imaging system is based on the ability to distinguish between two object points that are separated along the $z$ axis. One method to obtain this depth information is by utilizing a lens array so that each lens images the object from a slightly different angle [20,21]. The schematics of this configuration are presented in Fig. 1, which shows that object points with different depths are imaged to different locations on the image plane as a function of the lens position. We denote the image plane distance from the lens array as $v$ and the object plane distance as $u$. The difference between an object point and the object plane is denoted as $\mathrm{\Delta}u$, which creates an image point at $v+\mathrm{\Delta}v$, as presented in Fig. 1, that obeys the lens equation

$$\frac{1}{u+\mathrm{\Delta}u}+\frac{1}{v+\mathrm{\Delta}v}=\frac{1}{f},$$

where $f$ is the focal distance of the lens. Since we are measuring the output at the plane $v$, the object point is blurred and transversely shifted, as illustrated in Fig. 1. The shift of the image point ${x}_{i}$ as a function of the lens position ${x}_{l}$ is written as

$$\mathrm{\Delta}{x}_{i}=\mathrm{\Delta}{x}_{l}\left(1+{M}_{s}\frac{u}{u+\mathrm{\Delta}u}\right),$$

where ${M}_{s}=v/u$ is the imaging magnification. By measuring $\mathrm{\Delta}{x}_{i}$ from each lens in the lens array together with the position of each lens, we can retrieve the depth of each point and reconstruct the full depth information of the object.

In temporal optics, free-space propagation is equivalent to dispersion. Therefore, the temporal equivalent of an object that is spread along the $z$ axis is an input signal where each signal point in time acquires a different dispersion value. An illustration of an input signal composed of two pulses, where each pulse acquires a different dispersion value, is presented in Fig. 2. We start with a single pulse, presented in Fig. 2(a), which propagates in a dispersive material and is joined by another pulse in Fig. 2(b). The signal continues to propagate in the dispersive material until it reaches the time lens array in Fig. 2(c). Each time lens imposes a quadratic phase shift on the signal and generates an idler wave. The output idler wave is presented in Fig. 2(d). The idler propagates in additional dispersive material until it reaches the image plane in Fig. 2(e), where three images are formed, each with a different separation between the pulses, denoted by ${\tau}_{1}$, ${\tau}_{2}$, and ${\tau}_{3}$. The difference between the pulse separations results from the different timing of each time lens and enables us to deduce the dispersion acquired by each pulse. Thus, we designed a temporal depth imaging scheme.

By measuring the delay between the features in the idler wave as a function of the time lens timing, we retrieve the acquired dispersion of each point in the signal. In direct analogy with the spatial case, the shift of an idler feature timing ${\tau}_{i}$ as a function of the time lens timing ${\tau}_{l}$ is written as

$$\mathrm{\Delta}{\tau}_{i}=\mathrm{\Delta}{\tau}_{l}\left(1+\frac{{D}_{i}}{{D}_{s}+\mathrm{\Delta}D}\right),$$

where ${D}_{s}$ and ${D}_{i}$ are the signal and idler dispersions, and $\mathrm{\Delta}D$ is the extra dispersion acquired by the signal point.
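As a minimal numeric sketch of this retrieval, assume the idler-feature timing shifts linearly with the lens timing, with a slope set by the dispersions, ${\tau}_{i}={\tau}_{l}(1+{D}_{i}/({D}_{s}+\mathrm{\Delta}D))+\mathrm{const}$; the dispersion values below are illustrative assumptions, not the experimental calibration.

```python
import numpy as np

# Recover the extra dispersion dD of a signal feature from the slope of
# the idler timing tau_i versus the time-lens timing tau_l, assuming
#   tau_i = tau_l * (1 + D_i / (D_s + dD)) + const.
# All dispersion values are illustrative, not the experimental calibration.
D_s, D_i = 2.0, 1.0      # signal- and idler-side dispersions [ps/nm]
dD_true = 0.36           # extra dispersion to be recovered [ps/nm]

tau_l = np.linspace(-9.0, 9.0, 7)              # time-lens timings [ps]
tau_i = tau_l * (1 + D_i / (D_s + dD_true))    # simulated feature timings

slope = np.polyfit(tau_l, tau_i, 1)[0]         # fitted slope of the scan
dD_est = D_i / (slope - 1) - D_s               # invert the slope for dD
print(dD_est)
```

Scanning the lens timing and fitting the slope is exactly the procedure used in the measurements below, with noise and calibration handled by the linear fit.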

## 3. MEASURED RESULTS OF TEMPORAL DEPTH IMAGING

To demonstrate this concept, we first measured the change in a temporal image as a function of the timing of the time lens. The experimental setup is presented in Fig. 3. We start with a laser pulse of 70 fs and 85 mW (Toptica FemtoFErb 1560) and split it into a pump wave (1553 nm) and a signal wave (1565 nm). The signal wave is composed of two pulses of 4 ps, where one of the pulses acquires extra dispersion by passing through 80 m of single-mode fiber (SMF), denoted as ${\mathrm{SMF}}_{x}$. The signal is then imaged with a small-aperture time lens that has a temporal magnification of 56 [22]. We measured the separation between the two pulses in the imaged idler wave as a function of the time lens timing using a fast photodetector (Agilent 86116C) connected to a sampling scope (Agilent 86100D). This setup is the temporal equivalent of a spatial lens that shifts transversally for obtaining the depth information of an object.

The measured results of the separation between the two pulses as a function of the time lens timing are presented in Fig. 4. We shift the timing of the time lens from $-9\text{\hspace{0.17em}\hspace{0.17em}}\mathrm{ps}$ to $+9\text{\hspace{0.17em}\hspace{0.17em}}\mathrm{ps}$ relative to the signal wave and extract the separation between the idler wave’s pulses. Two representative results of the idler wave as a function of time are presented in the insets. The upper left inset presents the idler wave when the time lens is shifted by $-9\text{\hspace{0.17em}\hspace{0.17em}}\mathrm{ps}$ relative to the input signal wave, showing a separation of 460 ps between the pulses. The lower right inset presents the measured idler wave when the time lens is shifted by $+9\text{\hspace{0.17em}\hspace{0.17em}}\mathrm{ps}$, showing a separation of 620 ps between the pulses. A linear fitted curve is presented as a red solid line.
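For illustration, the slope implied by the two inset measurements alone can be computed directly; the actual fit in Fig. 4 uses the full timing scan.

```python
import numpy as np

# Slope of pulse separation versus time-lens timing from the two quoted
# inset points; the full dataset in Fig. 4 would refine this estimate.
tau_l = np.array([-9.0, 9.0])      # time-lens timing [ps]
sep = np.array([460.0, 620.0])     # measured pulse separation [ps]
slope = np.polyfit(tau_l, sep, 1)[0]
print(slope)                       # (620 - 460) / 18, about 8.9 ps/ps
```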

We repeated the measurements with different lengths of the extra fiber ${\mathrm{SMF}}_{x}$ to investigate the sensitivity of our method to changes in signal dispersion. We replaced the 80 m SMF denoted as ${\mathrm{SMF}}_{x}$ with SMF lengths ranging from 4 m to 140 m, and measured the separation in the idler wave pulses as a function of the time lens timing. We present these measured results in the inset in Fig. 5 together with fitting curves. Next, we evaluated the derivative of each curve in the inset and obtained the separation derivative as a function of the dispersion difference between the pulses, which is presented in Fig. 5 as blue asterisks. The results reveal a linear increase in the separation derivative, indicating that it is possible to distinguish between signal waves with dispersion differences from 0.2 ps/nm to 2.5 ps/nm. We also calculated the separation derivative without any fitting parameter and presented it as a red curve in Fig. 5, which shows good agreement between the calculated and measured results.
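The dispersion-difference axis of this scan can be related to the fiber lengths with a simple estimate, assuming a textbook SMF dispersion of about 18 ps/(nm km) near 1560 nm; this is an assumed value, not one given in the text, but it is consistent with the 0.36 ps/nm quoted below for 20 m of fiber.

```python
# Estimate the dispersion difference produced by each SMF_x length,
# assuming standard SMF dispersion of ~18 ps/(nm km) near 1560 nm
# (an assumed textbook value, not given in the text).
D_smf = 18e-3                        # ps/nm per meter of SMF
lengths_m = [4, 20, 80, 140]         # SMF_x lengths used in the scan [m]
dD = [D_smf * L for L in lengths_m]  # dispersion differences [ps/nm]
print(dD)                            # spans roughly 0.07-2.5 ps/nm
```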

Next, we designed a time lens array scheme for obtaining temporal depth information with a single measurement. In an array of time lenses, each image must be a compressed copy of the signal wave to prevent an overlap of adjacent images, as presented in Fig. 2. Thus, to read out the time lens array output, we resorted to time-to-frequency mapping. First, we redesigned the dispersion of the pump, signal, and idler waves in the time lens into a $2f$ system, which maps the temporal signal into the frequency domain so that the output can be measured with an optical spectrum analyzer. Specifically, we replaced ${\mathrm{DCF}}_{p}$ with a 160 m dispersion compensating fiber (DCF) and ${\mathrm{DCF}}_{s}$ with an 80 m DCF. This time lens maps two input pulses separated by $\mathrm{\Delta}\tau $ into an idler wave with two spectral peaks separated by $\mathrm{\Delta}\lambda $, according to [23]

$$\mathrm{\Delta}\lambda ={\left(\frac{{\lambda}_{i}}{{\lambda}_{s}}\right)}^{2}\frac{\mathrm{\Delta}\tau }{{D}_{f}},$$

where ${\lambda}_{s}$ and ${\lambda}_{i}$ are the signal and idler wavelengths, and ${D}_{f}$ is the focal dispersion of the time lens.
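The idler wavelength entering the ${({\lambda}_{i}/{\lambda}_{s})}^{2}$ term can be estimated from energy conservation, assuming the time lens is implemented by degenerate-pump four-wave mixing as in the cited time-lens work [23]; this implementation detail is an assumption here.

```python
# Idler wavelength from four-wave-mixing energy conservation with a
# degenerate pump: 2/lambda_p = 1/lambda_s + 1/lambda_i (an assumed
# implementation detail; wavelengths are those quoted in the setup).
lam_p, lam_s = 1553.0, 1565.0              # pump and signal [nm]
lam_i = 1.0 / (2.0 / lam_p - 1.0 / lam_s)  # idler wavelength [nm]
scale = (lam_i / lam_s) ** 2               # wavelength-scaling term
print(lam_i, scale)                        # idler near 1541 nm
```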

We measured the output idler spectrum of two pulses separated by 20 ps, imaged with an array of two time lenses. First, we measured the spectrum for input pulses with the same dispersion and present the results in Fig. 6(a). Each image presents double pulses separated by $\mathrm{\Delta}{\lambda}_{1}=6.1\text{\hspace{0.17em}\hspace{0.17em}}\mathrm{nm}$ and $\mathrm{\Delta}{\lambda}_{2}=5\text{\hspace{0.17em}\hspace{0.17em}}\mathrm{nm}$, which both correspond to a time separation of 20 ps according to Eq. (7) [18,25–27]. The difference between $\mathrm{\Delta}{\lambda}_{1}$ and $\mathrm{\Delta}{\lambda}_{2}$ arises from the difference in the central frequency of the two time lenses, namely, the term ${({\lambda}_{i}/{\lambda}_{s})}^{2}$ in Eq. (7). The second signal is composed of two pulses separated by 20 ps, but one pulse passes through 20 m more SMF than the other, which induces an extra dispersion of 0.36 ps/nm. The separation in the first image is 5.7 nm, corresponding to a temporal separation of 22 ps; the separation in the second image is 3 nm, which corresponds to a temporal separation of 11 ps. The timing separation of the pumps is 5 ps; hence, the separation derivative is 2.2 ps/ps, which corresponds to a dispersion difference of 0.4 ps/nm and agrees with the length of ${\mathrm{SMF}}_{x}$. Thus, we demonstrated that our time lens array can retrieve the different dispersion of each point in the input signal by comparing the temporal images from the array. Another option for reading out the output of the time lens array is to resort to a second time lens stage, which magnifies the output of the time lens array. The benefit of a two-stage time lens array over time-to-frequency mapping is that it can also image single-shot signals.
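The retrieval arithmetic quoted above is simple enough to spell out, with the values taken directly from the measurement:

```python
# Separation derivative from the two image separations and the pump
# timing separation, as quoted in the measurement above.
sep1, sep2 = 22.0, 11.0   # temporal separations in the two images [ps]
pump_dt = 5.0             # timing separation of the two pump pulses [ps]
deriv = (sep1 - sep2) / pump_dt
print(deriv)              # (22 - 11) / 5 = 2.2 ps/ps
```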

We considered two types of resolution in our system, namely, temporal resolution and depth resolution. The temporal resolution denotes the smallest temporal separation that can be resolved with our time lenses. The depth resolution denotes the smallest dispersion difference between two points that can be resolved with our time lens array. These two types of resolution impose conflicting demands on the time lens array. To improve the temporal resolution, we need wider time lenses, since the minimum temporal feature $\mathrm{\Delta}{\tau}_{\mathrm{min}}$ scales inversely with the time lens width ${\tau}_{p}$ [22]. Improving the depth resolution, in contrast, calls for narrower time lenses, so that more lenses, and hence more viewpoints, fit within the measurement window.

## 4. CONCLUSIONS

To conclude, we developed the concept of temporal depth imaging. We defined non-flat signals as signals with different dispersion values as a function of time. We demonstrated how shifting the timing of a time lens makes it possible to retrieve the dispersion value of each point in the signal, which is equivalent to a 3D imaging system. Finally, we demonstrated how a time lens array can retrieve these values with a single measurement by comparing the different images obtained with the time lens array.

The concept of temporal depth in general opens a new avenue in temporal optics. Specifically, temporal depth imaging will allow the investigation of ultrafast nonperiodic phenomena with a temporal resolution that has not been possible so far: for example, rogue waves, which require both nonlinearity and dispersion, and pulse evolution in fiber laser cavities, where dispersion plays an important role.

## REFERENCES

**1. **D. Gabor, “Holography, 1948–1971,” Proc. IEEE **60**, 655–668 (1972). [CrossRef]

**2. **M. Levoy and P. Hanrahan, “Light field rendering,” in *Proceedings of the 23rd Annual Conference on Computer Graphics and Interactive Techniques* (ACM, 1996), pp. 31–42.

**3. **H. Higuchi and J. Hamasaki, “Real-time transmission of 3-D images formed by parallax panoramagrams,” Appl. Opt. **17**, 3895–3902 (1978). [CrossRef]

**4. **D. Huang, E. A. Swanson, C. P. Lin, J. S. Schuman, W. G. Stinson, W. Chang, M. R. Hee, T. Flotte, K. Gregory, C. A. Puliafito, and J. G. Fujimoto, “Optical coherence tomography,” Science **254**, 1178–1181 (1991). [CrossRef]

**5. **T. Oggier, M. Lehmann, R. Kaufmann, M. Schweizer, M. Richter, P. Metzler, G. Lang, F. Lustenberger, and N. Blanc, “An all-solid-state optical range camera for 3D real-time imaging with sub-centimeter depth resolution (SwissRanger),” Proc. SPIE **5249**, 534–545 (2003). [CrossRef]

**6. **G. Sansoni, M. Trebeschi, and F. Docchio, “State-of-the-art and applications of 3D imaging sensors in industry, cultural heritage, medicine, and criminal investigation,” Sensors **9**, 568–601 (2009). [CrossRef]

**7. **B. C. Gross, J. L. Erkal, S. Y. Lockwood, C. Chen, and D. M. Spence, “Evaluation of 3D printing and its potential impact on biotechnology and the chemical sciences,” Anal. Chem. **86**, 3240–3253 (2014). [CrossRef]

**8. **N. Akhmediev, B. Kibler, F. Baronio, M. Belić, W.-P. Zhong, Y. Zhang, W. Chang, J. M. Soto-Crespo, P. Vouzas, P. Grelu, C. Lecaplain, K. Hammani, S. Rica, A. Picozzi, M. Tlidi, K. Panajotov, A. Mussot, A. Bendahmane, P. Szriftgiser, G. Genty, J. Dudley, A. Kudlinski, A. Demircan, U. Morgner, S. Amiraranashvili, C. Bree, G. Steinmeyer, C. Masoller, N. G. R. Broderick, A. F. J. Runge, M. Erkintalo, S. Residori, U. Bortolozzo, F. T. Arecchi, S. Wabnitz, C. G. Tiofack, S. Coulibaly, and M. Taki, “Roadmap on optical rogue waves and extreme events,” J. Opt. **18**, 063001 (2016). [CrossRef]

**9. **M. Närhi, B. Wetzel, C. Billet, S. Toenger, T. Sylvestre, J.-M. Merolla, R. Morandotti, F. Dias, G. Genty, and J. M. Dudley, “Real-time measurements of spontaneous breathers and rogue wave events in optical fibre modulation instability,” Nat. Commun. **7**, 13675 (2016). [CrossRef]

**10. **D. Solli, C. Ropers, P. Koonath, and B. Jalali, “Optical rogue waves,” Nature **450**, 1054–1057 (2007). [CrossRef]

**11. **Y. Du and X. Shu, “Pulse dynamics in all-normal dispersion ultrafast fiber lasers,” J. Opt. Soc. Am. B **34**, 553–558 (2017). [CrossRef]

**12. **B. H. Kolner and M. Nazarathy, “Temporal imaging with a time lens,” Opt. Lett. **14**, 630–632 (1989). [CrossRef]

**13. **B. H. Kolner, “Space-time duality and the theory of temporal imaging,” IEEE J. Quantum Electron. **30**, 1951–1963 (1994). [CrossRef]

**14. **C. Zhang, P. Chui, and K. K. Wong, “Comparison of state-of-art phase modulators and parametric mixers in time-lens applications under different repetition rates,” Appl. Opt. **52**, 8817–8826 (2013). [CrossRef]

**15. **Y. Okawachi, R. Salem, A. R. Johnson, K. Saha, J. S. Levy, M. Lipson, and A. L. Gaeta, “Asynchronous single-shot characterization of high-repetition-rate ultrafast waveforms using a time-lens-based temporal magnifier,” Opt. Lett. **37**, 4892–4894 (2012). [CrossRef]

**16. **C. Zhang, B. Li, and K. K.-Y. Wong, “Ultrafast spectroscopy based on temporal focusing and its applications,” IEEE J. Sel. Top. Quantum Electron. **22**, 295–306 (2016). [CrossRef]

**17. **R. Salem, M. A. Foster, A. C. Turner-Foster, D. F. Geraghty, M. Lipson, and A. L. Gaeta, “High-speed optical sampling using a silicon-chip temporal magnifier,” Opt. Express **17**, 4324–4329 (2009). [CrossRef]

**18. **J. Azana, N. K. Berger, B. Levit, and B. Fischer, “Spectral Fraunhofer regime: time-to-frequency conversion by the action of a single time lens on an optical pulse,” Appl. Opt. **43**, 483–490 (2004). [CrossRef]

**19. **M. Fridman, A. Farsi, Y. Okawachi, and A. L. Gaeta, “Demonstration of temporal cloaking,” Nature **481**, 62–65 (2012). [CrossRef]

**20. **X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications [invited],” Appl. Opt. **52**, 546–560 (2013). [CrossRef]

**21. **J.-H. Park, K. Hong, and B. Lee, “Recent progress in three-dimensional information processing based on integral imaging,” Appl. Opt. **48**, H77–H94 (2009). [CrossRef]

**22. **T. Yaron, A. Klein, H. Duadi, and M. Fridman, “Temporal superresolution based on a localization microscopy algorithm,” Appl. Opt. **56**, D24–D28 (2017). [CrossRef]

**23. **J. Schröder, F. Wang, A. Clarke, E. Ryckeboer, M. Pelusi, M. A. Roelens, and B. J. Eggleton, “Aberration-free ultra-fast optical oscilloscope using a four-wave mixing based time-lens,” Opt. Commun. **283**, 2611–2614 (2010). [CrossRef]

**24. **M. Fridman, Y. Okawachi, S. Clemmen, M. Ménard, M. Lipson, and A. L. Gaeta, “Waveguide-based single-shot temporal cross-correlator,” J. Opt. **17**, 035501 (2015). [CrossRef]

**25. **T. T. Ng, F. Parmigiani, M. Ibsen, Z. Zhang, P. Petropoulos, and D. J. Richardson, “Compensation of linear distortions by using XPM with parabolic pulses as a time lens,” IEEE Photon. Technol. Lett. **20**, 1097–1099 (2008). [CrossRef]

**26. **K. G. Petrillo and M. A. Foster, “Scalable ultrahigh-speed optical transmultiplexer using a time lens,” Opt. Express **19**, 14051–14059 (2011). [CrossRef]

**27. **Z. Wu, L. Lei, J. Dong, J. Hou, and X. Zhang, “Reconfigurable temporal Fourier transformation and temporal imaging,” J. Lightwave Technol. **32**, 3963–3968 (2014).