January 2019
Spotlight Summary by Daniel Brunner
All-optical nonlinear activation function for photonic neural networks [Invited]
Neural networks, which leverage the computational power of numerous connected nonlinear elements, are poised to initiate a disruptive evolution in computing. Unfortunately, emulations of such networks on current processors are strongly limited in speed and efficiency. Photonics, with its parallelism, superior signal-transmission properties, and high bandwidth, has long been discussed as an enabling technology for overcoming this bottleneck; energy-efficiency and bandwidth improvements of several orders of magnitude could be within reach. Photonic implementations of network connections and learning schemes, such as in our own recent publication in Optica, and activation functions for photonic neurons dominate current efforts towards hardware implementations. In Optical Materials Express, Miscuglio et al. report novel concepts that extend the all-optical activation-function toolbox. In their work, they leverage plasmonic effects in coupled nanostructures as well as cross-saturation in multi-level systems. These potentially energy-efficient and fast optical nonlinearities are compatible with current integrated photonic technology. Based on a full-scale network simulation, the authors show that such a system can approach state-of-the-art object-recognition error rates. The reported results will stimulate novel approaches to optical activation functions, and an experimental implementation would open new avenues towards next-generation photonic neurons.
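To illustrate the idea of an all-optical activation function, the sketch below models a generic saturable nonlinearity of the kind such devices provide: transmission that rises with optical intensity and saturates at high power. This is a minimal illustration, not the authors' specific plasmonic or multi-level response; the function name and the parameters `i_sat`, `t_lin`, and `t_max` are assumptions chosen for the example.

```python
import numpy as np

def saturable_activation(x, i_sat=1.0, t_lin=0.1, t_max=0.9):
    """Generic saturable-absorber-style transfer function (illustrative only).

    x      : input field amplitude(s)
    i_sat  : saturation intensity (assumed parameter)
    t_lin  : low-power transmission
    t_max  : fully saturated transmission
    """
    intensity = np.abs(x) ** 2
    # Transmission increases with intensity and saturates toward t_max.
    transmission = t_lin + (t_max - t_lin) * intensity / (intensity + i_sat)
    return transmission * x

# Weak inputs are attenuated; strong inputs pass nearly unchanged,
# giving the thresholding behavior a neural activation needs.
y = saturable_activation(np.array([0.1, 1.0, 10.0]))
```

Used element-wise after each weighted optical interference layer, such a transfer function plays the same role as ReLU or sigmoid in an electronic network.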
Article Information
All-optical nonlinear activation function for photonic neural networks [Invited]
Mario Miscuglio, Armin Mehrabian, Zibo Hu, Shaimaa I. Azzam, Jonathan George, Alexander V. Kildishev, Matthew Pelton, and Volker J. Sorger
Opt. Mater. Express 8(12) 3851-3863 (2018)