Abstract
Synaptic plasticity, i.e. the ability of synaptic connections to strengthen or weaken depending on their input, is a fundamental component of learning and memory in biological neural networks [1]. This property allows the network parameters to adapt directly to the input signal, rather than being tuned externally by a training algorithm. In contrast with this paradigm, the most popular and successful artificial neural network (ANN) models today are based on backpropagation, which usually requires full observability of the network states and precise parameter tuning. In practice, these requirements strongly limit the scalability of neuromorphic hardware, and backpropagation is not considered biologically plausible [2].
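The contrast drawn above can be made concrete with a minimal sketch of a local, Hebbian-style plasticity update, in which each weight changes using only the activity of the two neurons it connects, with no backpropagated error signal. The function name, learning rate, and activation choice below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def hebbian_step(w, x, eta=0.01):
    """One local plasticity update (illustrative sketch).

    w : (n_out, n_in) weight matrix
    x : (n_in,) pre-synaptic activity
    The update eta * outer(post, pre) uses only locally
    available signals -- no global error is propagated back.
    """
    y = np.tanh(w @ x)            # post-synaptic activity
    w = w + eta * np.outer(y, x)  # Hebbian: co-active pairs strengthen
    return w, y

rng = np.random.default_rng(0)
w = rng.normal(scale=0.1, size=(4, 8))
x = rng.normal(size=8)
w, y = hebbian_step(w, x)
```

Because the update depends only on pre- and post-synaptic activity, it can in principle run on hardware where the internal states are not fully observable, which is precisely the regime where backpropagation becomes impractical.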
© 2023 IEEE