Lukas Braun: Learning Neural Computations in Spiking Neural Networks Using Gradient-Based Optimisation

BCCN Berlin / Technische Universität Berlin


Abstract

The computation performed by a neuron depends on its synaptic connectivity and its intrinsic electrophysiological properties. Synaptic connectivity determines how information from presynaptic neurons is integrated spatially, while cellular properties determine how information is integrated over time. In contrast to biological neurons, however, most computational approaches to learning in spiking neural networks are limited to the effects of changes in synaptic connectivity. Here, we use sparse feedback signals that indicate target spike times, together with gradient-based parameter updates, to show that the parameters that determine the temporal receptive field of leaky integrate-and-fire and leaky resonate-and-fire neurons can be learned along with their synaptic connectivity. We adopt the teacher-student paradigm to show that the fixed parameter setting of a teacher neuron can be recovered by a randomly initialised student neuron trained with temporal stochastic online gradient descent, without backpropagation through time. We further investigate the effects of temporal noise in the feedback signal and the capability of our approach to fit temporally evolving target computations. Our results are a step towards the online learning of complex temporal neural computations from ungraded and unsigned sparse feedback signals that indicate target spike times, using a biologically inspired learning mechanism.
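To make the teacher-student setup concrete, below is a minimal sketch, not the thesis code: all names, parameter values, and the loss are invented for illustration. It shows a student leaky integrator recovering the fixed synaptic weights and leak factor of a teacher by online, forward-in-time gradient descent with eligibility traces, i.e. without backpropagation through time. For simplicity it uses a dense error on the subthreshold membrane potential rather than the sparse spike-time feedback described in the abstract, and it omits the spiking threshold and the resonate-and-fire variant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes and values, chosen only for illustration.
n_in, T, dt = 5, 20_000, 1e-3

# Teacher: fixed leak factor and synaptic weights.
alpha_teacher = np.exp(-dt / 20e-3)          # leak factor for tau = 20 ms
w_teacher = rng.normal(0.0, 1.0, n_in)

# Student: randomly initialised, to be fitted online.
alpha_student = np.exp(-dt / 5e-3)           # deliberately wrong time constant
w_student = rng.normal(0.0, 1.0, n_in)

lr = 0.01
x = (rng.random((T, n_in)) < 0.05).astype(float)   # Poisson-like input spikes

v_teacher = v_student = 0.0
e_w = np.zeros(n_in)   # eligibility trace: dv_student/dw, carried forward in time
e_a = 0.0              # eligibility trace: dv_student/dalpha

for t in range(T):
    # Update the eligibility traces *before* the student state, since they
    # depend on the previous membrane potential (an RTRL-style recursion that
    # treats alpha as constant within a step, a standard approximation).
    e_w = alpha_student * e_w + x[t]
    e_a = alpha_student * e_a + v_student

    # Subthreshold leaky-integrator dynamics for teacher and student.
    v_teacher = alpha_teacher * v_teacher + w_teacher @ x[t]
    v_student = alpha_student * v_student + w_student @ x[t]

    # Online gradient step on the instantaneous squared error.
    err = v_student - v_teacher
    w_student -= lr * err * e_w
    alpha_student -= lr * err * e_a
    alpha_student = min(max(alpha_student, 0.0), 0.999)   # keep dynamics stable

print("max weight error:", np.abs(w_student - w_teacher).max())
print("leak factor error:", abs(alpha_student - alpha_teacher))
```

Because the traces are propagated forward alongside the neuron's state, each parameter update uses only quantities available at the current time step, which is what makes the rule online rather than requiring backpropagation through time.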


Additional Information

Master's Thesis Defense


Organized by

Prof. Dr. Tim Vogels & Prof. Dr. Henning Sprekeler / Lisa Velenosi

Location: The talk will take place digitally via Zoom; please send an email to graduateprograms@bccn-berlin.de for access.
