Gregory Knoll: Encoding and Information Transmission in Synaptically Coupled Neuronal Populations

HU Berlin / BCCN Berlin

Abstract

In this thesis I aim to better understand the neural code, that is, the way in which the nerve cells of the brain transmit and process information through their activity, by investigating stimulus encoding in neural systems. To this end, I analyze how the dynamics of standard neuronal models, developed in the framework of statistical physics, change with variations in parameters and connectivity in the presence versus the absence of a stimulus. In conjunction, information-theoretic measures are used to quantify the ability of neuronal populations to transmit received information through their output. The presented results build upon a multitude of previous studies of both unconnected and recurrent neural populations. Some of these studies highlight two candidate neural codes with distinct information-filtering profiles: an integration code that acts as a low-pass information filter and a synchrony code that acts as a bandpass filter. In the following, synaptic connectivity is added in diverse ways in order to extend the results of these studies to networks with a higher level of connectivity, as observed in the cortex.
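To make the two filter types concrete, here is a minimal numerical sketch (illustrative, not taken from the thesis): an unconnected population of leaky integrate-and-fire neurons receives a common broadband Gaussian stimulus and is read out in two ways, once by summing the population activity (integration code) and once by thresholding the fraction of cells active within a short coincidence window (synchrony code). All parameter values, including the window length and the synchrony threshold gamma, are assumptions chosen for the demonstration; time is measured in units of the membrane time constant.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
N = 100                      # population size (assumption)
dt, T = 0.01, 2000.0         # step and duration in units of the membrane time constant
steps = int(T / dt)
mu, D, c = 1.1, 0.01, 0.2    # base current, intrinsic noise, common stimulus intensity
w = 10                       # coincidence window: Delta = w*dt = 0.1
gamma = 0.3                  # fraction of the population defining a synchronous event

s = rng.standard_normal(steps) / np.sqrt(dt)   # common broadband Gaussian stimulus

v = rng.uniform(0.0, 1.0, N)                   # LIF voltages; threshold 1, reset 0
spk = np.zeros((N, steps), dtype=bool)
for t in range(steps):
    v += dt * (mu - v + np.sqrt(2 * c) * s[t]) \
         + np.sqrt(2 * D * dt) * rng.standard_normal(N)
    spk[:, t] = v >= 1.0
    v[spk[:, t]] = 0.0

# coarse-grain stimulus and readouts to the coincidence-window resolution
nwin = steps // w
frac = spk.reshape(N, nwin, w).any(axis=2).mean(axis=0)        # active fraction per window
pop = spk.reshape(N, nwin, w).sum(axis=(0, 2)) / (N * w * dt)  # integration readout
sync = (frac >= gamma).astype(float)                           # synchrony readout
s_win = s.reshape(nwin, w).mean(axis=1)

# with suitable parameters the integration readout is low-pass
# and the synchrony readout bandpass in its coherence with the stimulus
for name, x in (("integration", pop), ("synchrony  ", sync)):
    f, C = coherence(s_win, x, fs=1 / (w * dt), nperseg=1024)
    print(name, ": peak coherence %.2f at f = %.2f" % (C[1:].max(), f[1 + C[1:].argmax()]))
```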

In the first part, I compare a synchrony code with coincidence detection by a postsynaptic cell and show that both are parameterized in a similar way. A theory is developed for the spectral measures and information encoding of an entire two-stage system consisting of an unconnected encoding population and a postsynaptic cell that receives the population's output. The theory reproduces the bandpass information filtering observed in neurons that are sensitive to coincident spiking events. Information transmission at low frequencies, on the other hand, is shown to depend strongly on the uniformity of the synaptic weights between the two stages. If all weights are equal, the high fidelity in low-frequency information transmission expected of an integrator cell can be reproduced. In contrast, randomness and heterogeneity in the synapses result in an effective white noise that causes a loss of information, particularly at low frequencies. In the second and third parts, I investigate information transmission in a locally recurrent network with sparse and random connectivity.
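The effect of weight heterogeneity can be illustrated with a simple surrogate, assuming stimulus-modulated Poisson encoders in place of the integrate-and-fire population treated in the thesis; the rates and the exponential weight distribution are hypothetical choices. Heterogeneous weights raise the shot-noise floor of the summed input relative to the common signal, which appears as a loss of stimulus-output coherence at the low frequencies where the signal power is concentrated.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(1)
N, dt, steps = 100, 1e-3, 2 ** 16
r0, eps, tau = 20.0, 10.0, 0.05     # base rate and modulation (Hz), stimulus time constant (s)

# slow Gaussian stimulus (Ornstein-Uhlenbeck process with unit variance)
s = np.zeros(steps)
for t in range(1, steps):
    s[t] = s[t - 1] * (1 - dt / tau) + np.sqrt(2 * dt / tau) * rng.standard_normal()

rate = np.clip(r0 + eps * s, 0.0, None)                  # common rate modulation
spikes = (rng.random((N, steps)) < rate * dt).astype(float)

for label, w in (("equal ", np.ones(N)),
                 ("random", rng.exponential(1.0, N))):
    y = (w @ spikes) / dt           # weighted spike sum seen by the postsynaptic cell
    f, C = coherence(s, y, fs=1 / dt, nperseg=2 ** 12)
    print(label, "weights: mean coherence below 5 Hz = %.3f"
          % C[(f > 0) & (f < 5)].mean())
```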

In the second part, the effects of the recurrent input on the network's stationary dynamics are examined. I test the validity of approximating the stationary components of the recurrent input in the presence of a common stimulus by Gaussian white noise, an approximation that holds if the stimulus modulates the current rather than the noise intensity. Pursuing the conception of the recurrence as a white-noise source further, I demonstrate a type of suprathreshold stochastic resonance in the ensemble encoding as the synaptic connectivity is increased. Another benefit of synaptic noise is demonstrated in bistable neural models: it destabilizes the two states and increases the transition rates between them. An increase in the transition rates where the two states are equiprobable reduces the excessively large variance in the spike count and improves stimulus encoding by making the network more responsive to external input.
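A minimal sketch of the stationary white-noise (diffusion) approximation tested in this part, under assumed parameters and with purely inhibitory coupling chosen for stability: if each cell receives K synapses of weight J and the network fires at a stationary rate r, the recurrence contributes a mean input K*J*r and an additional white-noise intensity K*J^2*r/2, so the full network's rate should be matched by an uncoupled surrogate population with these effective parameters.

```python
import numpy as np

rng = np.random.default_rng(2)
N, K, J = 400, 40, -0.1                 # sparse random inhibitory connectivity (assumption)
mu, D, dt, T = 1.2, 0.05, 1e-3, 50.0    # time in units of the membrane time constant
steps = int(T / dt)

A = np.zeros((N, N))                    # each row: K presynaptic partners of weight J
for i in range(N):
    A[i, rng.choice(N, K, replace=False)] = J

def run(drive_fn, Dtot):
    """Simulate N LIF neurons and return the population-averaged firing rate."""
    v = rng.uniform(0.0, 1.0, N)
    fired = np.zeros(N)
    nsp = 0.0
    for _ in range(steps):
        v += dt * (drive_fn(fired) - v) + np.sqrt(2 * Dtot * dt) * rng.standard_normal(N)
        fired = (v >= 1.0).astype(float)
        v[v >= 1.0] = 0.0
        nsp += fired.sum()
    return nsp / (N * T)

r_net = run(lambda fired: mu + A @ fired / dt, D)   # full recurrent network
mu_eff = mu + K * J * r_net                         # stationary mean of the recurrence
D_eff = D + 0.5 * K * J ** 2 * r_net                # its white-noise intensity
r_da = run(lambda fired: mu_eff, D_eff)             # uncoupled surrogate population
print("network rate %.3f  vs  diffusion approximation %.3f" % (r_net, r_da))
```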

In the third and final part, I address the temporal characteristics of the network input. The network input is shown to modulate both the current and the noise intensity, and a new approximation for the rate response that includes both modulations is derived. I adapt a linear-response approach to capture the cross-correlations among the neurons in the network more accurately. This correlation approximation, together with the updated rate response, yields a good estimate of the spectral densities: the cross-spectrum between output and stimulus, the power spectrum of the network output, and the coherence function. Using the newly approximated coherence function, I show that stronger connectivity in a network with a moderate amount of intrinsic noise is mostly detrimental to information transmission but may help to sharpen synchrony encoding. I also show that the balance and stochasticity of the synaptic weights influence the network activity and its relation to the synchrony threshold, and thus the peak amplitude of the coherence function; in particular, random weights can increase the peak's magnitude compared with fixed weights. In a similar manner, the relation between the activity and the synchrony threshold can be tuned on the postsynaptic side, which selects the type of information filter and how effective it will be.
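The three spectral measures named above can at least be estimated numerically from simulations; the sketch below does so for a sparse recurrent LIF network under a common Gaussian stimulus, using Welch-type estimates in place of the analytical linear-response expressions derived in the thesis. Network size, connectivity, coupling, and noise values are illustrative assumptions, and time is again in units of the membrane time constant.

```python
import numpy as np
from scipy.signal import csd, welch

rng = np.random.default_rng(3)
N, K, J = 200, 20, -0.04            # sparse random inhibitory connectivity (assumption)
mu, D, c = 1.2, 0.05, 0.2           # base current, intrinsic noise, stimulus intensity
dt, steps = 1e-3, 2 ** 16

A = np.zeros((N, N))
for i in range(N):
    A[i, rng.choice(N, K, replace=False)] = J

s = rng.standard_normal(steps) / np.sqrt(dt)   # common Gaussian white-noise stimulus
v = rng.uniform(0.0, 1.0, N)
fired = np.zeros(N)
x = np.zeros(steps)                            # network output: population activity
for t in range(steps):
    drive = mu + A @ fired / dt + np.sqrt(2 * c) * s[t]
    v += dt * (drive - v) + np.sqrt(2 * D * dt) * rng.standard_normal(N)
    fired = (v >= 1.0).astype(float)
    v[v >= 1.0] = 0.0
    x[t] = fired.sum() / (N * dt)

f, Sxs = csd(x, s, fs=1 / dt, nperseg=2 ** 12)   # cross-spectrum output-stimulus
_, Sxx = welch(x, fs=1 / dt, nperseg=2 ** 12)    # output power spectrum
_, Sss = welch(s, fs=1 / dt, nperseg=2 ** 12)    # stimulus power spectrum
C = np.abs(Sxs) ** 2 / (Sxx * Sss)               # coherence function
print("coherence at f ~ 1: %.3f, at f ~ 50: %.3f"
      % (C[np.argmin(np.abs(f - 1))], C[np.argmin(np.abs(f - 50))]))
```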


Additional Information

PhD defense

Organized by

Prof. Dr. Benjamin Lindner


Location: HU Berlin, Newtonstraße 15, Room 2'101 [Adlershof]
