John Rinzel: Neuronal models for rhythmic timing

New York University

When listening to music, we typically lock onto and move to a beat (1-6 Hz). Behavioral studies of such synchronization abound (Repp 2005), yet the neural mechanisms remain poorly understood. In contrast to entrainment-based models, our formulation focuses on rhythmic event-time estimation and plasticity: a neuronal beat generator that adapts its controllable frequency and phase to match an external rhythm. When the stimulus is removed, the beat generator continues to produce the learned rhythm, as in a synchronization-continuation task (SCT). Further, we trained a Recurrent Neural Network (RNN) of 500 E-I units to perform the SCT. Analysis of the RNN reveals three "neuronal" subgroups: a tightly synchronized E-I pair of subgroups that fires around the beat times, and another I-subgroup with interbeat activity. We have developed a reduced three-variable model that recapitulates the RNN's behavior.
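The adaptation idea described above (adjusting both frequency and phase toward stimulus events, then free-running when the stimulus stops) can be sketched with a simple event-based error-correction model. This is an illustrative assumption, not the speaker's model: the function name `run_sct` and the gain parameters `alpha` (phase correction) and `beta` (period correction) are hypothetical.

```python
# Minimal sketch (an assumption, not the authors' model): an event-based
# beat generator that adapts its period and phase toward a periodic
# stimulus, then continues unaided once the stimulus is removed, as in a
# synchronization-continuation task (SCT).

def run_sct(stimulus_onsets, n_continuation=8, alpha=0.5, beta=0.3,
            initial_period=0.8):
    """Return (beat_times, learned_period) for a list of stimulus onsets."""
    period = initial_period
    t = stimulus_onsets[0]           # align first beat to first event
    beats = [t]
    # Synchronization phase: correct phase and period from each asynchrony.
    for onset in stimulus_onsets[1:]:
        t += period                  # predicted next beat time
        error = onset - t            # asynchrony (negative = beat was early)
        period += beta * error       # frequency (period) correction
        t += alpha * error           # phase correction
        beats.append(t)
    # Continuation phase: stimulus removed, free-run at the learned period.
    for _ in range(n_continuation):
        t += period
        beats.append(t)
    return beats, period

# Example: a 2 Hz stimulus (0.5 s inter-onset interval); the generator
# starts at 0.8 s and converges toward the 0.5 s period.
beats, learned = run_sct([i * 0.5 for i in range(12)])
```

With these (assumed) gains the learned period settles near the stimulus period within about ten events, and the continuation beats repeat that learned interval.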

 

Guests are welcome! If you want to join online, please contact Stefano (stefano.masserini (at) hu-berlin.de).

 

Organized by

Stefano Masserini / Margret Franke / Darko Komnenic



Location: Hybrid event: BCCN Berlin Lecture Hall, Philippstr. 13, Haus 6, 10115 Berlin and online
