Schedule: Wednesday, April 21, 2021, 1:00 AM America/New_York
Recording provided by the organiser.
Host: van Vreeswijk TNS
Duration: 70 minutes
When listening to music, we typically lock onto and move with a beat (1-6 Hz). Behavioral studies of such synchronization abound (Repp 2005), yet the neural mechanisms remain poorly understood. Some models hypothesize an array of self-sustaining, entrainable neural oscillators that resonate when forced with rhythmic stimuli (Large et al. 2010). In contrast, our formulation focuses on event time estimation and plasticity: a neuronal beat generator that adapts its intrinsic frequency and phase to match the external rhythm. The model learns new rhythms quickly, within a few cycles, as found in human behavior. When the stimulus is removed, the beat generator continues to produce the learned rhythm, as in a synchronization-continuation task.
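To make the adaptation idea concrete, below is a minimal, hypothetical Python sketch of a generic error-correction beat generator: on each stimulus onset it nudges both its period (frequency) and its next beat time (phase) toward the stimulus, and once the stimulus stops it free-runs at the learned period, as in a synchronization-continuation task. The update rules and gain parameters (alpha, beta) are illustrative assumptions, not the specific model presented in this talk.

# Illustrative sketch only: a generic error-correction beat generator.
# alpha, beta, and the update rules are assumptions for illustration,
# not the neuronal model described in the talk.
def beat_generator(stimulus_onsets, n_continuation=8, alpha=0.3, beta=0.3,
                   initial_period=0.6):
    period = initial_period          # intrinsic inter-beat interval (s)
    next_beat = stimulus_onsets[0]   # start aligned to the first onset
    beats = []

    # Synchronization phase: correct period and phase from each onset.
    for onset in stimulus_onsets:
        beats.append(next_beat)
        error = onset - next_beat                       # timing error of this beat
        period += alpha * error                         # period (frequency) adaptation
        next_beat = next_beat + period + beta * error   # phase correction

    # Continuation phase: stimulus removed, keep producing the learned rhythm.
    for _ in range(n_continuation):
        beats.append(next_beat)
        next_beat += period
    return beats, period

# Example: a 2 Hz stimulus (onsets every 0.5 s); the generator adapts within a few cycles.
onsets = [0.5 * k for k in range(8)]
beats, learned_period = beat_generator(onsets)
print(learned_period)   # has adapted toward the 0.5 s stimulus interval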
John Rinzel, Prof., New York University