Temporal Order
Rodents to Investigate the Neural Basis of Audiovisual Temporal Processing and Perception
To form a coherent perception of the world around us, we are constantly processing and integrating sensory information from multiple modalities. In fact, when auditory and visual stimuli occur within ~100 ms of each other, individuals tend to perceive the stimuli as a single event, even though they occurred separately. In recent years, our lab and others have developed rat models of audiovisual temporal perception using behavioural tasks such as temporal order judgments (TOJs) and synchrony judgments (SJs). While these rodent models yield metrics consistent with those of humans (e.g., perceived simultaneity, temporal acuity), we have sought to confirm whether rodents demonstrate the hallmarks of audiovisual temporal perception, such as predictable shifts in their perception based on experience and sensitivity to alterations in neurochemistry. Ultimately, our findings indicate that rats serve as an excellent model to study the neural mechanisms underlying audiovisual temporal perception, which to date remain relatively unknown. Using our validated translational audiovisual behavioural tasks, in combination with optogenetics, neuropharmacology and in vivo electrophysiology, we aim to uncover the mechanisms by which inhibitory neurotransmission and top-down circuits finely control one's perception. This research will significantly advance our understanding of the neuronal circuitry underlying audiovisual temporal perception, and will be the first to establish the role of interneurons in regulating the synchronized neural activity that is thought to contribute to the precise binding of audiovisual stimuli.
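For readers unfamiliar with these tasks, the sketch below (with synthetic data; the analysis choices are illustrative assumptions, not the labs' actual pipeline) shows how the two metrics named above, perceived simultaneity and temporal acuity, are commonly estimated by fitting a cumulative Gaussian to temporal order judgments across stimulus onset asynchronies (SOAs).

```python
# Minimal sketch: estimating the point of subjective simultaneity (PSS) and
# temporal acuity (just-noticeable difference, JND) from TOJ data.
# The data below are synthetic; SOA > 0 means the visual stimulus led.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

soa_ms = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300])
# Proportion of trials on which the subject reported "visual first" at each SOA.
p_visual_first = np.array([0.05, 0.10, 0.25, 0.40, 0.55, 0.70, 0.85, 0.92, 0.97])

def cumulative_gaussian(soa, pss, sigma):
    """Psychometric function: P("visual first") as a function of SOA."""
    return norm.cdf(soa, loc=pss, scale=sigma)

(pss, sigma), _ = curve_fit(cumulative_gaussian, soa_ms, p_visual_first,
                            p0=[0.0, 100.0])
jnd = sigma * norm.ppf(0.75)  # SOA change needed to go from 50% to 75% "visual first"
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```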
Associative memory of structured knowledge
A long-standing challenge in biological and artificial intelligence is to understand how new knowledge can be constructed from known building blocks in a way that is amenable to computation by neuronal circuits. Here we focus on the task of storage and recall of structured knowledge in long-term memory. Specifically, we ask how recurrent neuronal networks can store and retrieve multiple knowledge structures. We model each structure as a set of binary relations between events and attributes (attributes may represent, e.g., temporal order, spatial location, or role in a semantic structure), and map each structure to a distributed neuronal activity pattern using a vector symbolic architecture (VSA) scheme. We then use associative memory plasticity rules to store the binarized patterns as fixed points in a recurrent network. By a combination of signal-to-noise analysis and numerical simulations, we demonstrate that our model allows for efficient storage of these knowledge structures, such that the memorized structures as well as their individual building blocks (e.g., events and attributes) can be subsequently retrieved from partial retrieval cues. We show that long-term memory of structured knowledge relies on a new principle of computation beyond the memory basins. Finally, we show that our model can be extended to store sequences of memories as single attractors.
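A minimal sketch of the general recipe described above, assuming a sign-vector VSA with elementwise (Hadamard) binding and a Hopfield-type outer-product storage rule; these specific choices are illustrative and not necessarily the authors' exact scheme.

```python
# Minimal sketch: a "structure" (a set of attribute-event bindings) is encoded
# as a binarized sum of bound sign vectors and stored as a fixed point of a
# recurrent associative memory, then recovered from a partial (corrupted) cue.
import numpy as np

rng = np.random.default_rng(0)
N = 2000  # number of neurons / vector dimension

def rand_sign(n=N):
    return rng.choice([-1, 1], size=n)

# Random codebooks for attributes (roles) and events (fillers).
attributes = {name: rand_sign() for name in ["first", "second", "location"]}
events = {name: rand_sign() for name in ["tone", "flash", "left_port"]}

# One structure = binarized sum of attribute*event bindings.
structure = np.sign(attributes["first"] * events["tone"]
                    + attributes["second"] * events["flash"]
                    + attributes["location"] * events["left_port"])

# Store the pattern with a Hopfield-type outer-product rule.
W = np.outer(structure, structure) / N
np.fill_diagonal(W, 0.0)

# Retrieve from a partial cue: flip 30% of entries, then iterate the dynamics.
cue = structure.copy()
flip = rng.random(N) < 0.30
cue[flip] *= -1
for _ in range(5):
    cue = np.sign(W @ cue + 1e-12)  # small offset avoids sign(0)
print("overlap with stored structure:", float(cue @ structure) / N)

# Decode a building block: unbind with an attribute vector and match to events.
retrieved_filler = cue * attributes["first"]
scores = {name: float(retrieved_filler @ vec) / N for name, vec in events.items()}
print("best match for attribute 'first':", max(scores, key=scores.get))
```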
Sensory and metasensory responses during sequence learning in the mouse somatosensory cortex
Sequential temporal ordering and patterning are key features of natural signals, used by the brain to decode stimuli and perceive them as sensory objects. Touch is one sensory modality in which temporal patterning carries key information, and the rodent whisker system is a prominent model for understanding neuronal coding and plasticity underlying touch sensation. Neurons in this system are precise encoders of fluctuations in whisker dynamics down to a timescale of milliseconds, but it is not clear whether they can refine their encoding abilities as a result of learning patterned stimuli. For example, can they enhance temporal integration to become better at distinguishing sequences? To explore how cortical coding plasticity underpins sequence discrimination, we developed a task in which mice distinguished between tactile ‘word’ sequences constructed from distinct vibrations delivered to the whiskers, assembled in different orders. Animals licked to report the presence of the target sequence. Optogenetic inactivation showed that the somatosensory cortex was necessary for sequence discrimination. Two-photon imaging in layer 2/3 of the primary somatosensory “barrel” cortex (S1bf) revealed that, in well-trained animals, neurons had heterogeneous selectivity to multiple task variables, including not just sensory input but also the animal’s action decision and the trial outcome (presence or absence of the predicted reward). Many neurons were activated preceding goal-directed licking, thus reflecting the animal’s learnt action in response to the target sequence; these neurons were found as soon as mice learned to associate the rewarded sequence with licking. In contrast, learning evoked smaller changes in sensory response tuning: neurons responding to stimulus features were already found in naïve mice, and training did not generate neurons with enhanced temporal integration or categorical responses. Therefore, in S1bf, sequence learning results in neurons whose activity reflects the learnt association between target sequence and licking, rather than a refined representation of sensory features. Taken together with results from other laboratories, our findings suggest that neurons in sensory cortex are involved in task-specific processing and that an animal does not sense the world independently of what it needs to feel in order to guide behaviour.
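As an illustration only (the regression below and its synthetic data are assumptions, not the authors' analysis), selectivity to multiple task variables of the kind described above can be quantified by regressing single-trial responses on stimulus identity, choice, and trial outcome.

```python
# Minimal sketch: trial-wise linear regression of a (synthetic) neuron's
# response on task variables -- stimulus identity (target vs. non-target
# sequence), the animal's choice (lick vs. no lick), and trial outcome
# (rewarded or not). Large weights indicate selectivity for that variable.
import numpy as np

rng = np.random.default_rng(1)
n_trials = 400
stimulus = rng.integers(0, 2, n_trials)                   # 1 = target sequence
choice = (stimulus + (rng.random(n_trials) < 0.2)) % 2    # noisy behaviour
outcome = (stimulus == 1) & (choice == 1)                 # reward on correct licks

# Synthetic response of a "choice"-selective neuron: driven mostly by licking.
response = 1.5 * choice + 0.2 * stimulus + rng.normal(0, 1, n_trials)

X = np.column_stack([np.ones(n_trials), stimulus, choice, outcome.astype(float)])
beta, *_ = np.linalg.lstsq(X, response, rcond=None)
for name, b in zip(["intercept", "stimulus", "choice", "outcome"], beta):
    print(f"{name:>9s} weight: {b:+.2f}")
```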
Time perception: how is our judgment of time influenced by regularity and change in the stimulus distribution?
To organize various experiences into a coherent mental representation, we need to properly estimate the duration and temporal order of different events. Yet our perception of time is noisy and vulnerable to various illusions. Studying these illusions can elucidate the mechanism by which the brain perceives time. In this talk, I will review a few studies on how the brain perceives the duration of events and the temporal order of self-generated motion and its sensory feedback. Combined with computational models at different levels, these experiments illustrate that, when estimating the duration of an individual event, the brain incorporates prior knowledge of the statistical distribution of stimulus durations as well as the decay of memory, and that it adjusts its perception of temporal order to changes in the statistics of the environment.
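A minimal sketch of this kind of computation, under the standard assumption of a Gaussian prior over durations and a noisy internal measurement (the numbers and model below are illustrative, not those of the studies reviewed): the posterior-mean estimate is pulled toward the prior mean, so short durations are overestimated and long ones underestimated.

```python
# Minimal sketch: Bayesian duration estimation with a Gaussian prior over
# stimulus durations and a Gaussian measurement likelihood. The posterior
# mean reproduces the "central tendency" of time judgments.
prior_mean, prior_sd = 800.0, 150.0    # ms; learned distribution of durations
measurement_sd = 100.0                 # noise of the internal measurement

def bayes_estimate(measured_ms):
    """Posterior mean for a Gaussian prior combined with a Gaussian likelihood."""
    w_prior = measurement_sd**2 / (measurement_sd**2 + prior_sd**2)
    return w_prior * prior_mean + (1 - w_prior) * measured_ms

for true_ms in [500, 800, 1100]:
    est = bayes_estimate(true_ms)      # noiseless measurement, for clarity
    print(f"true {true_ms} ms -> estimate {est:.0f} ms")
```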
Phase-dependent maintenance of temporal order in biological and artificial recurrent neural networks
COSYNE 2022