
Computational Algorithms

Topic spotlight · World Wide


Discover seminars, jobs, and research tagged with computational algorithms across World Wide.
3 curated items · 3 Seminars
Updated over 3 years ago
Seminar · Neuroscience · Recording

Do you hear what I see: Auditory motion processing in blind individuals

Ione Fine
University of Washington
Oct 6, 2021

Perception of object motion is fundamentally multisensory, yet little is known about similarities and differences in the computations that give rise to our experience across senses. Insight can be provided by examining auditory motion processing in early blind individuals. In those who become blind early in life, the ‘visual’ motion area hMT+ responds to auditory motion. Meanwhile, the planum temporale, associated with auditory motion in sighted individuals, shows reduced selectivity for auditory motion, suggesting competition between cortical areas for functional roles. According to the metamodal hypothesis of cross-modal plasticity developed by Pascual-Leone, the recruitment of hMT+ is driven by its being a metamodal structure containing “operators that execute a given function or computation regardless of sensory input modality”. Thus, the metamodal hypothesis predicts that the computations underlying auditory motion processing in early blind individuals should be analogous to visual motion processing in sighted individuals, relying on non-separable spatiotemporal filters. Inconsistent with the metamodal hypothesis, evidence suggests that the computational algorithms underlying auditory motion processing in early blind individuals fail to undergo a qualitative shift as a result of cross-modal plasticity. Auditory motion filters, in both blind and sighted subjects, are separable in space and time, suggesting that the recruitment of hMT+ to extract motion information from auditory input includes a significant modification of its normal computational operations.
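The separability claim is the crux of the abstract: a separable space-time filter factors into a purely spatial profile multiplied by a purely temporal one, whereas the direction-selective filters classically associated with visual motion do not factor. The sketch below is only an illustration of that distinction, not code or parameters from the talk; the Gabor and temporal-filter settings are made up, and the non-separable filter follows the standard Adelson–Bergen-style motion-energy construction (a sum of two separable terms), which tilts the filter in space-time.

    # Illustrative sketch: separable vs. non-separable space-time filters.
    # All filter parameters are arbitrary; this is not the speaker's model.
    import numpy as np

    x = np.linspace(-2, 2, 65)   # space (arbitrary units)
    t = np.linspace(0, 1, 65)    # time (s)

    # Spatial Gabors in quadrature and two temporal filters with different lags.
    g_even = np.exp(-x**2) * np.cos(2 * np.pi * 1.5 * x)
    g_odd  = np.exp(-x**2) * np.sin(2 * np.pi * 1.5 * x)
    h_fast = (t * 8)**3 * np.exp(-t * 8)
    h_slow = (t * 5)**5 * np.exp(-t * 5)

    # Separable filter: a single product of a spatial and a temporal profile.
    sep = np.outer(g_even, h_fast)

    # Non-separable, direction-selective filter: sum of two separable terms,
    # which introduces a space-time tilt that a single product cannot produce.
    nonsep = np.outer(g_even, h_fast) + np.outer(g_odd, h_slow)

    # Separability check: a separable filter has rank 1 as a space-time matrix.
    print(np.linalg.matrix_rank(sep))     # 1 -> separable
    print(np.linalg.matrix_rank(nonsep))  # 2 -> non-separable

The rank test makes the abstract's contrast concrete: "separable in space and time" means the measured auditory motion filters behave like the rank-1 case, not like the tilted, direction-selective case.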

Seminar · Neuroscience

Algorithms and circuits for olfactory navigation in walking Drosophila

Katherine Nagel
New York University
May 5, 2020

Olfactory navigation provides a tractable model for studying the circuit basis of sensorimotor transformations and goal-directed behaviour. Macroscopic organisms typically navigate in odor plumes that provide a noisy and uncertain signal about the location of an odor source. Work in many species has suggested that animals accomplish this task by combining temporal processing of dynamic odor information with an estimate of wind direction. Our lab has been using adult walking Drosophila to understand both the computational algorithms and the neural circuits that support navigation in a plume of attractive food odor. We developed a high-throughput paradigm to study behavioural responses to temporally controlled odor and wind stimuli. Using this paradigm we found that flies respond to a food odor (apple cider vinegar) with two behaviours: during the odor they run upwind, while after odor loss they perform a local search. A simple computational model based on these two responses is sufficient to replicate many aspects of fly behaviour in a natural turbulent plume. In ongoing work, we are seeking to identify the neural circuits and biophysical mechanisms that perform the computations delineated by our model. Using electrophysiology, we have identified mechanosensory neurons that compute wind direction from movements of the two antennae, and central mechanosensory neurons that encode wind direction and are involved in generating a stable downwind orientation. Using optogenetic activation, we have traced olfactory circuits capable of evoking upwind orientation and offset search from the periphery, through the mushroom body and lateral horn, to the central complex. Finally, we have used optogenetic activation, in combination with molecular manipulation of specific synapses, to localize temporal computations performed on the odor signal to olfactory transduction and transmission at specific synapses. Our work illustrates how the tools available in the fruit fly can be applied to dissect the mechanisms underlying a complex goal-directed behaviour.
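To make the two-response idea concrete, here is a toy agent that runs upwind while odor is present and performs a local search after odor loss. It is a minimal sketch of the kind of model the abstract describes, not the lab's fitted model: the speeds, noise levels, steering gain, search duration, and the single 20 s odor pulse are all made-up assumptions.

    # Toy two-response walker: upwind run during odor, local search after loss.
    # All parameters are hypothetical illustrations, not fitted values.
    import numpy as np

    rng = np.random.default_rng(0)
    dt, steps = 0.1, 600                 # time step (s), number of steps
    upwind = np.pi / 2                   # heading that points into the wind
    x, y, heading = 0.0, 0.0, rng.uniform(0, 2 * np.pi)
    speed_run, speed_search = 10.0, 4.0  # mm/s (assumed)
    search_timer = 0.0                   # seconds of local search remaining

    for i in range(steps):
        odor_on = 100 < i < 300          # a single 20 s odor pulse
        if odor_on:
            # Upwind run: steer toward the upwind direction with small noise.
            err = np.angle(np.exp(1j * (upwind - heading)))  # wrapped error
            heading += 0.5 * err + 0.1 * rng.standard_normal()
            speed = speed_run
            search_timer = 15.0          # arm local search for future odor loss
        elif search_timer > 0:
            # Local search: slow down and turn a lot near the point of odor loss.
            heading += 1.5 * rng.standard_normal()
            speed = speed_search
            search_timer -= dt
        else:
            # Baseline walking: moderate speed, weak turning.
            heading += 0.3 * rng.standard_normal()
            speed = 6.0
        x += speed * np.cos(heading) * dt
        y += speed * np.sin(heading) * dt

    print(f"final position: ({x:.1f}, {y:.1f}) mm")

In a turbulent plume, repeated odor encounters would retrigger the upwind state and each loss would trigger a bounded search, which is the mechanism by which such a simple switch can produce plume-tracking-like trajectories.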