Topic spotlight · World Wide

behavioural neuroscience

Discover seminars, jobs, and research tagged with behavioural neuroscience across World Wide.

5 curated items · 5 Seminars · Updated 3 days ago

5 results
Seminar · Neuroscience

High Stakes in the Adolescent Brain: Glia Ignite Under THC’s Influence

Yalin Sun
University of Toronto
Dec 3, 2025
Seminar · Neuroscience

An executive control approach to language production

Etienne Koechlin
École Normale Supérieure and INSERM, Paris, France
Apr 4, 2022

Language production is a form of behavior and as such involves executive control and prefrontal function. The cognitive architecture of prefrontal executive function therefore plays an important role in shaping language production. In this talk, I will review the main features of prefrontal executive function that we have uncovered over the last two decades and discuss how these features may help us understand language production.

Seminar · Open Source · Recording

Autopilot v0.4.0 - Distributing development of a distributed experimental framework

Jonny Saunders
University of Oregon
Sep 28, 2021

Autopilot is a Python framework for performing complex behavioral neuroscience experiments by coordinating a swarm of Raspberry Pis. It was designed not only to give researchers a tool for performing the hardware-intensive experiments necessary for the next generation of naturalistic neuroscientific observation, but also to make it easier for scientists to be good stewards of the human knowledge project. Specifically, we designed Autopilot as a framework that lets its users contribute their technical expertise to a cumulative library of hardware interfaces and experimental designs, and produce data that is clean at the time of acquisition, lowering barriers to open scientific practices. As Autopilot matures, we have been progressively making these aspirations a reality. We are currently preparing the release of Autopilot v0.4.0, which will include a new plugin system and a wiki that uses semantic web technology to build a technical and contextual knowledge repository. By combining human-readable text and semantic annotations in a wiki that makes contribution as easy as possible, we intend to build a communal knowledge system that provides a mechanism for sharing the contextual technical knowledge that is always excluded from methods sections, but is nonetheless necessary to perform cutting-edge experiments. By integrating it with Autopilot, we hope to create a first-of-its-kind system that allows researchers to fluidly blend technical knowledge and open-source hardware designs with the software necessary to use them. Reciprocally, we also hope that this system will support a kind of deep provenance that makes abstract "custom apparatus" statements in methods sections obsolete, allowing the scientific community to losslessly and effortlessly trace a dataset back to the code and hardware designs needed to replicate it. I will describe the basic architecture of Autopilot, recent work on its community contribution ecosystem, and the vision for the future of its development.
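
As a rough illustration of the plugin-and-registry idea described above, and not Autopilot's actual API (which is documented with the project itself), a contributed hardware interface might be shared through a registry along these lines; all names here are hypothetical:

    # Illustrative sketch only: a generic plugin-registry pattern for sharing
    # hardware interfaces between labs, NOT Autopilot's real classes or modules.
    from typing import Callable, Dict, Type

    HARDWARE_REGISTRY: Dict[str, Type] = {}

    def register_hardware(name: str) -> Callable[[Type], Type]:
        """Decorator that adds a contributed hardware interface to a shared registry."""
        def decorator(cls: Type) -> Type:
            HARDWARE_REGISTRY[name] = cls
            return cls
        return decorator

    @register_hardware("lever")
    class Lever:
        """Hypothetical lever interface contributed by one lab and reused by others."""
        def __init__(self, pin: int):
            self.pin = pin

        def read(self) -> bool:
            # A real interface would poll a Raspberry Pi GPIO pin here.
            return False

    if __name__ == "__main__":
        # Another lab's experiment script can look the interface up by name.
        lever = HARDWARE_REGISTRY["lever"](pin=17)
        print(type(lever).__name__, lever.read())

The point of such a registry is that experiment code refers to hardware by name, so swapping in another lab's contributed interface requires no changes to the task logic.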

Seminar · Open Source · Recording

SimBA for Behavioral Neuroscientists

Sam A. Golden
University of Washington, Department of Biological Structure
Jul 15, 2021

Several excellent computational frameworks exist that enable high-throughput and consistent tracking of freely moving unmarked animals. SimBA introduces and distributes a plug-and-play pipeline that enables users to combine these pose-estimation approaches with behavioral annotation to generate supervised machine-learning behavioral predictive classifiers. SimBA was developed for the analysis of complex social behaviors, but it gives users the flexibility to generate predictive classifiers for other behavioral modalities with minimal effort and no specialized computational background. SimBA also has a variety of extended functions for large-scale batch video pre-processing and for generating descriptive statistics from movement features, as well as interactive modules for user-defined regions of interest and for visualizing classification probabilities and movement patterns.
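
As a hedged sketch of the general pipeline SimBA automates (pose-derived movement features plus human frame annotations feeding a supervised classifier), and not SimBA's own code or interface, the core classification step could look like the following scikit-learn example; the feature matrix and labels are synthetic placeholders:

    # Generic sketch of the supervised-classification step that SimBA wraps in
    # its interface; the features and annotations below are made up for illustration.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Pretend per-frame movement features derived from pose estimation
    # (e.g. distances, speeds, and angles between body parts).
    n_frames = 2000
    X = rng.normal(size=(n_frames, 12))
    # Human annotations: behavior present (1) or absent (0) in each frame.
    y = rng.integers(0, 2, size=n_frames)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0
    )

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)

    # Frame-by-frame classification probabilities are the kind of output that
    # SimBA-style tools visualize alongside the video.
    probabilities = clf.predict_proba(X_test)[:, 1]
    print(classification_report(y_test, clf.predict(X_test)))

With real data, the features would come from a pose-estimation export and the labels from manual annotation of a subset of frames, with the trained classifier then applied to unannotated videos.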

Seminar · Neuroscience

Neural systems for vocal perception

Catherine Perrodin
Institute of Behavioural Neuroscience, University College London
Jan 11, 2021

For social animals, successfully communicating with others is essential for interactions and survival. My research aims to answer a central question about the neuronal basis of this ability, from the perspective of the listener: how do our brains enable us to communicate with each other? My work develops nonhuman animal models to study the behavioural and neuronal mechanisms underlying the perception of vocal patterns. I will start by providing an overview of my past research characterizing the neuronal-level substrates of voice processing along the primate temporal lobe. I will then focus on my current work on vocal perception in mice, in which I use natural male-female courtship behaviour to evaluate the acoustic dimensions listeners extract from ultrasonic sequences. Finally, I will discuss ongoing work investigating the neuronal substrates supporting the perception of behaviourally relevant acoustic cues in mouse vocal sequences.