
Behaviour


Discover seminars, jobs, and research tagged with behaviour across Open Source.
13 curated items · 13 seminars

Latest

13 results
Seminar · Open Source · Recording

Trackoscope: A low-cost, open, autonomous tracking microscope for long-term observations of microscale organisms

Priya Soneji
Georgia Institute of Technology
Oct 8, 2024

Cells and microorganisms are motile, yet the stationary nature of conventional microscopes impedes comprehensive, long-term behavioral and biomechanical analysis. The limitations are twofold: a narrow focus permits high-resolution imaging but sacrifices the broader context of organism behavior, while a wider focus compromises microscopic detail. This trade-off is especially problematic when investigating rapidly motile ciliates, which often have to be confined to small volumes between coverslips, affecting their natural behavior. To address this challenge, we introduce Trackoscope, a 2-axis autonomous tracking microscope designed to follow swimming organisms ranging from 10 μm to 2 mm across a 325-square-centimeter area for extended durations, from hours to days, at high resolution. Using Trackoscope, we captured a diverse array of behaviors, from the air-water swimming locomotion of Amoeba to bacterial hunting dynamics in Actinosphaerium, walking gait in Tardigrada, and binary fission in motile Blepharisma. Trackoscope is a cost-effective solution well suited for diverse settings, from high school labs to resource-constrained research environments. Its capability to capture diverse behaviors in larger, more realistic ecosystems extends our understanding of the physics of living systems. The low-cost, open architecture democratizes scientific discovery, offering a dynamic window into the lives of previously inaccessible small aquatic organisms.
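
The core idea lends itself to a short sketch. Below is an illustrative Python loop, assuming OpenCV for imaging and a hypothetical `stage.move_relative` motor API; it is not Trackoscope's actual code, just the keep-the-organism-centered principle.

```python
import cv2  # OpenCV: frame grabbing and simple blob detection

def find_centroid(frame):
    """Return the (x, y) centroid of the largest dark blob, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return m["m10"] / m["m00"], m["m01"] / m["m00"]

def track(camera, stage, gain=0.5):
    """Nudge the XY stage each frame to keep the organism centered."""
    while True:
        ok, frame = camera.read()
        if not ok:
            break
        c = find_centroid(frame)
        if c is None:
            continue  # organism lost this frame; try the next one
        h, w = frame.shape[:2]
        dx, dy = c[0] - w / 2, c[1] - h / 2  # pixel error from frame center
        stage.move_relative(gain * dx, gain * dy)  # hypothetical motor API
```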

Seminar · Open Source · Recording

An open-source miniature two-photon microscope for large-scale calcium imaging in freely moving mice

Weijian Zong
Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology
Sep 12, 2022

Because benchtop imaging is unsuitable for tasks that require unrestrained movement, investigators have tried, for almost two decades, to develop miniature 2P microscopes (2P miniscopes) that can be carried on the head of freely moving animals. In this talk, I will first briefly review the development history of this technique and then report our latest progress on the new generation of 2P miniscopes, MINI2P, which overcomes the limits of previous versions: it both meets the requirements for fatigue-free exploratory behavior during extended recording periods and increases the cell yield by an order of magnitude, to thousands of neurons. The performance and reliability of MINI2P are validated by recordings of spatially tuned neurons in three brain regions and in three behavioral assays. All information about MINI2P is open access, with instruction videos, code, and manuals in public repositories, and workshops will be organized to help new users get started. MINI2P permits large-scale, high-resolution calcium imaging in freely moving mice and opens the door to investigating brain function during unconstrained natural behaviors.

Seminar · Open Source · Recording

A Flexible Platform for Monitoring Cerebellum-Dependent Sensory Associative Learning

Gerard Joey Broussard
Princeton Neuroscience Institute
Jun 1, 2022

Climbing fiber inputs to Purkinje cells provide instructive signals critical for cerebellum-dependent associative learning. Studying these signals in head-fixed mice facilitates the use of imaging, electrophysiological, and optogenetic methods. Here, a low-cost behavioral platform (~$1000) was developed that allows tracking of associative learning in head-fixed mice that locomote freely on a running wheel. The platform incorporates two common associative learning paradigms: eyeblink conditioning and delayed tactile startle conditioning. Behavior is tracked using a camera, and wheel movement using a detector. We describe the components and setup and provide a detailed protocol for training and data analysis. The platform allows the incorporation of optogenetic stimulation and fluorescence imaging, and its design allows a single host computer to control multiple platforms for training multiple animals simultaneously.
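
For readers unfamiliar with the paradigm, here is a minimal Python sketch of a delay eyeblink-conditioning trial loop. The timings are illustrative, and the `led_on`/`deliver_airpuff` helpers are hypothetical stand-ins for the platform's GPIO control, not its published code.

```python
import random
import time

# Hypothetical hardware hooks; the real platform drives these over GPIO.
def led_on(): pass            # conditioned stimulus (CS): LED
def led_off(): pass
def deliver_airpuff(): pass   # unconditioned stimulus (US): brief air puff

CS_DURATION = 0.25  # seconds; illustrative timing, not the published values
ISI = 0.20          # CS onset to US onset; US co-terminates with the CS

def run_trial():
    """One delay eyeblink-conditioning trial: CS leads, US joins at the end."""
    led_on()
    time.sleep(ISI)
    deliver_airpuff()
    time.sleep(CS_DURATION - ISI)
    led_off()

for _ in range(10):                    # a short training block
    run_trial()
    time.sleep(random.uniform(8, 12))  # jittered inter-trial interval
```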

Seminar · Open Source

Measuring the Motions of Mice: Open source tracking with the KineMouse Wheel

Jimmy Tabet
Department of Biomedical Engineering, UNC/NCSU
May 18, 2022

Who says you can't reinvent the wheel?! This running wheel for head-fixed mice allows 3D reconstruction of body kinematics using a single camera and DeepLabCut (or similar) software. A lightweight, transparent polycarbonate floor and a mirror mounted on the inside allow two views to be captured simultaneously. All parts are commercially available or laser cut.
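
On the software side, the tracking step reduces to a standard DeepLabCut analysis call. A minimal sketch, with placeholder paths (the config comes from a user-trained DLC project):

```python
import deeplabcut

# Placeholder paths; the config belongs to your own trained DLC project.
config = "/data/kinemouse_project/config.yaml"
videos = ["/data/sessions/mouse01_wheel.mp4"]

# Estimate 2D keypoints. Because the mirror places the side and bottom views
# in the same frame, one network tracks both views at once; the two sets of
# 2D points can then be combined downstream into a 3D reconstruction.
deeplabcut.analyze_videos(config, videos, videotype=".mp4")
deeplabcut.create_labeled_video(config, videos)
```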

Seminar · Open Source · Recording

PiSpy: An Affordable, Accessible, and Flexible Imaging Platform for the Automated Observation of Organismal Biology and Behavior

Gregory Pask and Benjamin Morris
Middlebury College
Apr 20, 2022

A great deal of understanding can be gleaned from direct observation of organismal growth, development, and behavior. However, direct observation can be time consuming and influence the organism through unintentional stimuli. Additionally, video capture equipment can often be prohibitively expensive, difficult to modify to one's specific needs, and may come with unnecessary features. Here, we describe the PiSpy, a low-cost, automated video acquisition platform that uses a Raspberry Pi computer and camera to record video or images at specified time intervals or when externally triggered. All settings and controls, such as programmable light cycling, are accessible to users with no programming experience through an easy-to-use graphical user interface. Importantly, the entire PiSpy system can be assembled for less than $100 using laser-cut and 3D-printed components. We demonstrate the broad applications and flexibility of the PiSpy across a range of model and non-model organisms. Designs, instructions, and code can be accessed through an online repository, where a global community of PiSpy users can also contribute their own unique customizations and help grow the community of open-source research solutions.
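
As a rough illustration of the capture logic such a Raspberry Pi recorder performs, here is a sketch using the standard picamera and RPi.GPIO libraries; the pin number, interval, and output path are placeholders, not PiSpy's actual code.

```python
import time
from picamera import PiCamera
import RPi.GPIO as GPIO

TRIGGER_PIN = 17    # placeholder GPIO pin wired to an external trigger
INTERVAL_S = 600    # also capture on a timer: one image every 10 minutes

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIGGER_PIN, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)

camera = PiCamera(resolution=(1920, 1080))
last_capture = 0.0

while True:
    now = time.time()
    # Capture when the external line goes high, or when the interval lapses.
    if GPIO.input(TRIGGER_PIN) or now - last_capture >= INTERVAL_S:
        camera.capture(f"/home/pi/captures/{int(now)}.jpg")
        last_capture = now
    time.sleep(0.05)
```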

Seminar · Open Source · Recording

GuPPy, a Python toolbox for the analysis of fiber photometry data

Talia Lerner
Northwestern University
Nov 24, 2021

Fiber photometry (FP) is an adaptable method for recording in vivo neural activity in freely behaving animals. It has become a popular tool in neuroscience due to its ease of use, low cost, and ability to combine FP with freely moving behavior, among other advantages. However, analysis of FP data can be a challenge for new users, especially those with a limited programming background. Here, we present Guided Photometry Analysis in Python (GuPPy), a free and open-source FP analysis tool. GuPPy is provided as a Jupyter notebook, a well-commented interactive development environment (IDE) designed to operate across platforms. GuPPy presents the user with a set of graphical user interfaces (GUIs) to load data and provide input parameters. Graphs produced by GuPPy can be exported into various image formats for integration into scientific figures. As an open-source tool, GuPPy can be modified by users with knowledge of Python to fit their specific needs.
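
GuPPy itself is driven through its GUIs, so rather than guess at its internals, the sketch below shows the generic computation at the heart of most FP pipelines: fitting the isosbestic control channel to the signal channel and computing ΔF/F. This illustrates the standard approach, not GuPPy's code.

```python
import numpy as np

def dff(signal, control):
    """ΔF/F from a calcium-dependent signal (e.g. 470 nm) and an isosbestic
    control (e.g. 405 nm), given as equal-length 1-D arrays."""
    # A least-squares fit of the control onto the signal models motion and
    # photobleaching artifacts shared by both channels.
    slope, intercept = np.polyfit(control, signal, 1)
    fitted = slope * control + intercept
    return (signal - fitted) / fitted

# Demo on synthetic data:
t = np.linspace(0, 60, 6000)
control = 1.0 + 0.1 * np.exp(-t / 30)              # slow bleaching
signal = control + 0.05 * (np.sin(10 * t) > 0.99)  # sparse transients
print(dff(signal, control).max())
```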

Seminar · Open Source · Recording

Autopilot v0.4.0 - Distributing development of a distributed experimental framework

Jonny Saunders
University of Oregon
Sep 29, 2021

Autopilot is a Python framework for performing complex behavioral neuroscience experiments by coordinating a swarm of Raspberry Pis. It was designed not only to give researchers a tool for the hardware-intensive experiments necessary for the next generation of naturalistic neuroscientific observation, but also to make it easier for scientists to be good stewards of the human knowledge project. Specifically, we designed Autopilot as a framework that lets its users contribute their technical expertise to a cumulative library of hardware interfaces and experimental designs, and produce data that is clean at the time of acquisition, lowering barriers to open scientific practices. As Autopilot matures, we have been progressively making these aspirations a reality. We are currently preparing the release of Autopilot v0.4.0, which will include a new plugin system and a wiki that uses semantic-web technology to build a technical and contextual knowledge repository. By combining human-readable text and semantic annotations in a wiki that makes contribution as easy as possible, we intend to build a communal knowledge system that provides a mechanism for sharing the contextual technical knowledge that is always excluded from methods sections but is nonetheless necessary to perform cutting-edge experiments. By integrating it with Autopilot, we hope to make a first-of-its-kind system that allows researchers to fluidly blend technical knowledge and open-source hardware designs with the software necessary to use them. Reciprocally, we also hope that this system will support a kind of deep provenance that makes abstract "custom apparatus" statements in methods sections obsolete, allowing the scientific community to losslessly and effortlessly trace a dataset back to the code and hardware designs needed to replicate it. I will describe the basic architecture of Autopilot, recent work on its community contribution ecosystem, and the vision for the future of its development.

Seminar · Open Source · Recording

Creating and controlling visual environments using BonVision

Aman Saleem
University College London
Sep 15, 2021

Real-time rendering of closed-loop visual environments is important for next-generation understanding of brain function and behaviour, but is often prohibitively difficult for non-experts to implement and is limited to a few laboratories worldwide. We developed BonVision as easy-to-use open-source software for the display of virtual or augmented reality, as well as standard visual stimuli. BonVision has been tested on humans and mice, and is capable of supporting new experimental designs in other animal models of vision. Because the architecture is based on the open-source Bonsai graphical programming language, BonVision benefits from native integration with experimental hardware. BonVision therefore enables easy implementation of closed-loop experiments, including real-time interaction with deep neural networks, and communication with behavioural and physiological measurement and manipulation devices.

Seminar · Open Source · Recording

PiVR: An affordable and versatile closed-loop platform to study unrestrained sensorimotor behavior

David Tadres and Matthieu Louis
University of California, Santa Barbara
Sep 3, 2021

PiVR is a system that allows experimenters to immerse small animals in virtual realities. The system tracks the position of the animal and presents light stimulation according to predefined rules, thus creating a virtual landscape in which the animal can behave. Using optogenetics, we have used PiVR to present fruit fly larvae with virtual olfactory realities, adult fruit flies with a virtual gustatory reality, and zebrafish larvae with a virtual light gradient. PiVR operates at high temporal resolution (70 Hz) with low latencies (<30 milliseconds) while being affordable (<US$500) and easy to build (<6 hours). Through extensive documentation (www.PiVR.org), this tool was designed to be accessible to a broad audience, from high school students to professional researchers studying systems neuroscience in academia.
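
The closed-loop principle is compact enough to sketch. Below, a hypothetical tracker and LED driver stand in for PiVR's hardware layer; the virtual landscape is a 2-D Gaussian, and on each frame the animal's position is mapped through it to set the stimulation intensity. This is an illustration of the idea, not PiVR's own code.

```python
import numpy as np

# Hypothetical hardware hooks; PiVR's real implementation differs in detail.
def get_animal_position():
    """Would return the tracked (x, y); stubbed at the arena center here."""
    return (320, 240)

def set_led_intensity(level):
    """Would set the stimulation LED's PWM duty cycle (0.0-1.0)."""
    pass

# Predefined virtual landscape: a 2-D Gaussian "odor" peak mid-arena.
W, H = 640, 480
yy, xx = np.mgrid[0:H, 0:W]
landscape = np.exp(-((xx - W / 2) ** 2 + (yy - H / 2) ** 2) / (2 * 100.0 ** 2))

for _ in range(70 * 60):  # one minute of frames at ~70 Hz
    x, y = get_animal_position()
    # Map the animal's location onto the virtual gradient and stimulate.
    set_led_intensity(float(landscape[int(y), int(x)]))
```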

Seminar · Open Source · Recording

SimBA for Behavioral Neuroscientists

Sam A. Golden
University of Washington, Department of Biological Structure
Jul 16, 2021

Several excellent computational frameworks exist that enable high-throughput and consistent tracking of freely moving, unmarked animals. SimBA introduces and distributes a plug-and-play pipeline that enables users to combine these pose-estimation approaches with behavioral annotation to generate supervised machine-learning behavioral classifiers. SimBA was developed for the analysis of complex social behaviors but includes the flexibility for users to generate predictive classifiers across other behavioral modalities with minimal effort and no specialized computational background. SimBA has a variety of extended functions for large-scale batch video pre-processing, generating descriptive statistics from movement features, and interactive modules for user-defined regions of interest and visualizing classification probabilities and movement patterns.
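
SimBA's classifiers are random forests under the hood; the schematic below shows that supervised step using scikit-learn directly, with synthetic arrays standing in for SimBA's pose-derived movement features and frame-by-frame human annotations. It illustrates the approach, not SimBA's own pipeline code.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Stand-ins for SimBA's inputs: per-frame movement features derived from
# pose estimation, and per-frame human annotations of a behavior (0/1).
rng = np.random.default_rng(0)
features = rng.normal(size=(5000, 20))  # 5000 frames x 20 features
labels = (features[:, 0] + features[:, 1] > 1).astype(int)  # toy "behavior"

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0
)

clf = RandomForestClassifier(n_estimators=200)
clf.fit(X_train, y_train)

# Per-frame behavior probabilities, like SimBA's classification probabilities.
proba = clf.predict_proba(X_test)[:, 1]
print("held-out accuracy:", clf.score(X_test, y_test))
```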

Seminar · Open Source · Recording

Feeding Experimentation Device version 3 (FED3)

Lex Kravitz
Washington University
Jun 4, 2021

FED3 is a device for behavioral training of mice in vivarium home cages. Mice interact with FED3 through two nose-pokes, and FED3 responds with visual stimuli, auditory stimuli, and by dispensing pellets. Because it is used in the home cage, FED3 can train mice around the clock over several weeks. FED3 is open source and can be built by users for ~10-20x less than commercial solutions for training mice. The control code is also open source and was designed to be easily modified by users.
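
The real device runs open-source Arduino code, but the training logic is simple enough to sketch in Python with hypothetical helper functions: on a fixed-ratio-1 schedule, each poke on the active side triggers cues and a pellet. This is an illustration of the schedule logic, not the FED3 firmware.

```python
import time

# Hypothetical helpers standing in for the device's Arduino firmware.
def poke_detected(side):      # is the IR beam broken in the given poke?
    return False              # stubbed out in this sketch

def play_tone(): pass         # auditory stimulus
def flash_leds(): pass        # visual stimulus
def dispense_pellet(): pass   # rotate the pellet disk once

def fixed_ratio_1(active="left", max_pellets=20):
    """FR1 schedule: every poke on the active side earns cues and a pellet."""
    pellets = 0
    while pellets < max_pellets:
        if poke_detected(active):
            play_tone()
            flash_leds()
            dispense_pellet()
            pellets += 1
        time.sleep(0.01)  # poll the beams at ~100 Hz
```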

Seminar · Open Source · Recording

BrainGlobe: a Python ecosystem for computational (neuro)anatomy

Adam Tyson
Sainsbury Wellcome Centre, University College London
May 14, 2021

Neuroscientists routinely perform experiments aimed at recording or manipulating neural activity, uncovering physiological processes underlying brain function or elucidating aspects of brain anatomy. Understanding how the brain generates behaviour ultimately depends on merging the results of these experiments into a unified picture of brain anatomy and function. We present BrainGlobe, a new initiative aimed at developing common Python tools for computational neuroanatomy. These include cellfinder for fast, accurate cell detection in whole-brain microscopy images, brainreg for aligning images to a reference atlas, and brainrender for visualisation of anatomically registered data. These software packages are developed around the BrainGlobe Atlas API. This API provides a common Python interface to download and interact with reference brain atlases from multiple species (including human, mouse and larval zebrafish). This allows software to be developed agnostic to the atlas and species, increasing adoption and interoperability of software tools in neuroscience.
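
A short sketch of the Atlas API usage pattern described above, as exposed by the bg-atlasapi package around the time of this talk (the project has since been renamed brainglobe-atlasapi); exact names may have drifted, so treat this as illustrative.

```python
from bg_atlasapi import BrainGlobeAtlas

# Downloads the atlas on first use, then caches it locally.
atlas = BrainGlobeAtlas("allen_mouse_25um")

print(atlas.reference.shape)   # reference image stack (numpy array)
print(atlas.annotation.shape)  # voxel-wise structure annotation volume

# Structure metadata is keyed by acronym.
print(atlas.structures["CTX"]["name"])

# Which structure does a given voxel coordinate fall in?
print(atlas.structure_from_coords((200, 150, 200), as_acronym=True))
```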

Seminar · Open Source · Recording

DeepLabStream

Jens Schweihoff
Institute of Experimental Epileptology and Cognition Research, University of Bonn
May 7, 2021

DeepLabStream is a Python-based, multi-purpose tool that enables real-time tracking and manipulation of animals during ongoing experiments. Our toolbox was originally adapted from the previously published DeepLabCut (Mathis et al., 2018) and expanded on its core capabilities; it can now utilize a variety of network architectures for online pose estimation (SLEAP, DLC-Live, DeepPoseKit's StackedDenseNet and StackedHourglass, and LEAP). Our aim is to provide an open-source tool that allows researchers to design custom experiments based on real-time, behavior-dependent feedback. My personal ideal would be a Swiss-army-knife-like solution integrating the many brilliant Python interfaces. We are constantly upgrading DLStream with new features and integrating other open-source solutions.
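
As an illustration of the real-time loop, here is a minimal sketch using DLC-Live, one of the supported backends listed above; the model path and the region-of-interest trigger rule are placeholders, not DeepLabStream's own code.

```python
import cv2
from dlclive import DLCLive

dlc = DLCLive("/models/exported_dlc_model")  # placeholder exported model
cap = cv2.VideoCapture(0)

ok, frame = cap.read()
dlc.init_inference(frame)  # warm up the network on the first frame

while ok:
    pose = dlc.get_pose(frame)  # one (x, y, likelihood) row per body part
    nose_x = pose[0, 0]
    # Behavior-dependent feedback: fire when the tracked point enters a
    # region of interest (the rule and threshold here are illustrative).
    if nose_x > 300:
        print("ROI entered -> trigger stimulus")
    ok, frame = cap.read()
```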
