control
Scaling Up Bioimaging with Microfluidic Chips
Explore how microfluidic chips can enhance your imaging experiments by increasing control, throughput, or flexibility. In this remote, personalized workshop, participants will receive expert guidance, support, and chips to run tests on their own microscopes.
Open SPM: A Modular Framework for Scanning Probe Microscopy
OpenSPM aims to democratize innovation in the field of scanning probe microscopy (SPM), which is currently dominated by a few proprietary, closed systems that limit user-driven development. Our platform includes a high-speed OpenAFM head and base optimized for small cantilevers, an OpenAFM controller, a high-voltage amplifier, and interfaces compatible with several commercial AFM systems such as the Bruker Multimode, Nanosurf DriveAFM, Witec Alpha SNOM, Zeiss FIB-SEM XB550, and Nenovision Litescope. We have created a fully documented and community-driven OpenSPM platform, with training resources and sourcing information, which has already enabled the construction of more than 15 systems outside our lab. The controller is integrated with open-source tools like Gwyddion, HDF5, and Pycroscopy. We have also engaged external companies, two of which are integrating our controller into their products or interfaces. We see growing interest in applying parts of the OpenSPM platform to related techniques such as correlated microscopy, nanoindentation, and scanning electron/confocal microscopy. To support this, we are developing more generic and modular software, alongside a structured development workflow. A key feature of the OpenSPM system is its Python-based API, which makes the platform fully scriptable and ideal for AI and machine learning applications. This enables, for instance, automatic control and optimization of PID parameters, setpoints, and experiment workflows. With a growing contributor base and industry involvement, OpenSPM is well positioned to become a global, open platform for next-generation SPM innovation.
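For a sense of what that scriptability enables, the sketch below sweeps a proportional gain and keeps the value that minimizes the feedback error on a test scan line. The `openspm` module and every method name in it are assumptions made for illustration, not the actual OpenSPM API; the point is the pattern of setting PID gains and setpoints programmatically inside an optimization loop.

    # Hypothetical sketch of scripting an SPM controller from Python.
    # The module `openspm` and all calls below are assumed names for
    # illustration; consult the OpenSPM documentation for the real API.
    import numpy as np
    import openspm  # assumed module name

    scope = openspm.connect("192.168.0.10")    # assumed: connect to controller

    best_gain, best_error = None, np.inf
    for p_gain in np.linspace(0.1, 2.0, 20):   # sweep proportional gain
        scope.set_pid(p=p_gain, i=50.0, d=0.0) # assumed setter for PID params
        scope.set_setpoint(0.8)                # assumed deflection setpoint (V)
        line = scope.scan_line(pixels=256)     # assumed: acquire one scan line
        error = np.std(line.error_signal)      # RMS of the feedback error
        if error < best_error:
            best_gain, best_error = p_gain, error

    scope.set_pid(p=best_gain, i=50.0, d=0.0)  # keep the best-performing gain
    print(f"selected P gain: {best_gain:.2f}")

In practice, the same pattern extends to setpoints, scan parameters, and entire experiment workflows, which is what makes the platform amenable to AI- and machine-learning-driven control.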
Open Raman Microscopy (ORM): A modular Raman spectroscopy setup with an open-source controller
Raman spectroscopy is a powerful technique for identifying chemical species by probing their vibrational energy levels, offering exceptional specificity with a relatively simple setup involving a laser source, spectrometer, and microscope/probe. However, the high cost of Raman systems and their lack of modularity often limit exploratory research, hindering broader adoption. To address the need for an affordable, modular microscopy platform for multimodal imaging, we present a customizable confocal Raman spectroscopy setup alongside open-source acquisition software, the ORM (Open Raman Microscopy) Controller, developed in Python. This solution bridges the gap between expensive commercial systems and complex, custom-built setups used by specialist research groups. In this presentation, we will cover the components of the setup, the design rationale, assembly methods, limitations, and its modular potential for expanding functionality. Additionally, we will demonstrate ORM’s capabilities for instrument control, 2D and 3D Raman mapping, region-of-interest selection, and its adaptability to various instrument configurations. We will conclude by showcasing practical applications of this setup across different research fields.
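As an illustration of how such a mapping acquisition can work, here is a minimal sketch of a 2D Raman map: raster the stage, grab one spectrum per pixel, and assemble a datacube. The `orm_controller` module and its classes are assumed names for illustration, not the actual ORM Controller API.

    # Hypothetical sketch of a 2D Raman map acquisition loop. The module
    # `orm_controller` and its methods are assumed names, not the real API.
    import numpy as np
    import orm_controller as orm  # assumed module name

    stage = orm.Stage()           # assumed motorized-stage wrapper
    spec = orm.Spectrometer()     # assumed spectrometer wrapper

    nx, ny, step_um = 50, 50, 2.0               # 50 x 50 map, 2 um pixel pitch
    cube = np.zeros((ny, nx, spec.num_pixels))  # (y, x, wavelength) datacube

    for j in range(ny):
        for i in range(nx):
            stage.move_to(i * step_um, j * step_um)   # assumed absolute move
            cube[j, i] = spec.acquire(exposure_s=0.1) # one spectrum per pixel

    np.save("raman_map.npy", cube)  # region-of-interest analysis happens offline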
Optogenetic control of Nodal signaling patterns
Embryos issue instructions to their cells in the form of patterns of signaling activity. Within these patterns, the distribution of signaling in time and space directs the fate of embryonic cells. Tools to perturb developmental signaling with high resolution in space and time can help reveal how these patterns are decoded to make appropriate fate decisions. In this talk, I will present new optogenetic reagents and an experimental pipeline for creating designer Nodal signaling patterns in live zebrafish embryos. Our improved optoNodal reagents eliminate dark activity and improve response kinetics, without sacrificing dynamic range. We adapted an ultra-widefield microscopy platform for parallel light patterning in up to 36 embryos and demonstrated precise spatial control over Nodal signaling activity and downstream gene expression. Using this system, we demonstrate that patterned Nodal activation can initiate specification and internalization movements of endodermal precursors. Further, we used patterned illumination to generate synthetic signaling patterns in Nodal signaling mutants, rescuing several characteristic developmental defects. This study establishes an experimental toolkit for systematic exploration of Nodal signaling patterns in live embryos.
A Flexible Platform for Monitoring Cerebellum-Dependent Sensory Associative Learning
Climbing fiber inputs to Purkinje cells provide instructive signals critical for cerebellum-dependent associative learning. Studying these signals in head-fixed mice facilitates the use of imaging, electrophysiological, and optogenetic methods. Here, we describe a low-cost (~$1000) behavioral platform for tracking associative learning in head-fixed mice that locomote freely on a running wheel. The platform incorporates two common associative learning paradigms: eyeblink conditioning and delayed tactile startle conditioning. Behavior is tracked with a camera, and wheel movement with a detector. We describe the components and setup and provide a detailed protocol for training and data analysis. The platform supports the incorporation of optogenetic stimulation and fluorescence imaging, and its design allows a single host computer to control multiple platforms for training several animals simultaneously.
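To make the trial structure concrete, here is a minimal sketch of one delay eyeblink-conditioning trial, in which a light CS precedes and co-terminates with an air-puff US. The pin assignments, timings, and use of RPi.GPIO are illustrative assumptions, not the platform's actual implementation.

    # Hypothetical sketch of a delay eyeblink-conditioning trial on a
    # Raspberry-Pi-style controller. Pins and timings are assumed values.
    import time
    import random
    import RPi.GPIO as GPIO

    LED_CS, PUFF_US, CAMERA_TRIG = 17, 27, 22   # assumed BCM pin assignments
    GPIO.setmode(GPIO.BCM)
    GPIO.setup([LED_CS, PUFF_US, CAMERA_TRIG], GPIO.OUT, initial=GPIO.LOW)

    def run_trial(isi_ms=470, us_ms=30):
        """One trial: the CS lasts isi+us ms and co-terminates with the US."""
        GPIO.output(CAMERA_TRIG, GPIO.HIGH)  # gate video acquisition on
        GPIO.output(LED_CS, GPIO.HIGH)       # conditioned stimulus (light) on
        time.sleep(isi_ms / 1000)            # interstimulus interval
        GPIO.output(PUFF_US, GPIO.HIGH)      # unconditioned stimulus (air puff)
        time.sleep(us_ms / 1000)
        GPIO.output(PUFF_US, GPIO.LOW)       # CS and US co-terminate
        GPIO.output(LED_CS, GPIO.LOW)
        GPIO.output(CAMERA_TRIG, GPIO.LOW)

    for _ in range(100):                     # one training session
        run_trial()
        time.sleep(random.uniform(8, 12))    # randomized intertrial interval
    GPIO.cleanup()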
PiSpy: An Affordable, Accessible, and Flexible Imaging Platform for the Automated Observation of Organismal Biology and Behavior
A great deal of understanding can be gleaned from direct observation of organismal growth, development, and behavior. However, direct observation can be time-consuming and can influence the organism through unintentional stimuli. Additionally, video capture equipment is often prohibitively expensive, difficult to modify to one’s specific needs, and laden with unnecessary features. Here, we describe the PiSpy, a low-cost, automated video acquisition platform that uses a Raspberry Pi computer and camera to record video or images at specified time intervals or when externally triggered. All settings and controls, such as programmable light cycling, are accessible to users with no programming experience through an easy-to-use graphical user interface. Importantly, the entire PiSpy system can be assembled for less than $100 using laser-cut and 3D-printed components. We demonstrate the broad applications and flexibility of the PiSpy across a range of model and non-model organisms. Designs, instructions, and code can be accessed through an online repository, where a global community of PiSpy users can also contribute their own unique customizations and help grow the community of open-source research solutions.
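At its core, interval capture of this kind reduces to a simple loop, which PiSpy wraps in its GUI. Below is a minimal sketch using the picamera library; the resolution, interval, and file paths are assumed example values rather than PiSpy defaults.

    # Minimal sketch of interval image capture on a Raspberry Pi, the kind
    # of loop PiSpy automates behind its GUI. Interval, resolution, and
    # output path are illustrative assumptions.
    import time
    from datetime import datetime
    from picamera import PiCamera

    camera = PiCamera(resolution=(1920, 1080))
    camera.start_preview()
    time.sleep(2)                  # let auto-exposure and gain settle

    INTERVAL_S = 600               # one frame every 10 minutes
    while True:
        stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        camera.capture(f"/home/pi/pispy/{stamp}.jpg")  # timestamped frame
        time.sleep(INTERVAL_S)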
Building a Simple and Versatile Illumination System for Optogenetic Experiments
Controlling biological processes with light has increased the accuracy and speed with which researchers can manipulate them. Optical control allows for an unprecedented ability to dissect function and holds the potential for enabling novel genetic therapies. However, optogenetic experiments require adequate light sources with spatial, temporal, or intensity control, which are often a bottleneck for researchers. Here we detail how to build a low-cost and versatile LED illumination system that is easily customizable for different available optogenetic tools. The system is configurable for manual or computer control with adjustable LED intensity. We provide an illustrated step-by-step guide for building the circuit, making it computer-controlled, and constructing the LEDs. To facilitate the assembly of this device, we also discuss some basic soldering techniques and explain the circuitry used to control the LEDs. Using our open-source user interface, users can automate precise timing and pulsing of light on a personal computer (PC) or an inexpensive tablet. This automation makes the system useful for experiments that use LEDs to control genes, signaling pathways, and other cellular activities over long time scales. For this protocol, no prior expertise in electronics is required to build the parts or to use the illumination system for optogenetic experiments.
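In the computer-controlled configuration, PC-side automation of this kind typically comes down to sending simple commands over a serial link to the LED circuit. The sketch below assumes a one-byte on/off protocol and a port name; both are illustrative and will differ with the actual firmware and interface.

    # Hypothetical sketch of pulsing an optogenetics LED from a PC over a
    # serial link. The port name and the one-byte command protocol
    # ('1' = on, '0' = off) are assumptions, not the system's actual commands.
    import time
    import serial  # pyserial

    PORT = "/dev/ttyUSB0"          # assumed port; e.g. "COM3" on Windows
    led = serial.Serial(PORT, 9600, timeout=1)
    time.sleep(2)                  # many boards reset when the port opens

    def pulse_train(n_pulses, on_s, off_s):
        """Deliver n light pulses with the given on/off durations."""
        for _ in range(n_pulses):
            led.write(b"1")        # assumed command: LED on
            time.sleep(on_s)
            led.write(b"0")        # assumed command: LED off
            time.sleep(off_s)

    # e.g. 1 s of light every 30 s across an hour-long induction experiment
    pulse_train(n_pulses=120, on_s=1.0, off_s=29.0)
    led.close()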
Creating and controlling visual environments using BonVision
Real-time rendering of closed-loop visual environments is important for next-generation understanding of brain function and behaviour, but is often prohibitively difficult for non-experts to implement and is limited to few laboratories worldwide. We developed BonVision as an easy-to-use open-source software for the display of virtual or augmented reality, as well as standard visual stimuli. BonVision has been tested on humans and mice, and is capable of supporting new experimental designs in other animal models of vision. As the architecture is based on the open-source Bonsai graphical programming language, BonVision benefits from native integration with experimental hardware. BonVision therefore enables easy implementation of closed-loop experiments, including real-time interaction with deep neural networks, and communication with behavioural and physiological measurement and manipulation devices.
OpenFlexure
OpenFlexure is a 3D-printed flexure translation stage developed by a group at the University of Bath. The stage is capable of sub-micron-scale motion with very low drift over time, which makes it well suited, among other things, to time-lapse protocols that run over days or weeks and to space-restricted areas such as fume hoods.
Feeding Experimentation Device version 3 (FED3)
FED3 is a device for behavioral training of mice in vivarium home-cages. Mice interact with FED3 through two nose-pokes, and FED3 responds with visual stimuli, auditory stimuli, and by dispensing pellets. Because it operates in the home-cage, FED3 can be used for around-the-clock training of mice over several weeks. FED3 is open-source and can be built by users for ~10-20x less than commercial solutions for training mice. The control code is also open-source and was designed to be easily modified by users.
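As an illustration of the kind of task logic a user might modify, here is a minimal Python sketch of a two-poke, fixed-ratio-style program: one poke triggers cues and a pellet, the other is logged only. FED3's real firmware targets Arduino-compatible hardware, and the `device` object and its methods below are assumed names, not FED3's actual code.

    # Illustrative Python sketch of a simple two-poke training program of
    # the kind FED3 runs. `device` and its methods are assumed names.
    import csv
    import time

    def run_session(device, log_path="fed3_log.csv", max_pellets=60):
        with open(log_path, "w", newline="") as f:
            log = csv.writer(f)
            log.writerow(["time_s", "event"])
            pellets = 0
            t0 = time.time()
            while pellets < max_pellets:
                poke = device.wait_for_poke()   # assumed: blocks until a poke,
                                                # returns "left" or "right"
                log.writerow([round(time.time() - t0, 2), f"{poke}_poke"])
                if poke == "left":              # left poke is the active one
                    device.beep()               # assumed auditory cue
                    device.flash_leds()         # assumed visual cue
                    device.dispense_pellet()    # assumed pellet dispense
                    pellets += 1
                    log.writerow([round(time.time() - t0, 2), "pellet"])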