Latest

Seminar · Open Source

The SIMple microscope: Development of a fibre-based platform for accessible SIM imaging in unconventional environments

Rebecca McClelland
PhD student at the University of Cambridge, United Kingdom.
Aug 26, 2025

Advancements in imaging speed, depth and resolution have made structured illumination microscopy (SIM) an increasingly powerful optical sectioning (OS) and super-resolution (SR) technique, but these developments remain inaccessible to many life science researchers due to the cost, optical complexity and delicacy of these instruments. We address these limitations by redesigning the optical path using in-line fibre components that are compact, lightweight and easily assembled in a “Plug & Play” modality, without compromising imaging performance. They can be integrated into an existing widefield microscope with a minimum of optical components and alignment, making OS-SIM more accessible to researchers with less optics experience. We also demonstrate a complete SR-SIM imaging system with dimensions 300 mm × 300 mm × 450 mm. We propose to make SIM imaging more accessible by exploiting this compact, lightweight and robust design to transport the system to where it is needed, enabling imaging in “unconventional” environments where factors such as temperature and biosafety considerations currently limit imaging experiments.

Seminar · Open Source

Open SPM: A Modular Framework for Scanning Probe Microscopy

Marcos Penedo Garcia
Senior scientist, LBNI-IBI, EPFL Lausanne, Switzerland
Jun 24, 2025

OpenSPM aims to democratize innovation in the field of scanning probe microscopy (SPM), which is currently dominated by a few proprietary, closed systems that limit user-driven development. Our platform includes a high-speed OpenAFM head and base optimized for small cantilevers, an OpenAFM controller, a high-voltage amplifier, and interfaces compatible with several commercial AFM systems such as the Bruker Multimode, Nanosurf DriveAFM, Witec Alpha SNOM, Zeiss FIB-SEM XB550, and Nenovision Litescope. We have created a fully documented and community-driven OpenSPM platform, with training resources and sourcing information, which has already enabled the construction of more than 15 systems outside our lab. The controller is integrated with open-source tools like Gwyddion, HDF5, and Pycroscopy. We have also engaged external companies, two of which are integrating our controller into their products or interfaces. We see growing interest in applying parts of the OpenSPM platform to related techniques such as correlated microscopy, nanoindentation, and scanning electron/confocal microscopy. To support this, we are developing more generic and modular software, alongside a structured development workflow. A key feature of the OpenSPM system is its Python-based API, which makes the platform fully scriptable and ideal for AI and machine learning applications. This enables, for instance, automatic control and optimization of PID parameters, setpoints, and experiment workflows. With a growing contributor base and industry involvement, OpenSPM is well positioned to become a global, open platform for next-generation SPM innovation.
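
To give a concrete feel for what "fully scriptable" can mean in practice, the sketch below shows how a feedback-tuning and scanning workflow might be driven from Python. It is purely illustrative: the class and method names (OpenSPM, set_pid, approach, start_scan) are placeholders of my own, not the documented OpenSPM API.

```python
# Hypothetical sketch only: class and method names are placeholders,
# not the actual OpenSPM Python API.

class OpenSPM:
    """Stand-in for a scriptable SPM controller."""

    def set_pid(self, p, i, d):
        print(f"PID gains set to P={p}, I={i}, D={d}")

    def approach(self, setpoint):
        print(f"Approaching surface, feedback setpoint {setpoint}")

    def start_scan(self, size_um, pixels):
        print(f"Scanning {size_um} um x {size_um} um at {pixels} x {pixels} px")


def tune_and_scan(spm, gains=(0.5, 10.0, 0.0), setpoint=0.8):
    """Example of the kind of workflow a scriptable controller enables:
    set feedback gains, approach the surface, then start a scan."""
    spm.set_pid(*gains)
    spm.approach(setpoint)
    spm.start_scan(size_um=5.0, pixels=256)


if __name__ == "__main__":
    tune_and_scan(OpenSPM())
```

A script of this shape is also the natural entry point for the AI and machine learning use cases mentioned above, since an optimization loop can call the same methods repeatedly while scoring the resulting images.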

Seminar · Open Source

Optogenetic control of Nodal signaling patterns

Nathan Lord
Assistant Professor, Department of Computational and Systems Biology
Sep 20, 2024

Embryos issue instructions to their cells in the form of patterns of signaling activity. Within these patterns, the distribution of signaling in time and space directs the fate of embryonic cells. Tools to perturb developmental signaling with high resolution in space and time can help reveal how these patterns are decoded to make appropriate fate decisions. In this talk, I will present new optogenetic reagents and an experimental pipeline for creating designer Nodal signaling patterns in live zebrafish embryos. Our improved optoNodal reagents eliminate dark activity and improve response kinetics, without sacrificing dynamic range. We adapted an ultra-widefield microscopy platform for parallel light patterning in up to 36 embryos and demonstrated precise spatial control over Nodal signaling activity and downstream gene expression. Using this system, we demonstrate that patterned Nodal activation can initiate specification and internalization movements of endodermal precursors. Further, we used patterned illumination to generate synthetic signaling patterns in Nodal signaling mutants, rescuing several characteristic developmental defects. This study establishes an experimental toolkit for systematic exploration of Nodal signaling patterns in live embryos.

Seminar · Open Source

Open source FPGA tools for building research devices

Edmund Humenberger
CEO @ Symbiotic EDA
Jun 25, 2024

Edmund will discuss why FPGAs are worth using when building scientific instruments; when and why to use open-source FPGA tools; the history and current status of their development; the FPGA families and functions currently supported; ongoing developments in design languages and tools; the community; freely available design blocks; and possible future developments.

Seminar · Open Source · Recording

Development of an open-source femtosecond fiber laser system for multiphoton microscopy

Bryan Spring
Northeastern University
Apr 19, 2023

This talk will present a low-cost protocol for fabricating an easily constructed femtosecond (fs) fiber laser system suitable for routine multiphoton microscopy (1060–1080 nm, 1 W average power, 70 fs pulse duration, 30–70 MHz repetition rate). Concepts that are well known in the laser physics community and essential to proper laser operation, but generally obscure to biophysicists and biomedical engineers, will be clarified. The parts list (~US $13K), the equipment list (~US $40K+), and the intellectual investment needed to build the laser will be described. A goal of the presentation will be to engage with the audience to discuss the trade-offs associated with a custom-built fs fiber laser versus purchasing a commercial system. I will also touch on my research group’s plans to further develop this custom laser system for multiplexed cancer imaging, as well as recent developments in the field that promise even higher-performance fs fiber lasers for approximately the same cost and ease of construction.
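
For intuition about these specifications, a quick back-of-the-envelope calculation relates the quoted average power, repetition rate and pulse duration to per-pulse energy and peak power (taking the mid-range 50 MHz repetition rate and ignoring pulse-shape factors):

```latex
E_{\text{pulse}} = \frac{P_{\text{avg}}}{f_{\text{rep}}} = \frac{1\,\text{W}}{50\,\text{MHz}} = 20\,\text{nJ},
\qquad
P_{\text{peak}} \approx \frac{E_{\text{pulse}}}{\tau} = \frac{20\,\text{nJ}}{70\,\text{fs}} \approx 0.3\,\text{MW}
```

It is this high peak power at modest average power that makes such sources well suited to two-photon excitation.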

Seminar · Open Source · Recording

An open-source miniature two-photon microscope for large-scale calcium imaging in freely moving mice

Weijian Zong
Kavli Institute for Systems Neuroscience, Norwegian University of Science and Technology
Sep 12, 2022

Because benchtop imaging is unsuitable for tasks that require unrestrained movement, investigators have tried, for almost two decades, to develop miniature 2P microscopes (2P miniscopes) that can be carried on the head of freely moving animals. In this talk, I will first briefly review the development history of this technique, and then report our latest progress on a new generation of 2P miniscope, MINI2P, which overcomes the limits of previous versions by both meeting the requirements for fatigue-free exploratory behavior during extended recording periods and increasing the cell yield by an order of magnitude, to thousands of neurons. The performance and reliability of MINI2P are validated by recordings of spatially tuned neurons in three brain regions and in three behavioral assays. All information about MINI2P is open access, with instruction videos, code, and manuals on public repositories, and workshops will be organized to help new users get started. MINI2P permits large-scale, high-resolution calcium imaging in freely moving mice, and opens the door to investigating brain function during unconstrained natural behaviors.

Seminar · Open Source · Recording

PiSpy: An Affordable, Accessible, and Flexible Imaging Platform for the Automated Observation of Organismal Biology and Behavior

Gregory Pask and Benjamin Morris
Middlebury College
Apr 20, 2022

A great deal of understanding can be gleaned from direct observation of organismal growth, development, and behavior. However, direct observation can be time consuming and influence the organism through unintentional stimuli. Additionally, video capturing equipment can often be prohibitively expensive, difficult to modify to one’s specific needs, and may come with unnecessary features. Here, we describe the PiSpy, a low-cost, automated video acquisition platform that uses a Raspberry Pi computer and camera to record video or images at specified time intervals or when externally triggered. All settings and controls, such as programmable light cycling, are accessible to users with no programming experience through an easy-to-use graphical user interface. Importantly, the entire PiSpy system can be assembled for less than $100 using laser-cut and 3D-printed components. We demonstrate the broad applications and flexibility of the PiSpy across a range of model and non-model organisms. Designs, instructions, and code can be accessed through an online repository, where a global community of PiSpy users can also contribute their own unique customizations and help grow the community of open-source research solutions.
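
As a rough illustration of the kind of acquisition loop such a platform automates (this is not the PiSpy code itself), the following Python sketch captures a timestamped still image at a fixed interval or whenever a digital input fires. It assumes a Raspberry Pi with the picamera and gpiozero libraries installed and a trigger (button, PIR sensor, etc.) wired to GPIO 17.

```python
# Illustrative sketch only -- not the PiSpy source. Assumes a Raspberry Pi with
# the picamera and gpiozero libraries and a trigger wired to GPIO 17.
import time
from datetime import datetime

from picamera import PiCamera   # legacy Raspberry Pi camera stack
from gpiozero import Button     # any digital trigger: pushbutton, PIR sensor, ...

camera = PiCamera(resolution=(1640, 1232))
trigger = Button(17)

INTERVAL_S = 60  # one time-lapse frame per minute


def capture(reason):
    """Save a timestamped still and note what triggered it."""
    name = datetime.now().strftime(f"%Y%m%d_%H%M%S_{reason}.jpg")
    camera.capture(name)
    print("saved", name)


trigger.when_pressed = lambda: capture("trigger")  # event-driven capture

while True:                                        # scheduled time-lapse capture
    capture("timelapse")
    time.sleep(INTERVAL_S)
```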

Seminar · Open Source · Recording

Get more from your ISH brain slices with Stalefish

Seb James
Department of Psychology, The University of Sheffield
Oct 13, 2021

The standard method for staining structures in the brain is to slice the brain into 2D sections. Each slice is treated using a technique such as in-situ hybridization to examine the spatial expression of a particular molecule at a given developmental timepoint. Depending on the brain structures being studied, slices can be made coronally, sagittally, or at any angle that is thought to be optimal for analysis. However, assimilating the information presented in the 2D slice images to gain quantitative and informative 3D expression patterns is challenging. Even if expression levels are presented as voxels, to give 3D expression clouds, it can be difficult to compare expression across individuals, and analysing such data requires significant expertise and imagination. In this talk, I will describe a new approach to examining histology slices, in which the user defines the brain structure of interest by drawing curves around it on each slice in a set, along with the depth of tissue from which to sample expression. The sampled 'curves' are then assembled into a 3D surface, which can be transformed onto a common reference frame for comparative analysis. I will show how other neuroscientists can obtain and use the tool, which is called Stalefish, to analyse their own image data with no (or minimal) changes to their slice preparation workflow.
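
To make the idea of assembling per-slice curves into a 3D dataset concrete, here is a small conceptual sketch (not the Stalefish implementation): each slice contributes sample positions along the user-drawn curve plus an expression value, and slices are stacked along the cutting axis using an assumed slice thickness.

```python
# Conceptual sketch, not Stalefish code: assemble per-slice curve samples into
# a single 3D point cloud carrying an expression value at each sample.
import numpy as np

SLICE_THICKNESS_UM = 50.0  # assumed section thickness


def assemble(slices):
    """slices: list of (xy_points, expression) pairs, one per section.
    xy_points is an (N, 2) array of sample positions along the drawn curve;
    expression is an (N,) array of signal sampled beneath the curve."""
    points = []
    for z_index, (xy, expr) in enumerate(slices):
        z = np.full((len(xy), 1), z_index * SLICE_THICKNESS_UM)
        points.append(np.hstack([xy, z, expr[:, None]]))
    return np.vstack(points)  # (total_N, 4): x, y, z, expression


# toy example: two slices, three samples each
demo = [(np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]]), np.array([0.2, 0.5, 0.9])),
        (np.array([[0.0, 1.0], [1.0, 1.0], [2.0, 1.0]]), np.array([0.1, 0.4, 0.8]))]
cloud = assemble(demo)
print(cloud.shape)  # (6, 4)
```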

Seminar · Open Source · Recording

Autopilot v0.4.0 - Distributing development of a distributed experimental framework

Jonny Saunders
University of Oregon
Sep 29, 2021

Autopilot is a Python framework for performing complex behavioral neuroscience experiments by coordinating a swarm of Raspberry Pis. It was designed not only to give researchers a tool for the hardware-intensive experiments needed for the next generation of naturalistic neuroscientific observation, but also to make it easier for scientists to be good stewards of the human knowledge project. Specifically, we designed Autopilot as a framework that lets its users contribute their technical expertise to a cumulative library of hardware interfaces and experimental designs, and produce data that is clean at the time of acquisition, lowering barriers to open scientific practices. As Autopilot matures, we have been progressively making these aspirations a reality. We are currently preparing the release of Autopilot v0.4.0, which will include a new plugin system and a wiki that uses semantic web technology to build a technical and contextual knowledge repository. By combining human-readable text and semantic annotations in a wiki that makes contribution as easy as possible, we intend to create a communal knowledge system that provides a mechanism for sharing the contextual technical knowledge that is always excluded from methods sections but is nonetheless necessary to perform cutting-edge experiments. By integrating it with Autopilot, we hope to make a first-of-its-kind system that allows researchers to fluidly blend technical knowledge and open source hardware designs with the software necessary to use them. Reciprocally, we also hope that this system will support a kind of deep provenance that makes abstract "custom apparatus" statements in methods sections obsolete, allowing the scientific community to losslessly and effortlessly trace a dataset back to the code and hardware designs needed to replicate it. I will describe the basic architecture of Autopilot, recent work on its community contribution ecosystem, and the vision for the future of its development.

Seminar · Open Source · Recording

Open-source tools for systems neuroscience

Jakob Voigts
MIT and Open Ephys
Jun 25, 2021

Open-source tools are gaining an increasing foothold in neuroscience. The rising complexity of experiments in systems neuroscience has created a need for multiple parts of an experiment to work together seamlessly. Open-source tools that freely interact with each other, and that can be understood and modified more easily, therefore allow scientists to conduct better experiments with less effort than closed tools. Open Ephys is an organization with team members distributed all around the world. Our mission is to advance our understanding of the brain by promoting community ownership of the tools we use to study it. We make and distribute cutting-edge tools that exploit modern technology to bring down the price and complexity of neuroscience experiments. A large component of this is taking tools that were developed in academic labs and helping with their documentation, support, and distribution. More recently, we have been working on bringing high-quality manufacturing, distribution, warranty, and support to open-source tools by partnering with OEPS in Portugal. We are now also establishing standards that make it possible to seamlessly combine methods such as miniaturized microscopes, electrode drive implants, and silicon probes in one system. In the longer term, our development of new tools and interfaces and our standardization efforts aim to make it possible for scientists to easily run complex experiments spanning complex behaviors and tasks, multiple recording modalities, and easy access to data-processing pipelines.

Seminar · Open Source · Recording

SpikeInterface

Alessio Buccino
ETH Zurich
Jun 11, 2021

Much development has been directed toward improving the performance and automation of spike sorting. This continuous development, while essential, has contributed to an over-saturation of new, incompatible tools that hinders rigorous benchmarking and complicates reproducible analysis. To address these limitations, we developed SpikeInterface, a Python framework designed to unify preexisting spike sorting technologies into a single codebase and to facilitate straightforward comparison and adoption of different approaches. With a few lines of code, researchers can reproducibly run, compare, and benchmark most modern spike sorting algorithms; pre-process, post-process, and visualize extracellular datasets; validate, curate, and export sorting outputs; and more. In this presentation, I will provide an overview of SpikeInterface and, with applications to real and simulated datasets, demonstrate how it can be utilized to reduce the burden of manual curation and to more comprehensively benchmark automated spike sorters.
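
A minimal sketch of that "few lines of code" workflow is shown below, assuming SpikeInterface and at least one supported sorter are installed; function names follow the SpikeInterface documentation at the time of writing and may differ between versions.

```python
# Minimal sketch; assumes spikeinterface and the chosen sorter are installed.
# Function names follow the SpikeInterface docs but may differ between versions.
import spikeinterface.full as si

# Simulated recording with known ground-truth spike trains
recording, sorting_true = si.toy_example(num_channels=4, num_units=10, duration=30, seed=0)

# Pre-process, then run a spike sorter through the common interface
recording_f = si.bandpass_filter(recording, freq_min=300, freq_max=6000)
sorting = si.run_sorter("tridesclous", recording_f)

# Benchmark the sorter output against the ground truth
comparison = si.compare_sorter_to_ground_truth(sorting_true, sorting)
print(comparison.get_performance())
```

Swapping in a different sorter is a one-word change to the run_sorter call, which is what makes side-by-side benchmarking straightforward.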

Seminar · Open Source · Recording

Suite2p: a multipurpose functional segmentation pipeline for cellular imaging

Carsen Stringer
HHMI Janelia Research Campus
May 21, 2021

The combination of two-photon microscopy recordings and powerful calcium-dependent fluorescent sensors enables simultaneous recording of unprecedentedly large populations of neurons. While these sensors have matured over several generations of development, computational methods to process their fluorescence are often inefficient and the results hard to interpret. Here we introduce Suite2p: a fast, accurate, parameter-free and complete pipeline that registers raw movies, detects active and/or inactive cells (using Cellpose), extracts their calcium traces and infers their spike times. Suite2p runs faster than real time on standard workstations and outperforms state-of-the-art methods on newly developed ground-truth benchmarks for motion correction and cell detection.
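
As a usage sketch, assuming suite2p is installed, the pipeline can be run on a folder of raw TIFF movies in a few lines; option names follow the suite2p documentation at the time of writing and may differ between versions, and the data path is a placeholder.

```python
# Usage sketch, assuming suite2p is installed; option names may differ between versions.
import suite2p

ops = suite2p.default_ops()        # default registration / detection / extraction settings
ops["fs"] = 30.0                   # imaging frame rate (Hz)
ops["tau"] = 1.0                   # sensor decay timescale used for spike deconvolution (s)

db = {"data_path": ["/path/to/tiffs"]}  # folder(s) containing the raw movies (placeholder path)

output_ops = suite2p.run_s2p(ops=ops, db=db)  # registers, detects cells, extracts traces, deconvolves
```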
