Understanding Machine Learning via Exactly Solvable Statistical Physics Models
The affinity between statistical physics and machine learning has a long history, reflected even in machine learning terminology, which is in part adopted from physics. I will describe the main lines of this long-lasting friendship in the context of current theoretical challenges and open questions about deep learning. Theoretical physics often proceeds by studying exactly solvable synthetic models; I will describe the related line of work on solvable models of simple feed-forward neural networks. I will highlight a path forward to capture the subtle interplay between the structure of the data, the architecture of the network, and the optimization algorithms commonly used for learning.
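As a concrete illustration of what "exactly solvable" means here, a standard textbook example (not necessarily the model treated in the talk) is the teacher-student perceptron, where the generalization error admits a closed form:

```latex
% Teacher-student perceptron: a teacher with weights $w^*$ labels i.i.d.
% Gaussian inputs $x \in \mathbb{R}^N$ via $y = \mathrm{sign}(w^* \cdot x)$,
% and a student $w$ learns from $n = \alpha N$ such examples. The
% generalization error depends on the data distribution only through the
% teacher-student overlap $R$:
\[
  \varepsilon_g \;=\; \frac{1}{\pi}\arccos(R),
  \qquad
  R \;=\; \frac{w \cdot w^*}{\lVert w \rVert \, \lVert w^* \rVert},
\]
% and statistical physics techniques (e.g. the replica method) yield
% $R(\alpha)$ in closed form in the high-dimensional limit $N \to \infty$.
```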
Toward an open science ecosystem for neuroimaging
It is now widely accepted that openness and transparency are key to improving the reproducibility of scientific research, but many challenges to the adoption of these practices remain. I will discuss the growth of an ecosystem for open science within the field of neuroimaging, focusing on platforms for open data sharing and open-source tools for reproducible data analysis. I will also discuss the role of the Brain Imaging Data Structure (BIDS), a community standard for data organization, in enabling this open science ecosystem, and will outline the scientific impact of these resources.
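For concreteness, a minimal sketch of the directory layout that BIDS prescribes (the dataset name, subject labels, and task name below are hypothetical placeholders):

```
my_bids_dataset/                      <- dataset root
├── dataset_description.json          <- required metadata (name, BIDS version)
├── participants.tsv                  <- one row per subject
├── sub-01/
│   ├── anat/
│   │   └── sub-01_T1w.nii.gz         <- structural MRI
│   └── func/
│       ├── sub-01_task-rest_bold.nii.gz
│       └── sub-01_task-rest_bold.json
└── sub-02/
    └── ...
```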
Reproducible EEG from raw data to publication figures
In this talk I will present recent developments in data sharing, organization, and analysis that make it possible to build fully reproducible workflows. First, I will present the Brain Imaging Data Structure (BIDS) and discuss how it supports such workflows, showing some new tools to read, import, and create studies from EEG data organized that way. Second, I will present several newly developed tools for reproducible pre-processing and statistical analyses. Although it does take some extra effort, I will argue that it is largely feasible to make most EEG data analyses fully reproducible.
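A minimal sketch (not taken from the talk) of one such workflow step, reading a BIDS-formatted EEG recording with the open-source MNE-BIDS tool; the dataset root and the subject/task entities below are hypothetical:

```python
from mne_bids import BIDSPath, read_raw_bids

# Point at a recording inside a (hypothetical) BIDS dataset.
bids_path = BIDSPath(
    root="my_eeg_dataset",  # hypothetical BIDS dataset root
    subject="01",
    task="rest",
    datatype="eeg",
    suffix="eeg",
)

raw = read_raw_bids(bids_path=bids_path)  # returns an mne.io.Raw object
raw.load_data()                           # load into memory before filtering
raw.filter(l_freq=1.0, h_freq=40.0)       # example band-pass pre-processing step
```

Because every file location is determined by the standard rather than by lab-specific conventions, the same script runs unchanged on any BIDS-compliant EEG dataset, which is what makes such workflows reproducible end to end.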