
Data Structure

Topic · World Wide

data structure

Discover seminars, jobs, and research tagged with data structure across World Wide.
4 curated items · 4 seminars
Updated almost 3 years ago
4 results
Seminar · Neuroscience · Recording

Understanding Machine Learning via Exactly Solvable Statistical Physics Models

Lenka Zdeborová
EPFL
Feb 7, 2023

The affinity between statistical physics and machine learning has a long history. I will describe the main lines of this long-lasting friendship in the context of current theoretical challenges and open questions about deep learning. Theoretical physics often proceeds in terms of solvable synthetic models; I will describe the related line of work on solvable models of simple feed-forward neural networks. I will highlight a path forward to capture the subtle interplay between the structure of the data, the architecture of the network, and the optimization algorithms commonly used for learning.

Seminar · Neuroscience · Recording

Reproducible EEG from raw data to publication figures

Cyril Pernet
University of Edinburgh, UK
Jan 6, 2021

In this talk I will present recent developments in data sharing, organization, and analysis that allow fully reproducible workflows to be built. First, I will present the Brain Imaging Data Structure (BIDS) and discuss how it allows workflows to be built, showing some new tools to read, import, and create studies from EEG data structured that way. Second, I will present several newly developed tools for reproducible pre-processing and statistical analyses. Although it does take some extra effort, I will argue that it is largely feasible to make most EEG data analysis fully reproducible.
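At its core, BIDS is a naming and directory convention: each subject gets a `sub-<label>` folder containing modality subfolders (here `eeg/`), and filenames encode entities such as subject and task. As a rough, hypothetical sketch (not a tool from the talk — the helper name and arguments are invented for illustration), a BIDS-style EEG file path can be assembled like this:

```python
from pathlib import PurePosixPath

def bids_eeg_path(root: str, subject: str, task: str, ext: str = ".edf") -> PurePosixPath:
    """Assemble a BIDS-style EEG file path, e.g.
    root/sub-01/eeg/sub-01_task-rest_eeg.edf
    (illustrative helper, not part of any official BIDS tool)."""
    filename = f"sub-{subject}_task-{task}_eeg{ext}"
    return PurePosixPath(root) / f"sub-{subject}" / "eeg" / filename

print(bids_eeg_path("my_study", "01", "rest"))
# my_study/sub-01/eeg/sub-01_task-rest_eeg.edf
```

Because the layout is fully predictable from the entities, analysis tools can locate every subject's data without per-study configuration, which is what makes BIDS-organized datasets amenable to reproducible, automated workflows.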

Seminar · Neuroscience · Recording

Understanding machine learning via exactly solvable statistical physics models

Lenka Zdeborová
CNRS & CEA Saclay
Jun 23, 2020

The affinity between statistical physics and machine learning has a long history; this is reflected even in machine learning terminology, which is in part adopted from physics. I will describe the main lines of this long-lasting friendship in the context of current theoretical challenges and open questions about deep learning. Theoretical physics often proceeds in terms of solvable synthetic models; I will describe the related line of work on solvable models of simple feed-forward neural networks. I will highlight a path forward to capture the subtle interplay between the structure of the data, the architecture of the network, and the learning algorithm.