
Statistical Learning

Topic spotlight

Discover seminars, jobs, and research tagged with statistical learning across World Wide.
15 curated items · 6 Seminars · 6 ePosters · 3 Positions
Updated about 23 hours ago
Position

Flavia Mancini

Computational and Biological Learning Research Group, Department of Engineering, University of Cambridge
University of Cambridge
Dec 5, 2025

1 Postdoc: simulating and modelling the neural dynamics involved in statistical/aversive learning and homeostatic/pain regulation, with the aim of developing new projects. 1 Research Assistant: conducting behavioral and neuroimaging experiments.

Position

Nicola Torelli

Department of Economics, Business, Mathematics and Statistics (DEAMS), University of Trieste
University of Trieste, Italy
Dec 5, 2025

Research fellowship opportunity in 'Methods and models for artificial intelligence and statistical and automatic learning for the innovation of business processes and decisions' (original title: 'Metodi e modelli per l'intelligenza artificiale e l'apprendimento statistico e automatico per l'innovazione dei processi e delle decisioni aziendali'). The fellowship is for 1 year, with the possibility of extension. The research grant is issued at the Department of Economics, Business, Mathematics and Statistics (DEAMS) of the University of Trieste (Italy); the scientific coordinator is Prof. Nicola Torelli.

Position · Data Science

Tiago de Paula Peixoto

IT:U
Austria
Dec 5, 2025

Call for 5 open-rank positions at IT:U, a new public university recently founded in Austria. One of its focus areas is Theoretical Foundations of Data Science, engaging with areas such as the mathematics of data science and statistical learning, as well as specific topics like topological data analysis and causality. “Data Science” is understood very broadly here. The positions are attractive and include permanent (i.e. recurring) funding for a number of PhD students and postdocs, depending on rank.

Seminar · Neuroscience

Learning representations of specifics and generalities over time

Anna Schapiro
University of Pennsylvania
Apr 11, 2024

There is a fundamental tension between storing discrete traces of individual experiences, which allows recall of particular moments in our past without interference, and extracting regularities across these experiences, which supports generalization and prediction in similar situations in the future. One influential proposal for how the brain resolves this tension is that it separates the processes anatomically into Complementary Learning Systems, with the hippocampus rapidly encoding individual episodes and the neocortex slowly extracting regularities over days, months, and years. But this does not explain our ability to learn and generalize from new regularities in our environment quickly, often within minutes. We have put forward a neural network model of the hippocampus that suggests that the hippocampus itself may contain complementary learning systems, with one pathway specializing in the rapid learning of regularities and a separate pathway handling the region’s classic episodic memory functions. This proposal has broad implications for how we learn and represent novel information of specific and generalized types, which we test across statistical learning, inference, and category learning paradigms. We also explore how this system interacts with slower-learning neocortical memory systems, with empirical and modeling investigations into how the hippocampus shapes neocortical representations during sleep. Together, the work helps us understand how structured information in our environment is initially encoded and how it then transforms over time.
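
As a toy illustration of the complementary-learning-systems idea, the sketch below contrasts a fast and a slow delta-rule learner tracking a regularity that changes partway through a session. This is a minimal sketch of the general principle (two pathways with different learning rates), not the hippocampal network model described in the talk; all names and parameter values are hypothetical.

```python
# Toy complementary learning systems: one fast and one slow delta-rule
# learner tracking the same changing regularity. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def run(n_trials=500, switch=250):
    # Environment: a scalar "regularity" that flips sign mid-session.
    target = np.where(np.arange(n_trials) < switch, 1.0, -1.0)
    observations = target + 0.3 * rng.standard_normal(n_trials)

    fast, slow = 0.0, 0.0          # current estimates of the regularity
    lr_fast, lr_slow = 0.30, 0.01  # fast pathway vs slow neocortex-like system
    fast_trace, slow_trace = [], []
    for obs in observations:
        fast += lr_fast * (obs - fast)   # delta-rule update, high learning rate
        slow += lr_slow * (obs - slow)   # same rule, low learning rate
        fast_trace.append(fast)
        slow_trace.append(slow)
    return np.array(fast_trace), np.array(slow_trace), target

fast_trace, slow_trace, target = run()
# The fast system tracks the new regularity within a few trials of the
# switch; the slow system is smoother but lags far behind.
for name, trace in [("fast", fast_trace), ("slow", slow_trace)]:
    err = np.abs(trace[260:300] - target[260:300]).mean()
    print(f"{name} system, mean error just after the switch: {err:.2f}")
```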

Seminar · Psychology

A Better Method to Quantify Perceptual Thresholds: Parameter-free, Model-free, Adaptive procedures

Julien Audiffren
University of Fribourg
Feb 28, 2023

The ‘quantification’ of perception is arguably one of the most important and most difficult aspects of perception research. This is particularly true in visual perception, where the evaluation of the perceptual threshold is a pillar of the experimental process. The choice of the correct adaptive psychometric procedure, as well as the selection of the proper parameters, is a difficult but key aspect of the experimental protocol. For instance, Bayesian methods such as QUEST require the a priori choice of a family of functions (e.g. Gaussian), which is rarely known before the experiment, as well as the specification of multiple parameters. Importantly, the choice of an ill-fitting function or parameters will induce costly mistakes and errors in the experimental process. In this talk we discuss the existing methods and introduce a new adaptive procedure to solve this problem, named ZOOM (Zooming Optimistic Optimization of Models), based on recent advances in optimization and statistical learning. Compared to existing approaches, ZOOM is completely parameter-free and model-free, i.e. it can be applied to any psychometric problem. Moreover, ZOOM's parameters are self-tuned and thus do not need to be chosen manually using heuristics (e.g. the step size in the staircase method), preventing further errors. Finally, ZOOM is based on state-of-the-art optimization theory, providing strong mathematical guarantees that are missing from many of its alternatives, while being the most accurate and robust in real-life conditions. In our experiments and simulations, ZOOM was found to be significantly better than its alternatives, in particular for difficult psychometric functions or when parameters were not properly chosen. ZOOM is open source, and its implementation is freely available on the web. Given these advantages and its ease of use, we argue that ZOOM can improve the process of many psychophysics experiments.
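
For contrast with ZOOM's parameter-free approach, here is a minimal sketch of the kind of hand-tuned baseline the abstract alludes to: a classic 1-up/2-down staircase whose step size must be picked by the experimenter. The simulated observer and all parameter values below are illustrative assumptions; ZOOM's own algorithm is not reproduced here.

```python
# Classic 1-up/2-down staircase for threshold estimation. Shown only as
# the kind of heuristic baseline ZOOM is contrasted against.
import numpy as np

rng = np.random.default_rng(1)
true_threshold = 0.4

def observer_detects(intensity):
    # Hypothetical observer with a logistic psychometric function.
    p = 1.0 / (1.0 + np.exp(-(intensity - true_threshold) / 0.05))
    return rng.random() < p

intensity = 1.0
step = 0.1                  # the hand-picked step size the talk warns about
consecutive_correct = 0
reversals, last_direction = [], None

while len(reversals) < 10:
    if observer_detects(intensity):
        consecutive_correct += 1
        if consecutive_correct < 2:
            continue                            # need 2 correct to step down
        consecutive_correct = 0
        direction, intensity = "down", intensity - step
    else:
        consecutive_correct = 0
        direction, intensity = "up", intensity + step
    if last_direction is not None and direction != last_direction:
        reversals.append(intensity)             # record a reversal point
    last_direction = direction

# A 1-up/2-down rule converges near the 70.7%-correct point.
print(f"estimated threshold: {np.mean(reversals[2:]):.2f}")
```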

Seminar · Cognition · Recording

Rethinking Statistical Learning

Morten H. Christiansen
Cornell University
May 16, 2022
Seminar · Neuroscience · Recording

Learning the structure and investigating the geometry of complex networks

Robert Peach and Alexis Arnaudon
Imperial College
Sep 23, 2021

Networks are widely used as mathematical models of complex systems across many scientific disciplines, and in particular within neuroscience. In this talk, we introduce two aspects of our collaborative research: (1) machine learning and networks, and (2) graph dimensionality.

Machine learning and networks. Decades of work have produced a vast corpus of research characterising the topological, combinatorial, statistical and spectral properties of graphs. Each graph property can be thought of as a feature that captures important (and sometimes overlapping) characteristics of a network. We have developed hcga, a framework for highly comparative analysis of graph data sets that computes several thousand graph features from any given network. Taking inspiration from hctsa, hcga offers a suite of statistical learning and data analysis tools for the automated identification and selection of important and interpretable features underpinning the characterisation of graph data sets. We show that hcga outperforms other methodologies (including deep learning) on supervised classification tasks on benchmark data sets whilst retaining the interpretability of network features, which we exemplify on a data set of neuronal morphology images.

Graph dimensionality. Dimension is a fundamental property of objects and of the space in which they are embedded. Yet ideal notions of dimension, as in Euclidean spaces, do not always translate to physical spaces, which can be constrained by boundaries and distorted by inhomogeneities, or to intrinsically discrete systems such as networks. Deviating from approaches based on fractals, we present here a new framework to define intrinsic notions of dimension on networks: the relative, local and global dimension. We showcase our method on various physical systems.
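
As a minimal sketch of the feature-based pipeline that hcga implements at scale (hcga computes thousands of features; only a handful are shown here, on synthetic graphs rather than neuronal morphologies), one might do something like the following with networkx and scikit-learn:

```python
# Feature-based graph classification: compute interpretable graph
# features, then feed them to a standard classifier. Illustrative only.
import networkx as nx
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def graph_features(G):
    # A few classic topological/statistical features of a graph.
    degrees = [d for _, d in G.degree()]
    return [
        G.number_of_nodes(),
        G.number_of_edges(),
        np.mean(degrees),
        nx.average_clustering(G),
        nx.density(G),
    ]

# Two synthetic classes: Erdos-Renyi vs Barabasi-Albert graphs.
rng = np.random.default_rng(2)
graphs, labels = [], []
for _ in range(50):
    n = int(rng.integers(30, 60))
    graphs.append(nx.erdos_renyi_graph(n, 0.15, seed=int(rng.integers(10**6))))
    labels.append(0)
    graphs.append(nx.barabasi_albert_graph(n, 3, seed=int(rng.integers(10**6))))
    labels.append(1)

X = np.array([graph_features(G) for G in graphs])
y = np.array(labels)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

# Feature importances keep the pipeline interpretable, mirroring hcga's
# automated feature-selection step.
clf.fit(X, y)
names = ["n_nodes", "n_edges", "mean_degree", "clustering", "density"]
for name, imp in zip(names, clf.feature_importances_):
    print(f"{name}: {imp:.2f}")
```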

Seminar · Neuroscience

Sounds Familiar? Statistical Learning of Acoustic Environments

David McAlpine
Macquarie University, Sydney, Australia
Feb 21, 2021
ePoster

Statistical learning yields generalization and naturalistic behaviors in transitive inference

Samuel Lippl, Larry Abbott, Kenneth Kay, Greg Jensen, Vincent Ferrera

COSYNE 2023

ePoster

Pupil dynamics and hippocampal representations reveal fast statistical learning in mice

Adedamola Onih, Abdullah Aziz, Athena Akrami

COSYNE 2025

ePoster

Hippocampus is necessary for implicit statistical learning: Insights from mouse and human pupillometry

Adedamola Onih, Athena Akrami

FENS Forum 2024

ePoster

Network mechanisms for statistical learning and place field formation in the hippocampus

Margaret Lane, Merkourios Simos, James Priestley

FENS Forum 2024

ePoster

Statistical learning in auditory cortex and hippocampus

Xing Xiao, Livia de Hoz

FENS Forum 2024

ePoster

Statistical learning in acute and chronic pain

Jakub Onysk

Neuromatch 5