Statistical Learning
Flavia Mancini
1 Postdoc: Simulating and modelling the neural dynamics involved in statistical/aversive learning and homeostatic/pain regulation, with the aim of developing new projects. 1 Research Assistant: Conducting behavioral and neuroimaging experiments.
Nicola Torelli
Research fellowship opportunity in 'Methods and models for artificial intelligence and statistical and machine learning for the innovation of business processes and decisions' (Metodi e modelli per l'intelligenza artificiale e l'apprendimento statistico e automatico per l'innovazione dei processi e delle decisioni aziendali). The fellowship runs for one year, with the possibility of extension. The grant is held at the Department of Economics, Business, Mathematics and Statistics (DEAMS) of the University of Trieste (Italy), under the scientific coordination of Prof. Nicola Torelli.
Tiago de Paula Peixoto
Call for 5 open-rank positions at IT:U, a new public university recently founded in Austria. One of its focus areas is Theoretical Foundations of Data Science, engaging with areas such as the mathematics of data science and statistical learning, as well as specific topics like topological data analysis and causality. "Data Science" is construed very broadly here. The positions are attractive and include permanent (i.e. recurring) funding for a number of PhD students and postdocs, depending on rank.
Learning representations of specifics and generalities over time
There is a fundamental tension between storing discrete traces of individual experiences, which allows recall of particular moments in our past without interference, and extracting regularities across these experiences, which supports generalization and prediction in similar situations in the future. One influential proposal for how the brain resolves this tension is that it separates the processes anatomically into Complementary Learning Systems, with the hippocampus rapidly encoding individual episodes and the neocortex slowly extracting regularities over days, months, and years. But this does not explain our ability to learn and generalize from new regularities in our environment quickly, often within minutes. We have put forward a neural network model of the hippocampus that suggests that the hippocampus itself may contain complementary learning systems, with one pathway specializing in the rapid learning of regularities and a separate pathway handling the region’s classic episodic memory functions. This proposal has broad implications for how we learn and represent novel information of specific and generalized types, which we test across statistical learning, inference, and category learning paradigms. We also explore how this system interacts with slower-learning neocortical memory systems, with empirical and modeling investigations into how the hippocampus shapes neocortical representations during sleep. Together, the work helps us understand how structured information in our environment is initially encoded and how it then transforms over time.
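As a concrete illustration of the dual-pathway idea, the minimal Python sketch below (a hypothetical toy, not the authors' actual model) pairs a store of verbatim episodic traces with a second pathway that rapidly integrates over experiences to estimate their shared structure:

```python
# A minimal, hypothetical sketch of the idea above (not the authors' model):
# one store keeps verbatim episodic traces, while a second pathway with a
# fast incremental rule extracts the regularity shared across experiences.
import numpy as np

rng = np.random.default_rng(0)
prototype = rng.normal(size=20)  # structure shared across episodes
episodes = [prototype + 0.3 * rng.normal(size=20) for _ in range(50)]

episodic_store = []        # discrete, pattern-separated traces (exact recall)
regularity = np.zeros(20)  # rapidly integrating pathway (statistical learning)
lr = 0.2                   # fast learning rate: regularities emerge quickly

for x in episodes:
    episodic_store.append(x.copy())      # store the specific experience as-is
    regularity += lr * (x - regularity)  # running estimate of shared structure

print("match to true regularity:", np.corrcoef(regularity, prototype)[0, 1])
print("episode 7 recalled exactly:", np.allclose(episodic_store[7], episodes[7]))
```

Even this toy version shows how a single system could, in principle, support both exact recall of specifics and rapid generalization from regularities, the tension the proposal is designed to resolve.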
A Better Method to Quantify Perceptual Thresholds: Parameter-Free, Model-Free, Adaptive Procedures
The 'quantification' of perception is arguably one of the most important and most difficult aspects of the study of perception. This is particularly true in visual perception, where estimating the perceptual threshold is a pillar of the experimental process. Choosing the correct adaptive psychometric procedure, and selecting the proper parameters, is a difficult but key aspect of the experimental protocol. For instance, Bayesian methods such as QUEST require the a priori choice of a family of functions (e.g. Gaussian), which is rarely known before the experiment, as well as the specification of multiple parameters. Importantly, an ill-fitting function or poorly chosen parameters will induce costly mistakes and errors in the experimental process. In this talk we discuss existing methods and introduce a new adaptive procedure, named ZOOM (Zooming Optimistic Optimization of Models), based on recent advances in optimization and statistical learning. Compared to existing approaches, ZOOM is completely parameter-free and model-free, i.e. it can be applied to any psychometric problem. Moreover, ZOOM's internal parameters are self-tuned, so they do not need to be chosen manually using heuristics (e.g. the step size in the staircase method), preventing further errors. Finally, ZOOM is grounded in state-of-the-art optimization theory, providing strong mathematical guarantees that many of its alternatives lack, while being the most accurate and robust in real-life conditions. In our experiments and simulations, ZOOM was found to be significantly better than its alternatives, in particular for difficult psychometric functions or when parameters were not properly chosen. ZOOM is open source, and its implementation is freely available online. Given these advantages and its ease of use, we argue that ZOOM can improve many psychophysics experiments.
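For context, the sketch below shows the kind of procedure the abstract contrasts ZOOM with: a classic 1-up/2-down staircase with a hand-picked step size. This is illustrative only, not ZOOM's implementation (ZOOM's code is available from the authors); the simulated observer, the threshold value, and the step size are all arbitrary choices made for the demo.

```python
# Illustrative 1-up/2-down staircase (NOT ZOOM): the step size below is a
# hand-tuned heuristic, precisely the kind of parameter ZOOM eliminates.
import numpy as np

rng = np.random.default_rng(1)
true_threshold = 0.4  # ground truth for the simulated observer (arbitrary)

def observer(intensity):
    """Simulated observer with a logistic psychometric function (assumed)."""
    p_correct = 1.0 / (1.0 + np.exp(-(intensity - true_threshold) / 0.05))
    return rng.random() < p_correct

intensity, step = 1.0, 0.1   # step size must be chosen by hand: the pitfall
reversals, last_dir, consecutive, trials = [], 0, 0, 0

while len(reversals) < 12 and trials < 400:
    trials += 1
    move = 0
    if observer(intensity):
        consecutive += 1
        if consecutive == 2:   # 2-down: two correct in a row, make it harder
            consecutive, move = 0, -1
    else:
        consecutive, move = 0, +1  # 1-up: any error, make it easier
    if move != 0:
        if last_dir != 0 and move != last_dir:
            reversals.append(intensity)  # direction change: record a reversal
        last_dir = move
        intensity = max(0.0, intensity + move * step)

# 1-up/2-down staircases converge near the 70.7%-correct point
print("estimated threshold:", np.mean(reversals[-6:]))
```

Pick the step size badly relative to the (unknown) psychometric function and the estimate degrades, which is exactly the failure mode the abstract highlights.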
Applying Structural Alignment Theory to Early Verb Learning
Learning verbs is both difficult and critical to acquiring one's native language. Children appear to benefit from seeing multiple events and comparing them to one another, and structural alignment (SA) theory provides a good theoretical framework to guide research into how preschool children may compare events as they learn new verbs. The talk covers six studies of early verb learning that use eye-tracking as well as other behavioral (pointing) procedures, and that test key predictions of SA theory, including that seeing similar examples before more varied examples helps observers learn how to compare (progressive alignment), and that very low alignability between events is a cue that those events should be ignored. Whether and how statistical learning may also be at work will be considered.
Rethinking Statistical Learning
Learning the structure and investigating the geometry of complex networks
Networks are widely used as mathematical models of complex systems across many scientific disciplines, in particular within neuroscience. In this talk, we introduce two aspects of our collaborative research: (1) machine learning and networks, and (2) graph dimensionality.

(1) Machine learning and networks. Decades of work have produced a vast corpus of research characterising the topological, combinatorial, statistical, and spectral properties of graphs. Each graph property can be thought of as a feature that captures important (and sometimes overlapping) characteristics of a network. We have developed hcga, a framework for highly comparative analysis of graph data sets that computes several thousand graph features from any given network. Taking inspiration from hctsa, hcga offers a suite of statistical learning and data analysis tools for the automated identification and selection of important, interpretable features underpinning the characterisation of graph data sets. We show that hcga outperforms other methodologies (including deep learning) on supervised classification tasks on benchmark data sets whilst retaining the interpretability of network features, which we exemplify on a data set of neuronal morphology images.

(2) Graph dimensionality. Dimension is a fundamental property of objects and of the space in which they are embedded. Yet ideal notions of dimension, as in Euclidean spaces, do not always translate to physical spaces, which can be constrained by boundaries and distorted by inhomogeneities, or to intrinsically discrete systems such as networks. Deviating from approaches based on fractals, we present a new framework for defining intrinsic notions of dimension on networks: the relative, local, and global dimension. We showcase our method on various physical systems.
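To make the feature-based approach concrete, here is a minimal sketch of interpretable graph classification. This is not hcga's actual API: it hand-rolls a handful of networkx features on a toy data set, whereas hcga computes thousands of features; the two graph classes and all parameter values are invented for the demo.

```python
# A minimal sketch of feature-based graph classification in the spirit of
# hcga (NOT hcga's API): compute interpretable per-graph features, then
# train an off-the-shelf classifier on the resulting feature matrix.
import networkx as nx
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def graph_features(G):
    """A tiny, hand-picked feature vector; hcga uses thousands of features."""
    degrees = [d for _, d in G.degree()]
    return [
        G.number_of_nodes(),
        G.number_of_edges(),
        nx.density(G),
        nx.average_clustering(G),
        nx.degree_assortativity_coefficient(G),
        np.mean(degrees),
        np.max(degrees),
    ]

# Toy data set: two classes of random graphs with different wiring rules
graphs = [nx.erdos_renyi_graph(50, 0.1, seed=i) for i in range(50)] + \
         [nx.barabasi_albert_graph(50, 3, seed=i) for i in range(50)]
labels = [0] * 50 + [1] * 50

X = np.array([graph_features(G) for G in graphs])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```

Because each column of the feature matrix is a named network property, the trained model's feature importances indicate which structural characteristics drive the classification, the interpretability advantage the abstract emphasises over deep learning.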
Sounds Familiar? Statistical Learning of Acoustic Environments
Statistical learning yields generalization and naturalistic behaviors in transitive inference
COSYNE 2023
Pupil dynamics and hippocampal representations reveal fast statistical learning in mice
COSYNE 2025
Hippocampus is necessary for implicit statistical learning: Insights from mouse and human pupillometry
FENS Forum 2024
Network mechanisms for statistical learning and place field formation in the hippocampus
FENS Forum 2024
Statistical learning in auditory cortex and hippocampus
FENS Forum 2024
Statistical learning in acute and chronic pain
Neuromatch 5