
Bayesian Methods

Discover seminars, jobs, and research tagged with bayesian methods across World Wide.
5 curated items · 4 Positions · 1 Seminar
Updated 3 days ago
Position

Kerstin Bunte

Bernoulli Institute for Mathematics, Computer Science and Artificial Intelligence, Faculty of Science and Engineering, University of Groningen
University of Groningen, The Netherlands
Dec 5, 2025

We offer a postdoctoral researcher position within the Bernoulli Institute for Mathematics, Computer Science and Artificial Intelligence at the University of Groningen, The Netherlands. The position is funded by an NWO Vidi project named “mechanistic machine learning: combining the explanatory power of dynamic models with the predictive power of machine learning”. Artificial Intelligence (AI) and Machine Learning (ML) systems have gained tremendous interest in recent years, demonstrating strong performance on a wide variety of tasks, but typically only when trained on huge amounts of data. Moreover, insight into their decision making is frequently neither available nor required. Domain experts, however, want to know how their data can inform them about the natural processes being measured. We therefore develop transparent and interpretable model- and data-driven hybrid methods, demonstrated in applications in medicine and engineering. As a postdoc, you will work with Kerstin Bunte and her team within the Intelligent Systems group, as well as with a network of interdisciplinary collaborators in the UK and Europe from fields such as Computer Science, Engineering and Applied Mathematics.

Position

I-Chun Lin, PhD

Gatsby Computational Neuroscience Unit, UCL
Dec 5, 2025

The Gatsby Computational Neuroscience Unit is a leading research centre focused on theoretical neuroscience and machine learning. We study (un)supervised and reinforcement learning in brains and machines; inference, coding and neural dynamics; and Bayesian and kernel methods and deep learning, with applications to the analysis of perceptual processing and cognition, neural data, signal and image processing, machine vision, network data and nonparametric hypothesis testing. The Unit provides a unique opportunity for a critical mass of theoreticians to interact closely with one another and with researchers at the Sainsbury Wellcome Centre for Neural Circuits and Behaviour (SWC), the Centre for Computational Statistics and Machine Learning (CSML), related UCL departments such as Computer Science, Statistical Science, Artificial Intelligence, the ELLIS Unit at UCL and Neuroscience, and the nearby Alan Turing and Francis Crick Institutes.

Our PhD programme provides rigorous preparation for a research career. Students complete a 4-year PhD in either machine learning or theoretical/computational neuroscience, with a minor emphasis in the complementary field. Courses in the first year provide a comprehensive introduction to both fields and to systems neuroscience. Students are encouraged to work and interact closely with SWC/CSML researchers to take advantage of this uniquely multidisciplinary research environment.

Seminar · Psychology

A Better Method to Quantify Perceptual Thresholds: Parameter-free, Model-free, Adaptive Procedures

Julien Audiffren
University of Fribourg
Feb 28, 2023

The ‘quantification’ of perception is arguably one of the most important and most difficult aspects of the study of perception. This is particularly true in visual perception, where the evaluation of the perceptual threshold is a pillar of the experimental process. Choosing the right adaptive psychometric procedure, and selecting its parameters properly, is a difficult but key part of the experimental protocol. For instance, Bayesian methods such as QUEST require the a priori choice of a family of functions (e.g. Gaussian), which is rarely known before the experiment, as well as the specification of multiple parameters. Importantly, the choice of an ill-fitting function family or parameters induces costly mistakes and errors in the experimental process. In this talk we discuss existing methods and introduce a new adaptive procedure, ZOOM (Zooming Optimistic Optimization of Models), based on recent advances in optimization and statistical learning, to solve this problem. Compared to existing approaches, ZOOM is completely parameter-free and model-free, i.e. it can be applied to any psychometric problem. Moreover, ZOOM's internal parameters are self-tuned and therefore do not need to be chosen manually using heuristics (e.g. the step size in the Staircase method), preventing further errors. Finally, ZOOM is grounded in state-of-the-art optimization theory, providing strong mathematical guarantees that many of its alternatives lack, while being the most accurate and robust in real-life conditions. In our experiments and simulations, ZOOM was significantly better than its alternatives, in particular for difficult psychometric functions or when parameters were not properly chosen. ZOOM is open source, and its implementation is freely available on the web. Given these advantages and its ease of use, we argue that ZOOM can improve many psychophysics experiments.
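
To make the setup concrete, below is a minimal sketch of a QUEST-style grid-based Bayesian threshold estimator in Python with NumPy, the kind of procedure the abstract contrasts ZOOM against. It is not the ZOOM algorithm; the logistic psychometric family, the fixed slope, the grid prior and the simulated observer are all illustrative assumptions, and they are exactly the sort of a priori choices the talk argues can be avoided.

    # Minimal QUEST-style Bayesian adaptive threshold sketch (illustrative only;
    # this is NOT the ZOOM procedure described in the talk). It shows the kind of
    # parametric assumption -- a logistic psychometric function with a fixed
    # slope, plus a grid prior -- that such methods require up front.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical ground truth, used only to simulate observer responses.
    TRUE_THRESHOLD = 0.42
    SLOPE = 10.0  # assumed slope; in practice rarely known before the experiment

    def psychometric(intensity, threshold, slope=SLOPE):
        """Assumed logistic psychometric function: P(correct | intensity)."""
        return 1.0 / (1.0 + np.exp(-slope * (intensity - threshold)))

    # Grid prior over candidate thresholds (another experimenter choice).
    grid = np.linspace(0.0, 1.0, 201)
    posterior = np.ones_like(grid) / grid.size  # uniform prior

    for trial in range(60):
        # Place the next stimulus at the current posterior mean of the threshold.
        stimulus = float(np.sum(grid * posterior))

        # Simulated observer response (Bernoulli draw from the true function).
        correct = rng.random() < psychometric(stimulus, TRUE_THRESHOLD)

        # Bayes update: likelihood of the observed response for each candidate.
        likelihood = psychometric(stimulus, grid)
        if not correct:
            likelihood = 1.0 - likelihood
        posterior *= likelihood
        posterior /= posterior.sum()

    estimate = float(np.sum(grid * posterior))
    print(f"estimated threshold ~ {estimate:.3f} (true {TRUE_THRESHOLD})")

A Staircase method would replace the Bayesian update with fixed step-size rules, but either way the experimenter must commit to a function family, parameters or step sizes before seeing any data, which is the gap the talk's parameter-free, model-free procedure aims to close.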