ePoster

Diffusion Tempering Improves Parameter Estimation with Probabilistic Integrators for Hodgkin-Huxley Models

Jonas Beck, Nathanael Bosch, Michael Deistler, Kyra Khadim, Jakob Macke, Philipp Hennig, Philipp Berens
Bernstein Conference 2024 (2024)
Goethe University, Frankfurt, Germany

Abstract

In neuroscience, ordinary differential equations (ODEs) are widely used to simulate the dynamics of neural systems, but identifying parameters that explain experimental measurements is challenging. In particular, while these models are differentiable and would allow for gradient-based parameter optimization, the nonlinear dynamics inherent in neural systems often lead to many local minima and extreme sensitivity to initial conditions. To address this, we improve a recently developed probabilistic integration method that enables gradient-based parameter inference in ODEs, known as "Physics-Enhanced Regression for Initial Value Problems", or Fenrir for short [1]. Fenrir estimates a probabilistic numerical proxy of the marginal likelihood that takes both the numerical uncertainty of the ODE solver and that of the data into account, making its parameter estimates more robust than classical alternatives. Building on this framework, we develop diffusion tempering, a novel regularization technique for optimizing the Fenrir marginal likelihood. It is based on the previous observation that probabilistic ODE solvers can effectively "smooth out" the loss surface of ODEs [1]. Diffusion tempering exploits this ability by iteratively reducing a noise parameter of the probabilistic integrator across several consecutive optimization problems. Initially, this yields a very smooth loss surface, which offers poor fits to data but lets the optimizer avoid local minima. By successively solving less and less smooth problems, each informed by the previous parameter estimate, diffusion tempering more reliably converges in the vicinity of the true parameters. We demonstrate that this produces better parameter estimates than both classical least-squares regression and the original Fenrir method for popular single- and small multi-compartment models of cortical and thalamic neurons [2] of increasing complexity. Diffusion tempering converges more consistently and returns parameter estimates that retain electrophysiological features of the data much better, even in regimes in which gradient-based parameter inference is challenging for classical methods.
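The tempering loop described above can be sketched in a few lines. The sketch below is a minimal toy illustration, not the authors' implementation: the oscillatory quadratic `loss_grad` is a hypothetical stand-in for the gradient of Fenrir's marginal-likelihood proxy, with `sigma` playing the role of the integrator's noise parameter (larger `sigma` damps the rugged term, mimicking how the probabilistic solver smooths the loss surface).

```python
import math

def loss_grad(theta, sigma, theta_true=3.0):
    # Hypothetical toy loss gradient: quadratic pull toward the "true"
    # parameter plus an oscillatory term creating local minima; the
    # noise level sigma shrinks the rugged term, smoothing the surface.
    return 2.0 * (theta - theta_true) - 8.0 * math.sin(8.0 * theta) / (1.0 + sigma)

def local_minimize(theta, sigma, lr=0.01, steps=4000):
    # Plain gradient descent to a nearby local minimum at fixed sigma.
    for _ in range(steps):
        theta -= lr * loss_grad(theta, sigma)
    return theta

def diffusion_tempering(theta0, sigmas):
    # Solve a sequence of progressively less-smoothed problems,
    # warm-starting each from the previous parameter estimate.
    theta = theta0
    for sigma in sigmas:
        theta = local_minimize(theta, sigma)
    return theta

# Large-to-small noise schedule: very smooth first, un-smoothed last.
schedule = [100.0, 10.0, 1.0, 0.1, 0.0]
estimate = diffusion_tempering(theta0=0.0, sigmas=schedule)
```

On this toy surface, direct optimization of the un-smoothed loss from a distant start tends to stall in a spurious local minimum, while the tempered schedule tracks the broad basin and finishes near the true parameter.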

Unique ID: bernstein-24/diffusion-tempering-improves-parameter-a8c3b60d