ePoster

Hida-Matern Gaussian Processes

Matthew Dowling, Piotr Sokol, Memming Park
COSYNE 2022 (2022)
Lisbon, Portugal
Presented: Mar 18, 2022

Abstract

Bayesian data analysis using probabilistic modeling explicitly encodes one's a priori scientific beliefs as structured prior distributions. However, scientific inference often requires that some of our prior knowledge be embedded only in a 'broad sense'; take, for example, inferring the latent dynamics underlying neural population activity. Nonparametric approaches such as Gaussian processes (GPs) are highly flexible and expressive, and allow us to encode broad assumptions such as periodicity, stationarity, and smoothness. Although these properties make GPs attractive for neural data analysis, their computational cost, which scales cubically with the number of data points, limits their applicability to large-scale problems. We introduce the Hida-Matern (HM) kernel, a basis over all stationary GP covariance functions. We show how to leverage the GP state-space model (SSM) representation to achieve fast inference for common probabilistic models in neuroscience that embed prior beliefs via GP priors. We showcase the strengths of the SSM formulation of GPs in latent variable modeling (LVM) of neural dynamics, Poisson regression, and intensity estimation for neural point processes. In addition, we show how the GP-SSM representation links popular methods such as GPFA and vLGP to linear SSMs, implying limitations on the beliefs about neural dynamics that one can specify with these models a priori: any model specifying latent trajectories under a stationary GP prior is ill-suited for inferring nonlinear neural dynamics.
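To make the fast-inference argument concrete, here is a minimal sketch of the GP-as-SSM idea, assuming a Matern-3/2 kernel as a simple stand-in (the Hida-Matern construction generalizes this): written in its state-space form, GP regression reduces to Kalman filtering, so inference costs O(n) rather than O(n^3). The function names, parameters, and toy data below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import expm

def matern32_ssm(lengthscale, variance):
    """State-space form of a Matern-3/2 GP; state x(t) = [f(t), f'(t)]."""
    lam = np.sqrt(3.0) / lengthscale
    F = np.array([[0.0, 1.0],
                  [-lam**2, -2.0 * lam]])          # SDE drift matrix
    Pinf = np.diag([variance, lam**2 * variance])  # stationary state covariance
    H = np.array([[1.0, 0.0]])                     # we observe f(t) only
    return F, Pinf, H

def kalman_gp_regression(t, y, lengthscale=1.0, variance=1.0, noise_var=0.1):
    """Filtering means of GP regression in O(n) via a Kalman filter."""
    F, Pinf, H = matern32_ssm(lengthscale, variance)
    m, P = np.zeros(2), Pinf.copy()                # start from the stationary prior
    means = []
    for k in range(len(t)):
        if k > 0:                                  # predict to the next time point
            A = expm(F * (t[k] - t[k - 1]))        # exact discrete-time transition
            Q = Pinf - A @ Pinf @ A.T              # matching process-noise covariance
            m, P = A @ m, A @ P @ A.T + Q
        S = float(H @ P @ H.T) + noise_var         # innovation variance (scalar)
        K = (P @ H.T) / S                          # Kalman gain, shape (2, 1)
        m = m + K[:, 0] * (y[k] - float(H @ m))    # measurement update of the mean
        P = P - S * (K @ K.T)                      # ...and of the covariance
        means.append(m[0])
    return np.array(means)

# Toy usage: each observation costs O(1) work, so n points cost O(n) overall.
t = np.linspace(0.0, 10.0, 500)
y = np.sin(t) + 0.3 * np.random.randn(t.size)
f_hat = kalman_gp_regression(t, y, lengthscale=1.5, variance=1.0, noise_var=0.09)
```

A smoothing (backward) pass, omitted here, would recover the full GP posterior at the same linear cost; the abstract's point is that any stationary kernel admitting such an SSM form inherits this speedup.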

Unique ID: cosyne-22/hidamatern-gaussian-processes-4b7ad5aa