ePoster

On the limits of visual cortical prosthesis resolution

Juan Fuentes and 6 co-authors
FENS Forum 2024 (2024)
Messe Wien Exhibition & Congress Center, Vienna, Austria

Presentation

Date TBA

Abstract

Current approaches to cortical prosthetic vision rely on electrically evoked phosphenes, whose dynamic appearance in the visual field could enable functional sight in acquired blindness. Akin to an input/output dynamic-range problem, the spatiotemporal resolution of phosphene-based vision is constrained by the accuracy with which activity can be modulated in V1 and, simultaneously, by the discriminability of the evoked activity in higher visual areas (i.e. V2 and V4) across different stimulation parameters. However, the high dimensionality of the stimulation parameter space makes it impractical to test stimulation patterns exhaustively in experiments. To overcome these experimental constraints, this study explores in silico the achievable resolution of cortical visual prostheses by combining biophysical modelling with neural decoding. To this end, we develop a neural network model based on biophysical data from the macaque visual cortex and run simulations over a wide range of stimulation parameters, relating the temporal characteristics of stimulation patterns to the spatial resolution achievable by activity propagated from V1 to V4. Propagated information is assessed with a machine-learning decoder that discriminates electrical stimulation parameters from neural responses in the higher visual areas. Our findings quantify how much information neural responses carry about the electrical stimulation parameters and help to elucidate the biophysical limits that cortical visual prostheses could achieve, providing a priori knowledge for experimental research grounded in biophysics and information-theoretic quantities.
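The decoding-based assessment described above can be illustrated with a toy sketch. This is not the authors' model: the "biophysical" responses, the unit count, the stimulation levels, and the nearest-centroid decoder are all placeholder assumptions, used only to show the pipeline of decoding a stimulation parameter from simulated downstream responses and bounding the transmitted information via the decoder's confusion matrix.

```python
import math
import random
from collections import Counter

random.seed(0)

N_UNITS = 20                 # hypothetical number of simulated V4 units
AMPLITUDES = [1, 2, 3, 4]    # hypothetical stimulation parameter levels
NOISE = 1.0                  # assumed response noise (std. dev.)

def simulate_response(amp):
    # Stand-in for a biophysical model: mean firing rates scale with
    # stimulation amplitude, with additive Gaussian noise per unit.
    return [amp * (u % 3 + 1) + random.gauss(0, NOISE) for u in range(N_UNITS)]

# "Training": estimate a response centroid per stimulation level
# (a nearest-centroid classifier stands in for the ML decoder).
train = {a: [simulate_response(a) for _ in range(100)] for a in AMPLITUDES}
centroids = {a: [sum(col) / len(col) for col in zip(*trials)]
             for a, trials in train.items()}

def decode(resp):
    # Assign the response to the nearest centroid (squared Euclidean distance).
    return min(AMPLITUDES,
               key=lambda a: sum((r - c) ** 2
                                 for r, c in zip(resp, centroids[a])))

# "Testing": accumulate the (true, decoded) confusion counts.
n_test = 200
joint = Counter()
for _ in range(n_test):
    a = random.choice(AMPLITUDES)
    joint[(a, decode(simulate_response(a)))] += 1

accuracy = sum(v for (a, d), v in joint.items() if a == d) / n_test

# Mutual information I(true; decoded), estimated from the empirical joint
# distribution -- a lower bound on the information the responses carry
# about the stimulation parameter (at most log2(4) = 2 bits here).
p_a, p_d = Counter(), Counter()
for (a, d), v in joint.items():
    p_a[a] += v / n_test
    p_d[d] += v / n_test
mi = sum((v / n_test) * math.log2((v / n_test) / (p_a[a] * p_d[d]))
         for (a, d), v in joint.items() if v)

print(f"decoding accuracy: {accuracy:.2f}, I(true; decoded) ~ {mi:.2f} bits")
```

In this framing, sweeping the stimulation parameters and re-running the decoder traces out how discriminability in the downstream population limits the usable resolution of the stimulation space.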
