
Reality

Topic spotlight · World Wide


Discover seminars, jobs, and research tagged with reality across World Wide.
60 curated items · 47 Seminars · 13 ePosters
Updated 7 months ago
Seminar · Neuroscience · Recording

Multisensory perception in the metaverse

Polly Dalton
Royal Holloway, University of London
May 7, 2025
Seminar · Neuroscience

Learning produces a hippocampal cognitive map in the form of an orthogonalized state machine

Nelson Spruston
Janelia, Ashburn, USA
Mar 5, 2024

Cognitive maps confer animals with flexible intelligence by representing spatial, temporal, and abstract relationships that can be used to shape thought, planning, and behavior. Cognitive maps have been observed in the hippocampus, but their algorithmic form and the processes by which they are learned remain obscure. Here, we employed large-scale, longitudinal two-photon calcium imaging to record activity from thousands of neurons in the CA1 region of the hippocampus while mice learned to efficiently collect rewards from two subtly different versions of linear tracks in virtual reality. The results provide a detailed view of the formation of a cognitive map in the hippocampus. Throughout learning, both the animal behavior and hippocampal neural activity progressed through multiple intermediate stages, gradually revealing improved task representation that mirrored improved behavioral efficiency. The learning process led to progressive decorrelations in initially similar hippocampal neural activity within and across tracks, ultimately resulting in orthogonalized representations resembling a state machine capturing the inherent structure of the task. We show that a Hidden Markov Model (HMM) and a biologically plausible recurrent neural network trained using Hebbian learning can both capture core aspects of the learning dynamics and the orthogonalized representational structure in neural activity. In contrast, we show that gradient-based learning of sequence models such as Long Short-Term Memory networks (LSTMs) and Transformers does not naturally produce such orthogonalized representations. We further demonstrate that mice exhibited adaptive behavior in novel task settings, with neural activity reflecting flexible deployment of the state machine. These findings shed light on the mathematical form of cognitive maps, the learning rules that sculpt them, and the algorithms that promote adaptive behavior in animals.
The work thus charts a course toward a deeper understanding of biological intelligence and offers insights toward developing more robust learning algorithms in artificial intelligence.
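As a rough illustration of what "orthogonalized" means here (a toy sketch, not the authors' model; the track and landmark names are invented): population vectors for the same landmark on two similar tracks can overlap almost completely in sensory space, while a learned state machine assigns them fully orthogonal one-hot latent states.

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

# Raw sensory codes at the same landmark on two subtly different tracks:
# nearly parallel vectors (high overlap).
sensory = {"track_A": [1.0, 0.9, 0.1], "track_B": [0.9, 1.0, 0.1]}

# An orthogonalized state machine instead gives each (track, landmark)
# its own latent state -- one-hot codes with zero overlap.
latent = {"track_A": [1.0, 0.0], "track_B": [0.0, 1.0]}

sim_sensory = cosine(sensory["track_A"], sensory["track_B"])  # ~0.99
sim_latent = cosine(latent["track_A"], latent["track_B"])     # 0.0
```

Decorrelation during learning, in this picture, is the gradual movement of the population code from the first similarity regime toward the second.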

Seminar · Psychology

Conversations with Caves? Understanding the role of visual psychological phenomena in Upper Palaeolithic cave art making

Izzy Wisher
Aarhus University
Feb 25, 2024

How central were psychological features deriving from our visual systems to the early evolution of human visual culture? Art making emerged deep in our evolutionary history, with the earliest art appearing over 100,000 years ago as geometric patterns etched on fragments of ochre and shell, and figurative representations of prey animals flourishing in the Upper Palaeolithic (c. 40,000 – 15,000 years ago). The latter reflects a complex visual process: the ability to represent something that exists in the real world as a flat, two-dimensional image. In this presentation, I argue that pareidolia – the psychological phenomenon of seeing meaningful forms in random patterns, such as perceiving faces in clouds – was a fundamental process that facilitated the emergence of figurative representation. The influence of pareidolia has often been anecdotally observed in Upper Palaeolithic art, particularly cave art, where the topographic features of the cave wall were incorporated into animal depictions. Using novel virtual reality (VR) light simulations, I tested three hypotheses relating to pareidolia in Upper Palaeolithic cave art in the caves of Las Monedas and La Pasiega (Cantabria, Spain). To evaluate this further, I also developed an interdisciplinary VR eye-tracking experiment, where participants were immersed in virtual caves based on the cave of El Castillo (Cantabria, Spain). Together, these case studies suggest that pareidolia was an intrinsic part of artist-cave interactions (‘conversations’) that influenced the form and placement of figurative depictions in the cave. This has broader implications for conceiving of the role of visual psychological phenomena in the emergence and development of figurative art in the Palaeolithic.

Seminar · Neuroscience · Recording

Visual-vestibular cue comparison for perception of environmental stationarity

Paul MacNeilage
University of Nevada, Reno
Oct 25, 2023


Seminar · Neuroscience · Recording

Internal representation of musical rhythm: transformation from sound to periodic beat

Tomas Lenc
Institute of Neuroscience, UCLouvain, Belgium
May 30, 2023

When listening to music, humans readily perceive and move along with a periodic beat. Critically, perception of a periodic beat is commonly elicited by rhythmic stimuli with physical features arranged in a way that is not strictly periodic. Hence, beat perception must capitalize on mechanisms that transform stimulus features into a temporally recurrent format with emphasized beat periodicity. Here, I will present a line of work that aims to clarify the nature and neural basis of this transformation. In these studies, electrophysiological activity was recorded as participants listened to rhythms known to induce perception of a consistent beat across healthy Western adults. The results show that the human brain selectively emphasizes beat representation when it is not acoustically prominent in the stimulus, and this transformation (i) can be captured non-invasively using surface EEG in adult participants, (ii) is already in place in 5- to 6-month-old infants, and (iii) cannot be fully explained by subcortical auditory nonlinearities. Moreover, as revealed by human intracerebral recordings, a prominent beat representation emerges already in the primary auditory cortex. Finally, electrophysiological recordings from the auditory cortex of a rhesus monkey show a significant enhancement of beat periodicities in this area, similar to humans. Taken together, these findings indicate an early, general auditory cortical stage of processing by which rhythmic inputs are rendered more temporally recurrent than they are in reality. Already present in non-human primates and human infants, this "periodized" default format could then be shaped by higher-level associative sensory-motor areas and guide movement in individuals with strongly coupled auditory and motor systems. 
Together, this highlights the multiplicity of neural processes supporting coordinated musical behaviors widely observed across human cultures.

The experiments herein include: a motor timing task comparing the effects of movement vs non-movement with and without feedback (Exp. 1A & 1B), a transcranial magnetic stimulation (TMS) study on the role of the supplementary motor area (SMA) in transforming temporal information (Exp. 2), and a perceptual timing task investigating the effect of noisy movement on time perception with both visual and auditory modalities (Exp. 3A & 3B). Together, the results of these studies support the Bayesian cue combination framework, in that: movement improves the precision of time perception not only in perceptual timing tasks but also in motor timing tasks (Exp. 1A & 1B), stimulating the SMA appears to disrupt the transformation of temporal information (Exp. 2), and when movement becomes unreliable or noisy there is no longer an improvement in the precision of time perception (Exp. 3A & 3B). Although there is support for the proposed framework, more studies (e.g., fMRI, TMS, EEG) need to be conducted in order to better understand where and how this may be instantiated in the brain; however, this work provides a starting point for better understanding the intrinsic connection between time and movement.
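The quantitative core of the Bayesian cue combination framework invoked here is precision weighting: independent Gaussian cues combine so that the fused variance is the inverse of the summed precisions. A minimal sketch with invented noise values (illustrative only, not numbers from the talk):

```python
def combined_sigma(sigmas):
    """Std. dev. of the optimal fusion of independent Gaussian cues:
    1 / sigma_combined^2 = sum_i 1 / sigma_i^2."""
    return sum(1.0 / s ** 2 for s in sigmas) ** -0.5

sigma_auditory = 0.10       # hypothetical timing noise from sound alone (s)
sigma_move_good = 0.10      # an equally reliable movement-based estimate
sigma_move_noisy = 1.00     # movement rendered unreliable/noisy

with_good_movement = combined_sigma([sigma_auditory, sigma_move_good])
with_noisy_movement = combined_sigma([sigma_auditory, sigma_move_noisy])
# A reliable movement cue cuts sigma by a factor of sqrt(2);
# a noisy one leaves precision almost unchanged -- the Exp. 3A & 3B pattern.
```

This is the sense in which movement improves timing precision only while the movement cue itself remains reliable.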

Seminar · Neuroscience

The Geometry of Decision-Making

Iain Couzin
University of Konstanz, Germany
May 23, 2023

Running, swimming, or flying through the world, animals are constantly making decisions while on the move—decisions that allow them to choose where to eat, where to hide, and with whom to associate. Despite this, most studies have considered only the outcome of, and the time taken to make, decisions. Motion is, however, crucial in terms of how space is represented by organisms during spatial decision-making. Employing a range of new technologies, including automated tracking, computational reconstruction of sensory information, and immersive ‘holographic’ virtual reality (VR) for animals, in experiments with fruit flies, locusts and zebrafish (representing aerial, terrestrial and aquatic locomotion, respectively), I will demonstrate that this time-varying representation results in the emergence of new and fundamental geometric principles that considerably impact decision-making. Specifically, we find that the brain spontaneously reduces multi-choice decisions into a series of abrupt (‘critical’) binary decisions in space-time, a process that repeats until only one option—the one ultimately selected by the individual—remains. Due to the critical nature of these transitions (and the corresponding increase in ‘susceptibility’), even noisy brains are extremely sensitive to very small differences between the remaining options (e.g., a very small difference in neuronal activity being in “favor” of one option) near these locations in space-time. This mechanism facilitates highly effective decision-making, and is shown to be robust both to the number of options available, and to context, such as whether options are static (e.g. refuges) or mobile (e.g. other animals). In addition, we find evidence that the same geometric principles of decision-making occur across scales of biological organisation, from neural dynamics to animal collectives, suggesting they are fundamental features of spatiotemporal computation.

Seminar · Neuroscience · Recording

Are place cells just memory cells? Probably yes

Stefano Fusi
Columbia University, New York
Mar 21, 2023

Neurons in the rodent hippocampus appear to encode the position of the animal in physical space during movement. Individual “place cells” fire in restricted sub-regions of an environment, a feature often taken as evidence that the hippocampus encodes a map of space that subserves navigation. But these same neurons exhibit complex responses to many other variables that defy explanation by position alone, and the hippocampus is known to be more broadly critical for memory formation. Here we elaborate and test a theory of hippocampal coding which produces place cells as a general consequence of efficient memory coding. We constructed neural networks that actively exploit the correlations between memories in order to learn compressed representations of experience. Place cells readily emerged in the trained model, due to the correlations in sensory input between experiences at nearby locations. Notably, these properties were highly sensitive to the compressibility of the sensory environment, with place field size and population coding level in dynamic opposition to optimally encode the correlations between experiences. The effects of learning were also strongly biphasic: nearby locations are represented more similarly following training, while locations with intermediate similarity become increasingly decorrelated, both distance-dependent effects that scaled with the compressibility of the input features. Using virtual reality and 2-photon functional calcium imaging in head-fixed mice, we recorded the simultaneous activity of thousands of hippocampal neurons during virtual exploration to test these predictions. Varying the compressibility of sensory information in the environment produced systematic changes in place cell properties that reflected the changing input statistics, consistent with the theory. We similarly identified representational plasticity during learning, which produced a distance-dependent exchange between compression and pattern separation.
These results motivate a more domain-general interpretation of hippocampal computation, one that is naturally compatible with earlier theories on the circuit's importance for episodic memory formation. Work done in collaboration with James Priestley, Lorenzo Posani, Marcus Benna, Attila Losonczy.

Seminar · Neuroscience

A specialized role for entorhinal attractor dynamics in combining path integration and landmarks during navigation

Malcolm Campbell
Harvard
Mar 8, 2023

During navigation, animals estimate their position using path integration and landmarks. In a series of two studies, we used virtual reality and electrophysiology to dissect how these inputs combine to generate the brain’s spatial representations. In the first study (Campbell et al., 2018), we focused on the medial entorhinal cortex (MEC) and its set of navigationally-relevant cell types, including grid cells, border cells, and speed cells. We discovered that attractor dynamics could explain an array of initially puzzling MEC responses to virtual reality manipulations. This theoretical framework successfully predicted both MEC grid cell responses to additional virtual reality manipulations, as well as mouse behavior in a virtual path integration task. In the second study (Campbell*, Attinger* et al., 2021), we asked whether these principles generalize to other navigationally-relevant brain regions. We used Neuropixels probes to record thousands of neurons from MEC, primary visual cortex (V1), and retrosplenial cortex (RSC). In contrast to the prevailing view that “everything is everywhere all at once,” we identified a unique population of MEC neurons, overlapping with grid cells, that became active with striking spatial periodicity while head-fixed mice ran on a treadmill in darkness. These neurons exhibited unique cue-integration properties compared to other MEC, V1, or RSC neurons: they remapped more readily in response to conflicts between path integration and landmarks; they coded position prospectively as opposed to retrospectively; they upweighted path integration relative to landmarks in conditions of low visual contrast; and as a population, they exhibited a lower-dimensional activity structure. Based on these results, our current view is that MEC attractor dynamics play a privileged role in resolving conflicts between path integration and landmarks during navigation. 
Future work should include carefully designed causal manipulations to rigorously test this idea, and expand the theoretical framework to incorporate notions of uncertainty and optimality.

Seminar · Neuroscience · Recording

Does subjective time interact with the heart rate?

Saeedeh Sadegh
Cornell University, New York
Jan 24, 2023

Decades of research have investigated the relationship between perception of time and heart rate, with often mixed results. In search of such a relationship, I will present my journey across two projects: from time perception in the realistic VR experience of crowded subway trips on the order of minutes (project 1), to the perceived duration of sub-second white noise tones (project 2). Heart rate had multiple concurrent relationships with subjective temporal distortions for the sub-second tones, while the effects were lacking or weak for the supra-minute subway trips. What does the heart have to do with sub-second time perception? We addressed this question with a cardiac drift-diffusion model, demonstrating the sensory accumulation of temporal evidence as a function of heart rate.
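The model class named at the end, a drift-diffusion accumulator whose drift is tied to heart rate, can be sketched in a few lines (a generic illustration with invented parameters, not the fitted model from the talk):

```python
import random

def first_passage_time(drift, noise=1.0, threshold=10.0, dt=0.01, rng=None):
    """Simulate one trial: time for a noisy accumulator of temporal
    evidence to reach the decision threshold."""
    rng = rng or random.Random()
    x, t = 0.0, 0.0
    while x < threshold:
        x += drift * dt + noise * (dt ** 0.5) * rng.gauss(0.0, 1.0)
        t += dt
    return t

def mean_criterion_time(heart_rate_bpm, n=200, seed=0):
    """Toy mapping: drift scales with heart rate (beats/s), so a faster
    heart reaches the same evidence criterion in less physical time."""
    rng = random.Random(seed)
    drift = heart_rate_bpm / 60.0
    return sum(first_passage_time(drift, rng=rng) for _ in range(n)) / n
```

With these numbers the expected criterion time is threshold/drift (about 10 s at 60 bpm, about 6.7 s at 90 bpm), which is the sense in which heart rate can modulate the accumulation of temporal evidence.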

Seminar · Neuroscience · Recording

Theories of consciousness: beyond the first/higher-order distinction

Jonathan Birch
London School of Economics and Political Science
Sep 8, 2022

Theories of consciousness are commonly grouped into "first-order" and "higher-order" families. As conventional wisdom has it, many more animals are likely to be conscious if a first-order theory is correct. But two recent developments have put pressure on the first/higher-order distinction. One is the argument (from Shea and Frith) that an effective global workspace mechanism must involve a form of metacognition. The second is Lau's "perceptual reality monitoring" (PRM) theory, a member of the "higher-order" family in which conscious sensory content is not re-represented, only tagged with a temporal index and marked as reliable. I argue that the first/higher-order distinction has become so blurred that it is no longer particularly useful. Moreover, the conventional wisdom about animals should not be trusted. It could be, for example, that the distribution of PRM in the animal kingdom is wider than the distribution of global broadcasting.

Seminar · Neuroscience · Recording

Virtual Hallucinations to study the Sense of Reality

Roy Salomon
Bar Ilan U
May 23, 2022
Seminar · Neuroscience · Recording

Neurocognitive mechanisms of enhanced implicit temporal processing in action video game players

Francois R. Foerster
Giersch Lab, INSERM U1114
Feb 22, 2022

Playing action video games involves both explicit (conscious) and implicit (non-conscious) expectations of timed events, such as the appearance of foes. While studies have revealed that explicit attention skills are improved in action video game players (VGPs), their implicit skills remained untested. To this end, we investigated explicit and implicit temporal processing in VGPs and non-VGPs (control participants). In our variable foreperiod task, participants were immersed in a virtual reality environment and instructed to respond to a visual target appearing at variable delays after a cue. I will present behavioral, oculomotor and EEG data and discuss possible markers of the implicit passage of time and of explicit temporal attention processing. All evidence indicates that VGPs have enhanced implicit skills for tracking the passage of time, which do not require conscious attention. Thus, action video game play may improve a form of temporal processing found to be altered in psychopathologies such as schizophrenia. Could digital (game-based) interventions help remediate temporal processing deficits in psychiatric populations?

Seminar · Neuroscience

Attention to visual motion: shaping sensation into perception

Stefan Treue
German Primate Center - Leibniz Institute for Primate Research, Goettingen, Germany
Feb 20, 2022

Evolution has endowed primates, including humans, with a powerful visual system, seemingly providing us with a detailed perception of our surroundings. But in reality the underlying process is one of active filtering, enhancement and reshaping. For visual motion perception, the dorsal pathway in primate visual cortex, and in particular area MT/V5, is considered to be of critical importance. Combining physiological and psychophysical approaches, we have used the processing and perception of visual motion and area MT/V5 as a model for the interaction of sensory (bottom-up) signals with the cognitive (top-down) modulatory influences that characterizes visual perception. Our findings document how this interaction enables visual cortex to actively generate a neural representation of the environment that combines the high-performance sensory periphery with selective modulatory influences to produce an “integrated saliency map” of the environment.

Seminar · Neuroscience · Recording

The effect of gravity on the perception of distance and self-motion: a multisensory perspective

Laurence Harris
Centre for Vision Research, York University, Toronto
Feb 9, 2022

Gravity is a constant in our lives. It provides an internalized reference to which all other perceptions are related. We can experimentally manipulate the relationship between physical gravity and other cues to the direction of “up” using virtual reality - with either HMDs or specially built tilting environments - to explore how gravity contributes to perceptual judgements. The effect of gravity can also be cancelled by running experiments on the International Space Station in low Earth orbit. Changing orientation relative to gravity - or even just perceived orientation - affects your perception of how far away things are (they appear closer when supine or prone). Cancelling gravity altogether has a similar effect. Changing orientation also affects how much visual motion is needed to perceive a particular travel distance (you need less when supine or prone). Adapting to zero gravity has the opposite effect (you need more). These results will be discussed in terms of their practical consequences and the multisensory processes involved, in particular the response to visual-vestibular conflict.

Seminar · Neuroscience · Recording

Distance-tuned neurons drive specialized path integration calculations in medial entorhinal cortex

Alexander Attinger
Giocomo lab, Stanford University
Jan 11, 2022

During navigation, animals estimate their position using path integration and landmarks, engaging many brain areas. Whether these areas follow specialized or universal cue integration principles remains incompletely understood. We combine electrophysiology with virtual reality to quantify cue integration across thousands of neurons in three navigation-relevant areas: primary visual cortex (V1), retrosplenial cortex (RSC), and medial entorhinal cortex (MEC). Compared with V1 and RSC, path integration influences position estimates more in MEC, and conflicts between path integration and landmarks trigger remapping more readily. Whereas MEC codes position prospectively, V1 codes position retrospectively, and RSC is intermediate between the two. Lowered visual contrast increases the influence of path integration on position estimates only in MEC. These properties are most pronounced in a population of MEC neurons, overlapping with grid cells, tuned to distance run in darkness. These results demonstrate the specialized role that path integration plays in MEC compared with other navigation-relevant cortical areas.

Seminar · Neuroscience · Recording

Deforming the metric of cognitive maps distorts memory

Jacob Bellmund
Doeller lab, MPI CBS and the Kavli Institute
Jan 11, 2022

Environmental boundaries anchor cognitive maps that support memory. However, trapezoidal boundary geometry distorts the regular firing patterns of entorhinal grid cells, which have been proposed to provide a metric for cognitive maps. Here, we test the impact of trapezoidal boundary geometry on human spatial memory using immersive virtual reality. Consistent with the reduced regularity of grid patterns in rodents and a grid-cell model based on the eigenvectors of the successor representation, human positional memory was degraded in a trapezoid compared to a square environment, an effect particularly pronounced in the trapezoid’s narrow part. Congruent with the spatial frequency changes of eigenvector grid patterns, distance estimates between remembered positions were persistently biased, revealing distorted memory maps that explained behavior better than the objective maps. Our findings demonstrate that environmental geometry affects human spatial memory similarly to rodent grid cell activity — thus strengthening the putative link between grid cells and behavior, along with their cognitive functions beyond navigation.
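The grid-cell model referred to in this abstract derives grid-like firing from the eigenvectors of the successor representation (SR). A minimal sketch of that construction for a random walk on a 1-D ring of states (a toy under arbitrary parameters, not the model actually fitted in the study):

```python
import numpy as np

n_states, gamma = 40, 0.95

# Transition matrix of a symmetric random walk on a ring.
T = np.zeros((n_states, n_states))
for i in range(n_states):
    T[i, (i - 1) % n_states] = 0.5
    T[i, (i + 1) % n_states] = 0.5

# Successor representation: M = sum_t gamma^t T^t = (I - gamma T)^-1,
# the discounted expected future occupancy of each state.
M = np.linalg.inv(np.eye(n_states) - gamma * T)

# For this translation-invariant environment M is circulant, so its
# eigenvectors are sinusoids: spatially periodic, grid-like profiles.
eigvals, eigvecs = np.linalg.eigh(M)
```

In the abstract's logic, deforming the boundary breaks this symmetry, changing the eigenvectors' spatial frequencies, which is what links geometry to the distance biases measured in memory.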

Seminar · Neuroscience

Body Representation in Virtual Reality

Mel Slater
Universitat de Barcelona
Jan 11, 2022

How the brain represents the body is a fundamental question in cognitive neuroscience. Experimental studies are difficult because ‘the body is always there’ (William James). In recent years immersive virtual reality techniques have been introduced that deliver apparent changes to the body extending earlier techniques such as the rubber hand illusion, or substituting the whole body by a virtual one visually collocated with the real body, and seen from a normal first person perspective. This talk will introduce these techniques, and concentrate on how changing the body can change the mind and behaviour, especially in the context of combatting aggression based on gender or race.

Seminar · Neuroscience · Recording

NMC4 Short Talk: Neurocomputational mechanisms of causal inference during multisensory processing in the macaque brain

Guangyao Qi
Institute of Neuroscience, Chinese Academy of Sciences
Dec 2, 2021

Natural perception relies inherently on inferring causal structure in the environment. However, the neural mechanisms and functional circuits that are essential for representing and updating the hidden causal structure during multisensory processing are unknown. To address this, monkeys were trained to infer the probability of a potential common source from visual and proprioceptive signals on the basis of their spatial disparity in a virtual reality system. The proprioceptive drift reported by monkeys demonstrated that they combined historical information and current multisensory signals to estimate the hidden common source and subsequently updated both the causal structure and sensory representation. Single-unit recordings in premotor and parietal cortices revealed that neural activity in premotor cortex represents the core computation of causal inference, characterizing the estimation and update of the likelihood of integrating multiple sensory inputs at a trial-by-trial level. In response to signals from premotor cortex, neural activity in parietal cortex also represents the causal structure and further dynamically updates the sensory representation to maintain consistency with the causal inference structure. Thus, our results indicate how premotor cortex integrates historical information and sensory inputs to infer hidden variables and selectively updates sensory representations in parietal cortex to support behavior. This dynamic loop of frontal-parietal interactions in the causal inference framework may provide the neural mechanism to answer long-standing questions regarding how neural circuits represent hidden structures for body-awareness and agency.

Seminar · Neuroscience · Recording

NMC4 Short Talk: Novel population of synchronously active pyramidal cells in hippocampal area CA1

Dori Grijseels (they/them)
University of Sussex
Dec 1, 2021

Hippocampal pyramidal cells have been widely studied during locomotion, when theta oscillations are present, and during sharp wave ripples at rest, when replay takes place. However, we find a subset of pyramidal cells that are preferentially active during rest, in the absence of theta oscillations and sharp wave ripples. We recorded these cells using two-photon imaging in dorsal CA1 of the hippocampus of mice, during a virtual reality object location recognition task. During locomotion, the cells show a similar level of activity as control cells, but their activity increases during rest, when this population of cells shows highly synchronous, oscillatory activity at a low frequency (0.1-0.4 Hz). In addition, during both locomotion and rest these cells show place coding, suggesting they may play a role in maintaining a representation of the current location, even when the animal is not moving. We performed simultaneous electrophysiological and calcium recordings, which showed a higher correlation of activity between the LFO and the hippocampal cells in the 0.1-0.4 Hz low frequency band during rest than during locomotion. However, the relationship between the LFO and calcium signals varied between electrodes, suggesting a localized effect. We used the Allen Brain Observatory Neuropixels Visual Coding dataset to further explore this. These data revealed localised low frequency oscillations in CA1 and DG during rest. Overall, we show a novel population of hippocampal cells, and a novel oscillatory band of activity in hippocampus during rest.

Seminar · Neuroscience · Recording

NMC4 Short Talk: Sensory intermixing of mental imagery and perception

Nadine Dijkstra
Wellcome Centre for Human Neuroimaging
Dec 1, 2021

Several lines of research have demonstrated that internally generated sensory experience - such as during memory, dreaming and mental imagery - activates similar neural representations as externally triggered perception. This overlap raises a fundamental challenge: how is the brain able to keep apart signals reflecting imagination and reality? In a series of online psychophysics experiments combined with computational modelling, we investigated to what extent imagination and perception are confused when the same content is simultaneously imagined and perceived. We found that simultaneous congruent mental imagery consistently led to an increase in perceptual presence responses, and that congruent perceptual presence responses were in turn associated with a more vivid imagery experience. Our findings can be best explained by a simple signal detection model in which imagined and perceived signals are added together. Perceptual reality monitoring can then easily be implemented by evaluating whether this intermixed signal is strong or vivid enough to pass a ‘reality threshold’. Our model suggests that, in contrast to self-generated sensory changes during movement, our brain does not discount self-generated sensory signals during mental imagery. This has profound implications for our understanding of reality monitoring and perception in general.
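The additive signal detection model described here is simple enough to state directly: imagery and external input are summed with sensory noise, and a "presence" response is made when the intermixed signal crosses a reality threshold. A sketch of the model class with invented parameters (not the fitted values from the study):

```python
import random

def presence_rate(stimulus, imagery, threshold=1.0, noise_sd=0.5,
                  n_trials=20000, seed=1):
    """Fraction of trials on which stimulus + imagery + Gaussian noise
    exceeds the reality threshold ('this is really out there')."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_trials)
               if stimulus + imagery + rng.gauss(0.0, noise_sd) > threshold)
    return hits / n_trials

weak_stimulus = 0.6
baseline = presence_rate(weak_stimulus, imagery=0.0)
congruent = presence_rate(weak_stimulus, imagery=0.5)
# Congruent imagery adds to the perceptual signal, so presence reports rise:
# in this model, self-generated signals are not discounted during imagery.
```

Because the two sources are simply summed before the threshold check, reality monitoring in this scheme reduces to asking whether the intermixed signal is strong enough, exactly the failure mode the abstract highlights.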

Seminar · Neuroscience · Recording

NMC4 Keynote:

Yuki Kamitani
Kyoto University and ATR
Dec 1, 2021

The brain represents the external world through the bottleneck of sensory organs. The network of hierarchically organized neurons is thought to recover the causes of sensory inputs to reconstruct the reality in the brain in idiosyncratic ways depending on individuals and their internal states. How can we understand the world model represented in an individual’s brain, or the neuroverse? My lab has been working on brain decoding of visual perception and subjective experiences such as imagery and dreaming using machine learning and deep neural network representations. In this talk, I will outline the progress of brain decoding methods and present how subjective experiences are externalized as images and how they could be shared across individuals via neural code conversion. The prospects of these approaches in basic science and neurotechnology will be discussed.

Seminar · Neuroscience · Recording

The Geometry of Decision-Making

Iain Couzin
Max Planck Institute of Animal Behavior & University of Konstanz
Oct 7, 2021

Choosing among spatially distributed options is a central challenge for animals, from deciding among alternative potential food sources or refuges, to choosing with whom to associate. Here, using an integrated theoretical and experimental approach (employing immersive Virtual Reality), with both invertebrate and vertebrate models—the fruit fly, desert locust and zebrafish—we consider the recursive interplay between movement and collective vectorial integration in the brain during decision-making regarding options (potential ‘targets’) in space. We reveal that the brain repeatedly breaks multi-choice decisions into a series of abrupt (critical) binary decisions in space-time where organisms switch, spontaneously, from averaging vectorial information among, to suddenly excluding one of, the remaining options. This bifurcation process repeats until only one option—the one ultimately selected—remains. Close to each bifurcation the ‘susceptibility’ of the system exhibits a sharp increase, inevitably causing small differences among the remaining options to become amplified; a property that both comes ‘for free’ and is highly desirable for decision-making. This mechanism facilitates highly effective decision-making, and is shown to be robust both to the number of options available, and to context, such as whether options are static (e.g. refuges) or mobile (e.g. other animals). In addition, we find evidence that the same geometric principles of decision-making occur across scales of biological organisation, from neural dynamics to animal collectives, suggesting they are fundamental features of spatiotemporal computation.

SeminarNeuroscience

What Art can tell us about the Brain

Margaret Livingstone
Harvard
Oct 4, 2021

Artists have been doing experiments on vision longer than neurobiologists. Some major works of art have provided insights into how we see; some of these insights are so fundamental that they can be understood in terms of the underlying neurobiology. For example, artists have long realized that color and luminance can play independent roles in visual perception. Picasso said, "Colors are only symbols. Reality is to be found in luminance alone." This observation has a parallel in the functional subdivision of our visual systems: color is processed by the evolutionarily newer, primate-specific What system, and luminance by the older, colorblind Where (or How) system. Many techniques developed over the centuries by artists can be understood in terms of the parallel organization of our visual systems. I will explore how the segregation of color and luminance processing is the basis for why some Impressionist paintings seem to shimmer, why some op art paintings seem to move, some principles of Matisse's use of color, and how the Impressionists painted "air". Central and peripheral vision are distinct, and I will show how the differences in resolution across our visual field make the Mona Lisa's smile elusive and produce a dynamic illusion in Pointillist paintings, Chuck Close paintings, and photomosaics. I will explore how artists have figured out important features of how our brains extract relevant information about faces and objects, and I will discuss why learning disabilities may be associated with artistic talent.

SeminarPsychology

Psychological essentialism in working memory research

Satoru Saito
Kyoto University
Oct 4, 2021

Psychological essentialism is ubiquitous. It is one of the primary bases of thought and behaviour throughout our lives. This human tendency to posit an unseen, hidden entity behind observable phenomena or exemplars, however, leads us to biased thinking and reasoning even in the realm of science, including psychology. For example, a latent variable extracted from various measurements is simply a statistical property calculated in structural equation modeling, and is therefore not necessarily a fundamental reality. Yet we occasionally feel that such a psychological construct has an essential nature a priori. This talk will demonstrate examples of psychological essentialism in psychology and examine its influences on working-memory-related issues, e.g., working memory training. These demonstrations, examinations, and the ensuing discussion will provide an opportunity to reconsider the concept of working memory.
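The point about latent variables can be made concrete with a minimal sketch (a hypothetical illustration, not material from the talk; factor analysis stands in here for the structural-equation case). Three simulated "measurements" share one common source of variance, and a one-factor model duly extracts a latent variable, which remains a weighted statistical summary of the observations rather than evidence of a pre-existing essence:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)

# Three hypothetical memory-span measurements, each mixing one common
# source of variance with task-specific noise.
n = 300
common = rng.normal(size=n)                      # the unobserved source
scores = np.column_stack([
    0.8 * common + rng.normal(scale=0.6, size=n),
    0.7 * common + rng.normal(scale=0.7, size=n),
    0.6 * common + rng.normal(scale=0.8, size=n),
])

# A one-factor model readily recovers a 'latent variable' ...
fa = FactorAnalysis(n_components=1, random_state=0).fit(scores)
loadings = fa.components_.ravel()                # one loading per measure

# ... but the factor is defined only by these loadings on the observed
# scores; nothing here licenses treating it as a fundamental reality.
```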

SeminarNeuroscienceRecording

Seeing with technology: Exchanging the senses with sensory substitution and augmentation

Michael Proulx
University of Bath
Sep 29, 2021

What is perception? Our sensory modalities transduce information about the external world into electrochemical signals that somehow give rise to our conscious experience of the environment. There is normally too much information to process in any given moment, and the mechanisms of attention focus the mind's limited resources on some information at the expense of the rest. My research has progressed from first examining visual perception and attention to now examining how multisensory processing contributes to perception and cognition. There are fundamental constraints on how much information can be processed by the different senses, both on their own and in combination. Here I will explore information processing from the perspective of sensory substitution and augmentation, and how "seeing" with the ears and tongue can advance fundamental and translational research.

SeminarOpen SourceRecording

Autopilot v0.4.0 - Distributing development of a distributed experimental framework

Jonny Saunders
University of Oregon
Sep 28, 2021

Autopilot is a Python framework for performing complex behavioral neuroscience experiments by coordinating a swarm of Raspberry Pis. It was designed not only to give researchers a tool for the hardware-intensive experiments required by the next generation of naturalistic neuroscientific observation, but also to make it easier for scientists to be good stewards of the human knowledge project. Specifically, we designed Autopilot as a framework that lets its users contribute their technical expertise to a cumulative library of hardware interfaces and experimental designs, and produce data that is clean at the time of acquisition, lowering barriers to open scientific practices. As Autopilot matures, we have been progressively making these aspirations a reality. We are currently preparing the release of Autopilot v0.4.0, which will include a new plugin system and a wiki that uses semantic web technology to build a technical and contextual knowledge repository. By combining human-readable text and semantic annotations in a wiki that makes contribution as easy as possible, we intend to create a communal knowledge system for sharing the contextual technical knowledge that is always excluded from methods sections but is nonetheless necessary to perform cutting-edge experiments. By integrating it with Autopilot, we hope to build a first-of-its-kind system that allows researchers to fluidly blend technical knowledge and open-source hardware designs with the software necessary to use them. Reciprocally, we also hope that this system will support a kind of deep provenance that makes abstract "custom apparatus" statements in methods sections obsolete, allowing the scientific community to losslessly and effortlessly trace a dataset back to the code and hardware designs needed to replicate it.
I will describe the basic architecture of Autopilot, recent work on its community contribution ecosystem, and the vision for the future of its development.

SeminarOpen SourceRecording

Creating and controlling visual environments using BonVision

Aman Saleem
University College London
Sep 14, 2021

Real-time rendering of closed-loop visual environments is important for next-generation understanding of brain function and behaviour, but is often prohibitively difficult for non-experts to implement and is limited to few laboratories worldwide. We developed BonVision as an easy-to-use open-source software for the display of virtual or augmented reality, as well as standard visual stimuli. BonVision has been tested on humans and mice, and is capable of supporting new experimental designs in other animal models of vision. As the architecture is based on the open-source Bonsai graphical programming language, BonVision benefits from native integration with experimental hardware. BonVision therefore enables easy implementation of closed-loop experiments, including real-time interaction with deep neural networks, and communication with behavioural and physiological measurement and manipulation devices.

SeminarNeuroscienceRecording

Metacognition for past and future decision making in primates

Kentaro Miyamoto
RIKEN CBS
Sep 2, 2021

As Socrates said, "I know that I know nothing": our mind's capacity to be aware of its own ignorance is essential for abstract and conceptual reasoning. However, the biological mechanism that enables such hierarchical thought, or metacognition, remains unknown. In the first part of the talk, I will present our studies on the neural mechanisms of metacognition about memory in macaque monkeys. In reality, awareness of ignorance is essential not only for retrospection about the past but also for the exploration of novel, unfamiliar environments in the future. However, this proactive feature of metacognition has been understudied in neuroscience. In the second part of the talk, I will present our studies on the neural mechanisms of prospective metacognitive matching among uncertain options prior to perceptual decision making in humans and monkeys. These studies converge to suggest that higher-order processes that self-evaluate mental states, either retrospectively or prospectively, are implemented in primate neural networks.

SeminarNeuroscience

Neural circuits that support robust and flexible navigation in dynamic naturalistic environments

Hannah Haberkern
HHMI Janelia Research Campus
Aug 15, 2021

Tracking heading within an environment is a fundamental requirement for flexible, goal-directed navigation. In insects, a head-direction representation that guides the animal’s movements is maintained in a conserved brain region called the central complex. Two-photon calcium imaging of genetically targeted neural populations in the central complex of tethered fruit flies behaving in virtual reality (VR) environments has shown that the head-direction representation is updated based on self-motion cues and external sensory information, such as visual features and wind direction. Thus far, the head direction representation has mainly been studied in VR settings that only give flies control of the angular rotation of simple sensory cues. How the fly’s head direction circuitry enables the animal to navigate in dynamic, immersive and naturalistic environments is largely unexplored. I have developed a novel setup that permits imaging in complex VR environments that also accommodate flies’ translational movements. I have previously demonstrated that flies perform visually-guided navigation in such an immersive VR setting, and also that they learn to associate aversive optogenetically-generated heat stimuli with specific visual landmarks. A stable head direction representation is likely necessary to support such behaviors, but the underlying neural mechanisms are unclear. Based on a connectomic analysis of the central complex, I identified likely circuit mechanisms for prioritizing and combining different sensory cues to generate a stable head direction representation in complex, multimodal environments. I am now testing these predictions using calcium imaging in genetically targeted cell types in flies performing 2D navigation in immersive VR.

SeminarNeuroscience

From real problems to beast machines: the somatic basis of selfhood

Anil Seth
University of Sussex
Jun 29, 2021

At the foundation of human conscious experience lie basic embodied experiences of selfhood – experiences of simply ‘being alive’. In this talk, I will make the case that this central feature of human existence is underpinned by predictive regulation of the interior of the body, using the framework of predictive processing, or active inference. I start by showing how conscious experiences of the world around us can be understood in terms of perceptual predictions, drawing on examples from psychophysics and virtual reality. Then, turning the lens inwards, we will see how the experience of being an ‘embodied self’ rests on control-oriented predictive (allostatic) regulation of the body’s physiological condition. This approach implies a deep connection between mind and life, and provides a new way to understand the subjective nature of consciousness as emerging from systems that care intrinsically about their own existence. Contrary to the old doctrine of Descartes, we are conscious because we are beast machines.

SeminarNeuroscience

The effect of gravity on the perception of distance and self-motion

Laurence Harris
Centre for Vision Research, York University, Toronto, Canada
Apr 18, 2021

Gravity is a constant in our lives. It provides an internalized reference to which all other perceptions are related. We can experimentally manipulate the relationship between physical gravity and other cues to the direction of "up" using virtual reality - with either HMDs or specially built tilting environments - to explore how gravity contributes to perceptual judgements. The effect of gravity can also be cancelled by running experiments on the International Space Station in low Earth orbit. Changing orientation relative to gravity - or even just perceived orientation - affects your perception of how far away things are (they appear closer when supine or prone). Cancelling gravity altogether has a similar effect. Changing orientation also affects how much visual motion is needed to perceive a particular travel distance (you need less when supine or prone). Adapting to zero gravity has the opposite effect (you need more). These results will be discussed in terms of their practical consequences and the multisensory processes involved, in particular the response to visual-vestibular conflict.

SeminarNeuroscienceRecording

Cortical networks for flexible decisions during spatial navigation

Christopher Harvey
Harvard University
Feb 18, 2021

My lab seeks to understand how the mammalian brain performs the computations that underlie cognitive functions, including decision-making, short-term memory, and spatial navigation, at the level of the building blocks of the nervous system, cell types and neural populations organized into circuits. We have developed methods to measure, manipulate, and analyze neural circuits across various spatial and temporal scales, including technology for virtual reality, optical imaging, optogenetics, intracellular electrophysiology, molecular sensors, and computational modeling. I will present recent work that uses large scale calcium imaging to reveal the functional organization of the mouse posterior cortex for flexible decision-making during spatial navigation in virtual reality. I will also discuss work that uses optogenetics and calcium imaging during a variety of decision-making tasks to highlight how cognitive experience and context greatly alter the cortical circuits necessary for navigation decisions.

SeminarNeuroscience

From oscillations to laminar responses - characterising the neural circuitry of autobiographical memories

Eleanor Maguire
Wellcome Centre for Human Neuroimaging at UCL
Nov 30, 2020

Autobiographical memories are the ghosts of our past. Through them we visit places long departed, see faces once familiar, and hear voices now silent. These, often decades-old, personal experiences can be recalled on a whim or come unbidden into our everyday consciousness. Autobiographical memories are crucial to cognition because they facilitate almost everything we do, endow us with a sense of self and underwrite our capacity for autonomy. They are often compromised by common neurological and psychiatric pathologies with devastating effects. Despite autobiographical memories being central to everyday mental life, there is no agreed model of autobiographical memory retrieval, and we lack an understanding of the neural mechanisms involved. This precludes principled interventions to manage or alleviate memory deficits, and to test the efficacy of treatment regimens. This knowledge gap exists because autobiographical memories are challenging to study – they are immersive, multi-faceted, multi-modal, can stretch over long timescales and are grounded in the real world. One missing piece of the puzzle concerns the millisecond neural dynamics of autobiographical memory retrieval. Surprisingly, there are very few magnetoencephalography (MEG) studies examining such recall, despite the important insights this could offer into the activity and interactions of key brain regions such as the hippocampus and ventromedial prefrontal cortex. In this talk I will describe a series of MEG studies aimed at uncovering the neural circuitry underpinning the recollection of autobiographical memories, and how this changes as memories age. 
I will end by describing our progress on leveraging an exciting new technology – optically pumped MEG (OP-MEG) which, when combined with virtual reality, offers the opportunity to examine millisecond neural responses from the whole brain, including deep structures, while participants move within a virtual environment, with the attendant head motion and vestibular inputs.

SeminarNeuroscienceRecording

Human cognitive biases and the role of dopamine

Makiko Yamada
National Institutes for Quantum and Radiological Science and Technology
Nov 26, 2020

Cognitive bias is a "subjective reality" uniquely created in the brain that affects many of our behaviors. It can lead to what behavioral economics broadly calls irrationality, such as inaccurate judgment and illogical interpretation, but it also has an adaptive aspect in terms of mental hygiene. If cognitive bias is regarded as a product of information processing in the brain, then clarifying its brain mechanisms helps reveal the direct relations between the brain and the mind. In my talk, I will introduce our studies investigating the neural and molecular bases of cognitive biases, focusing especially on the role of dopamine.

SeminarNeuroscience

Experience dependent changes of sensory representation in the olfactory cortex

Antonia Marin Burgin
Biomedicine Research Institute of Buenos Aires
Nov 17, 2020

Sensory representations are typically thought of as neuronal activity patterns that encode physical attributes of the outside world. However, increasing evidence shows that as animals learn the association between a sensory stimulus and its behavioral relevance, stimulus representations in sensory cortical areas can change. In this seminar I will present recent experiments from our lab showing that activity in the olfactory piriform cortex (PC) of mice encodes not only odor information but also non-olfactory variables associated with the behavioral task. By developing an associative olfactory learning task, in which animals learn to associate a particular context with an odor and a reward, we were able to record the activity of multiple neurons as the animal runs in a virtual reality corridor. Analyzing the population activity dynamics with Principal Component Analysis, we find distinct population trajectories evolving through time that discriminate aspects of different trial types. Using Generalized Linear Models, we further dissected the contribution of different sensory and non-sensory variables to the modulation of PC activity. Interestingly, the experiments show that variables related to both sensory and non-sensory aspects of the task (e.g., odor, context, reward, licking, sniffing rate, and running speed) differentially modulate PC activity, suggesting that the PC adapts odor processing depending on experience and behavior.
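The two-stage analysis described above can be sketched on synthetic data (a hypothetical illustration, not the lab's pipeline; all variable names and parameters are invented). PCA projects population activity onto a few components to visualize trajectories, and a per-neuron Poisson GLM estimates how task variables modulate firing:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import PoissonRegressor

rng = np.random.default_rng(0)

# Synthetic session: T time bins, N neurons, with activity driven by an
# odor indicator, running speed, and licking (all hypothetical variables).
T, N = 500, 40
odor = (np.arange(T) % 100) < 20           # odor on for 20 bins per trial
speed = rng.gamma(2.0, 1.0, size=T)
lick = rng.random(T) < 0.1
X = np.column_stack([odor, speed, lick]).astype(float)

# Each neuron mixes the task variables and adds Poisson variability.
weights = rng.normal(0.0, 0.5, size=(3, N))
rates = np.clip(np.exp(0.5 + X @ weights), 0, 50)
spikes = rng.poisson(rates)                # shape (T, N)

# Population trajectories: project activity onto the top components.
pca = PCA(n_components=3)
trajectories = pca.fit_transform(spikes)   # shape (T, 3)

# GLM per neuron: which task variables modulate its firing?
coefs = np.array([
    PoissonRegressor(alpha=1e-3).fit(X, spikes[:, n]).coef_
    for n in range(N)
])                                         # shape (N, 3): one weight per variable
```

Trial-type differences would then show up as separated PCA trajectories, and the GLM coefficients quantify each variable's contribution per neuron, mirroring the logic of the analysis in the abstract.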

SeminarPhysics of Life

Motility control in biological microswimmers

Kirsty Wan
University of Exeter
Sep 29, 2020

It is often assumed that biological swimmers conform faithfully to certain stereotypes assigned to them by physicists and mathematicians, when the reality is in fact much more complicated. In this talk we will use a combination of theory, experiments, and robotics, to understand the physical and evolutionary basis of motility control in a number of distinguished organisms. These organisms differ markedly in terms of their size, shape, and arrangement of locomotor appendages, but are united in their use of cilia - the ultimate shape-shifting organelle - to achieve self-propulsion and navigation.

SeminarNeuroscience

Revealing the neural basis of human memory with direct recordings of place and grid cells and traveling waves

Joshua Jacobs
Columbia University
May 12, 2020

The ability to remember spatial environments is critical for everyday life. In this talk, I will discuss my lab’s findings on how the human brain supports spatial memory and navigation based on our experiments with direct brain recordings from neurosurgical patients performing virtual-reality spatial memory tasks. I will show that humans have a network of neurons that represent where we are located and trying to go. This network includes some cell types that are similar to those seen in animals, such as place and grid cells, as well as others that have not been seen before in animals, such as anchor and spatial-target cells. I also will explore the role of network oscillations in human memory, where humans again show several distinctive patterns compared to animals. Whereas rodents generally show a hippocampal oscillation at ~8Hz, humans have two separate hippocampal oscillations, at low and high frequencies, which support memory and navigation, respectively. Finally, I will show that neural oscillations in humans are traveling waves, propagating across the cortex, to coordinate the timing of neuronal activity across regions, which is another property not seen in animals. A theme from this work is that in terms of navigation and memory the human brain has novel characteristics compared with animals, which helps explain our rich behavioural abilities and has implications for treating disease and neurological disorders.

ePoster

Analysis of gaze control neuronal circuits combining behavioural experiments with a novel virtual reality platform

Carmen Núñez-González, Marta Barandela, Cecilia Jiménez-López, Abraham Segade, Juan Pérez-Fernández

FENS Forum 2024

ePoster

Blurring the line between imagination and reality: Motor imagery influences performance of linked movements

Magdalena Gippert, Pei-Cheng Shih, Tobias Heed, Ian Howard, Mina Jamshidi, Arno Villringer, Bernhard Sehm, Vadim Nikulin

FENS Forum 2024

ePoster

Comparison of acetylcholine release in the mouse cerebral cortex in response to standard visual stimuli vs dynamic virtual reality environment

Julie Azrak, Hossein Sedighi, Jose Daniel Tirado Ramirez, Yulong Li, Elvire Vaucher

FENS Forum 2024

ePoster

Effectiveness of action observation treatment integrated with virtual reality in the motor rehabilitation of stroke patients: A randomized controlled clinical trial

Antonino Errante, Donatella Saviola, Matteo Cantoni, Katia Iannuzzelli, Settimio Ziccarelli, Fabrizio Togni, Marcello Simonini, Carolina Malchiodi, Debora Bertoni, Maria Grazia Inzaghi, Francesca Bozzetti, Annamaria Quarenghi, Paola Quarenghi, Daniele Bosone, Leonardo Fogassi, Giovanni Pietro Salvi, Antonio De Tanti

FENS Forum 2024

ePoster

Feasibility and compatibility of combining virtual reality and transcranial magnetic stimulation

Franka Arden, Phil Henneken, Andreas Vlachos

FENS Forum 2024

ePoster

Hippocampal place field formation by sparse, local learning of visual features in virtual reality

Olivier Ulrich, Lorenzo Posani, Attila Losonczy, Stefano Fusi, James Priestley

FENS Forum 2024

ePoster

The impact of virtual reality on postoperative cognitive impairment and pain perception after surgery

Sebastian Isac, Andrada-Georgiana Badea, Ana-Maria Zagrean, Elisabeta Nita, Diana Irene Mihai, Damiana Ojog, Pavel Bogdan, Teodora Isac, Gabriela Droc

FENS Forum 2024

ePoster

Modulation of brain activity by environmental design: A study using EEG and virtual reality

Jesus S. Garcia Salinas, Anna Wroblewska, Katarzyna Zielonko-Jung, Michał Kucewicz

FENS Forum 2024

ePoster

Multimodal activity of mouse auditory cortex during audio-visual-motor virtual reality

Alessandro La Chioma, David Schneider

FENS Forum 2024

ePoster

Virtual reality empowered deep learning analysis of brain cells

Doris Kaltenecker, Rami Al-Maskari, Moritz Negwer, Luciano Hoeher, Kofler Florian, Shan Zhao, Mihail Todorov, Zhouyi Rong, Johannes Christian Paetzold, Benedikt Wiestler, Marie Piraud, Daniel Rueckert, Julia Geppert, Pauline Morigny, Maria Rohm, Bjoern H. Menze, Stephan Herzig, Mauricio Berriel Diaz, Ali Ertürk

FENS Forum 2024

ePoster

A virtual-reality task to investigate multisensory object recognition in mice

Veronique Stokkers, Guido T Meijer, Smit Zayel, Jeroen J Bos, Francesco P Battaglia

FENS Forum 2024

ePoster

Visual feedback manipulation in virtual reality alters movement-evoked pain perception in chronic low back pain

Jaime Jordán López, María D. Arguisuelas, Julio Doménech, María L. Peñalver-Barrios, Marta Miragall, Rocío Herrero, Rosa M. Baños, Juan J. Amer-Cuenca, Juan F. Lisón

FENS Forum 2024

ePoster

‘What a Mistake!’: Prediction error modulates explicit and visuomotor predictions in virtual reality

Yonatan Stern

Neuromatch 5