Topic spotlight

VR

Discover seminars, jobs, and research tagged with VR across World Wide.
19 curated items · 12 Seminars · 6 ePosters · 1 Position
19 items · VR
Seminar · Psychology

Conversations with Caves? Understanding the role of visual psychological phenomena in Upper Palaeolithic cave art making

Izzy Wisher
Aarhus University
Feb 25, 2024

How central were psychological features deriving from our visual systems to the early evolution of human visual culture? Art making emerged deep in our evolutionary history, with the earliest art appearing over 100,000 years ago as geometric patterns etched on fragments of ochre and shell, and figurative representations of prey animals flourishing in the Upper Palaeolithic (c. 40,000 – 15,000 years ago). The latter reflects a complex visual process: the ability to represent something that exists in the real world as a flat, two-dimensional image. In this presentation, I argue that pareidolia – the psychological phenomenon of seeing meaningful forms in random patterns, such as perceiving faces in clouds – was a fundamental process that facilitated the emergence of figurative representation. The influence of pareidolia has often been anecdotally observed in Upper Palaeolithic art, particularly cave art, where the topographic features of the cave wall were incorporated into animal depictions. Using novel virtual reality (VR) light simulations, I tested three hypotheses relating to pareidolia in the Upper Palaeolithic cave art of Las Monedas and La Pasiega (Cantabria, Spain). To evaluate this further, I also developed an interdisciplinary VR eye-tracking experiment, in which participants were immersed in virtual caves based on the cave of El Castillo (Cantabria, Spain). Together, these case studies suggest that pareidolia was an intrinsic part of artist-cave interactions (‘conversations’) that influenced the form and placement of figurative depictions in the cave. This has broader implications for conceiving of the role of visual psychological phenomena in the emergence and development of figurative art in the Palaeolithic.

Seminar · Psychology

Use of Artificial Intelligence by Law Enforcement Authorities in the EU

Vangelis Zarkadoulas
Cyber & Data Security Lab, Vrije Universiteit Brussel
Oct 29, 2023

Recently, artificial intelligence (AI) has become a global priority. Rapid and ongoing technological advancements in AI have prompted European legislative initiatives to regulate its use. In April 2021, the European Commission submitted a proposal for a Regulation that would harmonize artificial intelligence rules across the EU, including the law enforcement sector. Consequently, law enforcement officials await the outcome of the ongoing inter-institutional negotiations (trilogue) with great anticipation, as it will define how to capitalize on the opportunities presented by AI and how to prevent criminals from abusing this emergent technology.

Seminar · Neuroscience · Recording

Anticipating behaviour through working memory (BACN Early Career Prize Lecture 2023)

Freek van Ede
Vrije Universiteit Amsterdam, Netherlands
Sep 11, 2023

Working memory is about the past but for the future. Adopting such a future-focused perspective shifts the narrative of working memory as a limited-capacity storage system to working memory as an anticipatory buffer that helps us prepare for potential and sequential upcoming behaviour. In my talk, I will present a series of our recent studies that have started to reveal emerging principles of a working memory that looks forward – highlighting, amongst others, how selective attention plays a vital role in prioritising internal contents for behaviour, and the bi-directional links between visual working memory and action. These studies show how studying the dynamics of working memory, selective attention, and action together paves the way for an integrated understanding of how the mind serves behaviour.

Seminar · Neuroscience

The Geometry of Decision-Making

Iain Couzin
University of Konstanz, Germany
May 23, 2023

Running, swimming, or flying through the world, animals are constantly making decisions while on the move—decisions that allow them to choose where to eat, where to hide, and with whom to associate. Despite this, most studies have considered only the outcome of, and the time taken to make, decisions. Motion is, however, crucial in terms of how space is represented by organisms during spatial decision-making. Employing a range of new technologies, including automated tracking, computational reconstruction of sensory information, and immersive ‘holographic’ virtual reality (VR) for animals, in experiments with fruit flies, locusts and zebrafish (representing aerial, terrestrial and aquatic locomotion, respectively), I will demonstrate that this time-varying representation results in the emergence of new and fundamental geometric principles that considerably impact decision-making. Specifically, we find that the brain spontaneously reduces multi-choice decisions into a series of abrupt (‘critical’) binary decisions in space-time, a process that repeats until only one option—the one ultimately selected by the individual—remains. Due to the critical nature of these transitions (and the corresponding increase in ‘susceptibility’), even noisy brains are extremely sensitive to very small differences between the remaining options (e.g., a very small difference in neuronal activity being in “favor” of one option) near these locations in space-time. This mechanism facilitates highly effective decision-making, and is shown to be robust both to the number of options available and to context, such as whether options are static (e.g. refuges) or mobile (e.g. other animals). In addition, we find evidence that the same geometric principles of decision-making occur across scales of biological organisation, from neural dynamics to animal collectives, suggesting they are fundamental features of spatiotemporal computation.
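
The bifurcation mechanism described in this abstract can be caricatured in a few lines: an agent steers toward the average direction of the remaining options and, whenever two options come to subtend more than a critical angle, noise tips it into dropping one—a sequence of binary decisions in space. This is a hypothetical toy sketch, not the speaker's model; the critical angle, step size, and pruning rule are all illustrative assumptions.

```python
import numpy as np

def simulate_decision(targets, step=0.05, critical_angle=np.pi / 2, seed=0):
    """Toy spatial decision model: steer toward the mean direction of the
    remaining options; when two options subtend more than the critical
    angle, noise tips the agent into dropping one (a binary bifurcation)."""
    rng = np.random.default_rng(seed)
    pos = np.zeros(2)
    options = [np.asarray(t, float) for t in targets]
    path = [pos.copy()]
    for _ in range(10000):
        dirs = np.array([(t - pos) / np.linalg.norm(t - pos) for t in options])
        # prune: while the most-separated pair exceeds the critical angle,
        # commit (noisily) to one side of the bifurcation
        while len(options) > 1:
            cosines = dirs @ dirs.T
            i, j = np.unravel_index(np.argmin(cosines), cosines.shape)
            if np.arccos(np.clip(cosines[i, j], -1.0, 1.0)) <= critical_angle:
                break
            drop = int(rng.choice([i, j]))   # tiny noise tips the choice
            options.pop(drop)
            dirs = np.delete(dirs, drop, axis=0)
        heading = dirs.mean(axis=0)
        heading /= np.linalg.norm(heading)
        pos = pos + step * heading
        path.append(pos.copy())
        if min(np.linalg.norm(t - pos) for t in options) < step:
            break
    return np.array(path), [tuple(t) for t in options]

# Three options collapse to two, then to one, as the agent moves
path, chosen = simulate_decision([(1.0, 1.0), (1.0, -1.0), (2.0, 0.0)])
```

The repeated pruning is the point: the multi-choice problem is never solved all at once, but as a cascade of pairwise commitments at particular locations in space.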

Seminar · Neuroscience · Recording

Does subjective time interact with the heart rate?

Saeedeh Sadegh
Cornell University, New York
Jan 24, 2023

Decades of research have investigated the relationship between the perception of time and heart rate, often with mixed results. In search of such a relationship, I will present my journey across two projects: from time perception in a realistic VR experience of crowded subway trips lasting on the order of minutes (project 1), to the perceived duration of sub-second white-noise tones (project 2). Heart rate had multiple concurrent relationships with subjective temporal distortions for the sub-second tones, while the effects were weak or absent for the supra-minute subway trips. What does the heart have to do with sub-second time perception? We addressed this question with a cardiac drift-diffusion model, demonstrating the sensory accumulation of temporal evidence as a function of heart rate.
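
The abstract does not spell out the cardiac drift-diffusion model's parameterization, but its core idea—temporal evidence accumulating at a rate modulated by heart rate—can be sketched as a standard drift-diffusion process. The baseline, heart-rate gain, and noise level below are illustrative assumptions, not the speaker's fitted values.

```python
import numpy as np

def cardiac_ddm(duration, heart_rate, base_drift=1.0, hr_gain=0.005,
                noise=0.3, dt=0.001, seed=0):
    """Toy drift-diffusion accumulator for duration estimation in which the
    drift (pacemaker) rate is modulated by heart rate. All parameters are
    hypothetical stand-ins for the cardiac DDM named in the abstract."""
    rng = np.random.default_rng(seed)
    n = int(round(duration / dt))
    drift = base_drift * (1 + hr_gain * (heart_rate - 60))  # 60 bpm baseline
    increments = drift * dt + noise * np.sqrt(dt) * rng.standard_normal(n)
    return increments.sum()  # total accumulated temporal evidence

# Under this toy parameterization, a higher heart rate accumulates more
# evidence for the same 500 ms interval, so the interval feels longer.
slow = np.mean([cardiac_ddm(0.5, 55, seed=s) for s in range(200)])
fast = np.mean([cardiac_ddm(0.5, 90, seed=s) for s in range(200)])
```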

Seminar · Neuroscience

From natural scene statistics to multisensory integration: experiments, models and applications

Cesare Parise
Oculus VR
Feb 8, 2022

To efficiently process sensory information, the brain relies on statistical regularities in the input. While generally improving the reliability of sensory estimates, this strategy also induces perceptual illusions that help reveal the underlying computational principles. Focusing on auditory and visual perception, in my talk I will describe how the brain exploits statistical regularities within and across the senses for the perception of space and time and for multisensory integration. In particular, I will show how results from a series of psychophysical experiments can be interpreted in the light of Bayesian Decision Theory, and I will demonstrate how such canonical computations can be implemented in simple and biologically plausible neural circuits. Finally, I will show how such principles of sensory information processing can be leveraged in virtual and augmented reality to overcome display limitations and expand human perception.
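
The Bayesian account of multisensory integration referenced here has a standard textbook core: with independent Gaussian noise, the optimal combined estimate is a reliability-weighted average of the single-cue estimates, with lower variance than either cue alone. A generic sketch of that rule (not the speaker's specific model):

```python
def fuse(mu_a, var_a, mu_v, var_v):
    """Maximum-likelihood (reliability-weighted) fusion of an auditory and a
    visual estimate: each cue is weighted by its inverse variance, and the
    fused variance is the harmonic combination of the two."""
    w_v = (1 / var_v) / (1 / var_v + 1 / var_a)
    mu = w_v * mu_v + (1 - w_v) * mu_a
    var = 1 / (1 / var_v + 1 / var_a)
    return mu, var

# A noisy auditory cue (variance 4) and a sharp visual cue (variance 1):
mu, var = fuse(mu_a=10.0, var_a=4.0, mu_v=12.0, var_v=1.0)
# fused mean ≈ 11.6 (pulled toward the reliable visual cue), variance 0.8
```

The same weighting also predicts illusions such as visual capture of auditory location: when vision is far more reliable, the fused estimate sits almost entirely on the visual cue.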

Seminar · Neuroscience · Recording

Towards a Theory of Human Visual Reasoning

Ekaterina Shurkova
University of Edinburgh
Oct 13, 2021

Many tasks that are easy for humans are difficult for machines. In particular, while humans excel at tasks that require generalising across problems, machine systems notably struggle. One such task that has received considerable attention is the Synthetic Visual Reasoning Test (SVRT). The SVRT consists of a range of problems in which simple visual stimuli must be categorised into one of two categories based on an unknown rule that must be induced. Conventional machine learning approaches perform well only when trained to categorise based on a single rule and are unable to generalise, without extensive additional training, to tasks with any additional rules. Multiple theories of higher-level cognition posit that humans solve such tasks using structured relational representations. Specifically, people learn rules based on structured representations that generalise to novel instances quickly and easily. We believe it is possible to model this approach in a single system which learns all the required relational representations from scratch and performs tasks such as SVRT in a single run. Here, we present a system which expands the DORA/LISA architecture and augments the existing model with principally novel components, namely a) visual reasoning based on the established theories of recognition by components; b) the process of learning complex relational representations by synthesis (in addition to learning by analysis). The proposed augmented model matches human behaviour on SVRT problems. Moreover, the proposed system is arguably a more realistic account of human cognition: rather than using tools that have proven successful in the machine learning field to inform psychological theorising, we use established psychological theories to inform the development of a machine system.

Seminar · Open Source · Recording

PiVR: An affordable and versatile closed-loop platform to study unrestrained sensorimotor behavior

David Tadres and Matthieu Louis
University of California, Santa Barbara
Sep 2, 2021

PiVR is a system that allows experimenters to immerse small animals into virtual realities. The system tracks the position of the animal and presents light stimulation according to predefined rules, thus creating a virtual landscape in which the animal can behave. By using optogenetics, we have used PiVR to present fruit fly larvae with virtual olfactory realities, adult fruit flies with a virtual gustatory reality and zebrafish larvae with a virtual light gradient. PiVR operates at high temporal resolution (70Hz) with low latencies (<30 milliseconds) while being affordable (<US$500) and easy to build (<6 hours). Through extensive documentation (www.PiVR.org), this tool was designed to be accessible to a wide public, from high school students to professional researchers studying systems neuroscience in academia.
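
The closed-loop logic described above is simple enough to sketch: read the animal's position, evaluate the virtual landscape at that position, set the light, and hold the frame rate. Everything below (`track`, `set_led`, the Gaussian source rule) is a hypothetical stand-in, not PiVR's actual API; see www.PiVR.org for the real interface.

```python
import math
import time

def run_closed_loop(track, set_led, rule, fps=70, n_frames=70):
    """PiVR-style closed loop sketch: position in, light stimulus out, at
    roughly `fps` Hz. `track`, `set_led`, and `rule` are hypothetical
    callables standing in for the camera tracker, the LED driver, and the
    virtual-landscape rule."""
    dt = 1.0 / fps
    for _ in range(n_frames):
        t0 = time.monotonic()
        x, y = track()             # detect the animal's position this frame
        set_led(rule(x, y))        # light intensity from the virtual landscape
        remaining = dt - (time.monotonic() - t0)
        if remaining > 0:          # sleep out the rest of the frame period
            time.sleep(remaining)

# Example rule: a virtual Gaussian odor source centred at the origin,
# as might be presented optogenetically to a fly larva
def gaussian_source(x, y, sigma=5.0):
    return math.exp(-(x * x + y * y) / (2 * sigma * sigma))
```

In the real system the per-frame work (tracking plus LED update) must fit inside the ~14 ms frame budget, which is what the quoted <30 ms latency refers to.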

Seminar · Neuroscience

Neural circuits that support robust and flexible navigation in dynamic naturalistic environments

Hannah Haberkern
HHMI Janelia Research Campus
Aug 15, 2021

Tracking heading within an environment is a fundamental requirement for flexible, goal-directed navigation. In insects, a head-direction representation that guides the animal’s movements is maintained in a conserved brain region called the central complex. Two-photon calcium imaging of genetically targeted neural populations in the central complex of tethered fruit flies behaving in virtual reality (VR) environments has shown that the head-direction representation is updated based on self-motion cues and external sensory information, such as visual features and wind direction. Thus far, the head direction representation has mainly been studied in VR settings that only give flies control of the angular rotation of simple sensory cues. How the fly’s head direction circuitry enables the animal to navigate in dynamic, immersive and naturalistic environments is largely unexplored. I have developed a novel setup that permits imaging in complex VR environments that also accommodate flies’ translational movements. I have previously demonstrated that flies perform visually-guided navigation in such an immersive VR setting, and also that they learn to associate aversive optogenetically-generated heat stimuli with specific visual landmarks. A stable head direction representation is likely necessary to support such behaviors, but the underlying neural mechanisms are unclear. Based on a connectomic analysis of the central complex, I identified likely circuit mechanisms for prioritizing and combining different sensory cues to generate a stable head direction representation in complex, multimodal environments. I am now testing these predictions using calcium imaging in genetically targeted cell types in flies performing 2D navigation in immersive VR.

Seminar · Neuroscience · Recording

Imperial Neurotechnology 2021 - Annual Research Symposium

Yulong Li, Christos Kapatos, Mary Ann Go, Sonja Hofer, Oscar Bates, Christian Wilms
Peking University, SERG Technologies, Imperial College, UCL, Scientifica Ltd
Jul 6, 2021

A diverse mix of neurotechnology talks from academic and industry colleagues plus presentations from our MRes Neurotechnology students. Visit our event page to find out more and register now!

Seminar · Neuroscience · Recording

Investigating the sun compass in monarch butterflies (Danaus plexippus)

Tu Anh Nguyen Thi
el Jundi lab, University of Würzburg
Jun 1, 2021

Every autumn, monarch butterflies migrate from North America to their overwintering sites in Central Mexico. To maintain their southward direction, these butterflies rely on celestial cues as orientation references. The position of the sun combined with additional skylight cues are integrated in the central complex, a region in the butterfly’s brain that acts as an internal compass. However, the central complex does not solely guide the butterflies on their migration but also helps monarchs in their non-migratory form manoeuvre on foraging trips through their habitat. By comparing the activity of input neurons of the central complex between migratory and non-migratory butterflies, we investigated how a different lifestyle affects the coding of orientation information in the brain.

Seminar · Neuroscience · Recording

The emergence of contrast invariance in cortical circuits

Tatjana Tchumatchenko
Max Planck Institute for Brain Research
Nov 12, 2020

Neurons in the primary visual cortex (V1) encode the orientation and contrast of visual stimuli through changes in firing rate (Hubel and Wiesel, 1962). Their activity typically peaks at a preferred orientation and decays to zero at the orientations orthogonal to the preferred one. This activity pattern is re-scaled by contrast but its shape is preserved, a phenomenon known as contrast invariance. Contrast-invariant selectivity is also observed at the population level in V1 (Carandini and Sengpiel, 2004). The mechanisms supporting the emergence of contrast invariance at the population level remain unclear. How does the activity of different neurons with diverse orientation selectivity and non-linear contrast sensitivity combine to give rise to contrast-invariant population selectivity? Theoretical studies have shown that in the balanced limit, the properties of single neurons do not determine the population activity (van Vreeswijk and Sompolinsky, 1996). Instead, the synaptic dynamics (Mongillo et al., 2012) as well as the intracortical connectivity (Rosenbaum and Doiron, 2014) shape the population activity in balanced networks. We report that short-term plasticity can change the synaptic strength between neurons as a function of the presynaptic activity, which in turn modifies the population response to a stimulus. Thus, the same circuit can process a stimulus in different ways (linearly, sublinearly, supralinearly) depending on the properties of the synapses. We found that balanced networks with excitatory-to-excitatory short-term synaptic plasticity cannot be contrast-invariant. Instead, short-term plasticity modifies the network selectivity such that the tuning curves are narrower (broader) for increasing contrast if synapses are facilitating (depressing). Based on these results, we wondered whether balanced networks with plastic synapses (other than short-term) can support the emergence of contrast-invariant selectivity.
Mathematically, we found that the only synaptic transformation that supports perfect contrast invariance in balanced networks is a power-law release of neurotransmitter as a function of the presynaptic firing rate (in the excitatory-to-excitatory and in the excitatory-to-inhibitory connections). We validate this finding using spiking network simulations, where we report contrast-invariant tuning curves when synapses release neurotransmitter following a power-law function of the presynaptic firing rate. In summary, we show that synaptic plasticity controls the type of non-linear network response to stimulus contrast and that it can be a potential mechanism mediating the emergence of contrast invariance in balanced networks with orientation-dependent connectivity. Our results therefore connect the physiology of individual synapses to the network level and may help us understand the establishment of contrast-invariant selectivity.
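
The invariance argument has a simple algebraic core: if the population response is separable, r(θ, c) = g(c)·f(θ), then a power-law transform maps it to g(c)^γ · f(θ)^γ, which is still a contrast gain times a θ-only shape, so the normalized tuning curve is unchanged. The toy numerical check below illustrates only this separability property; the tuning shape, contrast values, and exponent are illustrative, not the speakers' network parameters.

```python
import numpy as np

theta = np.linspace(-np.pi / 2, np.pi / 2, 181)

def tuning(theta, contrast, kappa=2.0):
    """A contrast-invariant rate: the same orientation profile f(theta),
    rescaled by a contrast-dependent gain (here simply g(c) = c)."""
    f = np.exp(kappa * (np.cos(2 * theta) - 1))   # orientation profile
    return contrast * f

def synapse(rate, gamma=1.5):
    """Power-law release: maps c*f(theta) to (c**gamma) * f(theta)**gamma,
    still a product of a contrast gain and a theta-only shape, so the
    normalized tuning curve survives the transformation."""
    return rate ** gamma

low, high = tuning(theta, 0.2), tuning(theta, 0.8)
out_low, out_high = synapse(low), synapse(high)
# normalized input and output tuning curves coincide across contrasts
```

A non-power-law transform (e.g. a saturating one) breaks this separability, which is the intuition behind why saturating or facilitating short-term plasticity narrows or broadens tuning with contrast.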

ePoster

Development of a novel VR-based system for quantitative assessment of freezing of gait in Parkinson disease

Shota Emoto, Kazuhide Seo, Toru Ishii, Kanako Abe, Masahito Miki, Toshimasa Yamamoto, Masayuki Hara

FENS Forum 2024

ePoster

Effects of VRK1 deficiency on the neurophysiology and behavior of zebrafish

Magdeline Carrasco Apolinario, Ryohei Umeda, Hitoshi Teranishi, Mengting Shan, Phurpa Phurpa, Nobuyuki Shimizu, Hiroshi Shiraishi, Kenshiro Shikano, Takatoshi Hikida, Toshikatsu Hanada, Reiko Hanada

FENS Forum 2024

ePoster

Modelling Koolen-de Vries syndrome in neural organoids

Brooke Latour, Spencer Shute, Iris Teunnissen van Manen, Jolanthe Lingeman, Katrin Linda, Anouk Verboven, Emma Dyke, Chantal Schoenmaker, Haico van Attikum, Nael Nadif Kasri

FENS Forum 2024

ePoster

Neuronal activities during a VR-based assessment for Autism Spectrum Disorder: A pilot EEG study

Chun-Chuan Chen, Yu Jung Tseng, Hui-Ju Chen, Tzu-Ling Lin, Yu-Hsin Huang, Shih-Ching Yeh, Eric Hsiao Kuang Wu

FENS Forum 2024

ePoster

Unraveling human escape planning: The impact of environmental cues on escape behavior in VR

Lukas Kornemann, Sajjad Zabbah, Yonatan Hutabarat, Dominik R. Bach

FENS Forum 2024

ePoster

VRK2 deficiency elicits aggressive behavior in female zebrafish

Ryohei Umeda, Nobuyuki Shimizu, Hitoshi Teranishi, Kenshiro Shikano, Hirotaro Urushibata, Hiroshi Shiraishi, Takatoshi Hikida, Toshikatsu Hanada, Reiko Hanada

FENS Forum 2024