Authors & Affiliations
Angel Canelo, Sungyong Kim, Anmo Kim
Abstract
Flying insects can process multiple visual features in parallel neural circuits and generate an appropriate action. Neural processing of singly presented visual patterns has been studied intensively in Drosophila for the past few decades. How, then, are parallel visual circuits that respond to different features in a single visual scene integrated to control a shared motor circuit? An influential theory for combining multiple sensorimotor circuits is the efference copy mechanism, in which an intended action offsets other sensory circuits to prevent them from responding to the reafferent sensory inputs caused by that action. Recent studies in Drosophila have identified efference copy-like signals in an array of motion-sensitive visual neurons that mediate visual stability reflexes. Using a dynamical systems approach, we implemented two computational models that combine the stability reflex with spontaneous or other visually evoked flight controls, such as object tracking and avoidance. The models demonstrate that the visual stability reflex dampens both spontaneous and visual object-induced flight turns when combined additively, and that modulation of the stability reflex by an efference copy permits undamped, concurrent operation of multiple visual behaviors. Finally, we show that a simple supervised learning model can adjust its efference copy to match variations in sensory feedback associated with changes in internal or environmental variables. Our study provides an integrative model of vision-based flight control when multiple visual features are presented simultaneously and may be extended to an adaptive flight control mechanism for artificial flying agents such as drones.
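The contrast drawn in the abstract — an additive stability reflex dampens intended turns, while an efference copy that cancels predicted reafference leaves them intact — can be illustrated with a minimal first-order sketch. All names, gains, time constants, and the delta-rule gain update below are illustrative assumptions for exposition, not the paper's actual model:

```python
import numpy as np

dt = 0.01               # integration step (s)
steps = int(2.0 / dt)
k_reflex = 5.0          # stability-reflex gain (assumed)
tau = 0.1               # turn-dynamics time constant (assumed)

def simulate(copy_gain):
    """Simulate a commanded flight turn. copy_gain scales the efference copy
    of the motor command subtracted from the visual input before the reflex
    acts (0.0 = purely additive combination, no copy)."""
    omega = 0.0         # angular velocity (turn rate)
    trace = []
    for t in range(steps):
        command = 1.0 if t * dt < 1.0 else 0.0   # intended spontaneous turn
        reafference = omega                       # self-generated retinal slip
        # The reflex opposes whatever slip survives the efference-copy offset.
        reflex = -k_reflex * (reafference - copy_gain * command)
        omega += (-omega + command + reflex) / tau * dt
        trace.append(omega)
    return np.array(trace)

additive = simulate(0.0)   # reflex fights the commanded turn: damped
with_copy = simulate(1.0)  # copy offsets predicted reafference: full turn

# By t = 1 s the additive model settles near command/(1 + k_reflex),
# while the copy-modulated model reaches the full commanded turn rate.
print(additive[99], with_copy[99])

# A simple supervised adjustment of the copy gain (delta rule, assumed):
# the gain is updated until the predicted reafference matches the actual
# sensory feedback per unit motor command.
gain, lr = 0.0, 0.1
for _ in range(200):
    command = 1.0
    feedback = 1.0 * command              # actual reafference per unit command
    gain += lr * (feedback - gain * command) * command
print(gain)                               # converges toward 1.0
```

The steady states follow directly from the update rule: with no copy, omega settles at command / (1 + k_reflex); with a unit copy, the reflex term vanishes at omega = command, so the intended turn proceeds undamped.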