27 research outputs found

    Emergent spatial goals in an integrative model of the insect central complex

    The insect central complex appears to encode and process spatial information through vector manipulation. Here, we draw on recent insights into circuit structure to fuse previous models of sensory-guided navigation, path integration and vector memory. Specifically, we propose that the allocentric encoding of location provided by path integration creates a spatially stable anchor for converging sensory signals that is relevant in multiple behavioural contexts. The allocentric reference frame given by path integration transforms a goal direction into a goal location, and we demonstrate through modelling that it can enhance approach of a sensory target in noisy, cluttered environments or with temporally sparse stimuli. We further show that the same circuit can improve performance in the more complex navigational task of route following. The model suggests specific functional roles for circuit elements of the central complex, helping to explain why these elements are so highly conserved across insect species.
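The central transformation described here, using the allocentric position estimate from path integration to turn a transient goal direction into a persistent goal location, can be sketched in a few lines. This is a minimal illustrative sketch, not the paper's circuit model: the function names, the fixed assumed distance, and the assumption that the goal direction is already expressed in the allocentric frame are all simplifications.

```python
import math

def store_goal(position, goal_direction, assumed_distance=1.0):
    """Anchor a sensed goal direction as an allocentric goal location.

    `position` is the agent's path-integration estimate (x, y);
    `goal_direction` is the bearing to the target in the allocentric
    frame (radians). The fixed `assumed_distance` is an illustrative
    simplification.
    """
    return (position[0] + assumed_distance * math.cos(goal_direction),
            position[1] + assumed_distance * math.sin(goal_direction))

def steer_to_goal(position, goal_location):
    """Heading command toward the stored goal location, usable even
    when the sensory cue is temporarily absent (sparse or occluded
    stimuli), because the stored location is spatially stable."""
    return math.atan2(goal_location[1] - position[1],
                      goal_location[0] - position[0])

# The cue is seen once and anchored; after the agent drifts, steering
# still points at the remembered location without further sensory input.
goal = store_goal((0.0, 0.0), math.pi / 2)   # cue sensed due "north"
heading = steer_to_goal((0.5, 0.0), goal)    # after drifting east
```

The key property is that `goal` does not change as the agent moves, so the steering command remains well defined between sensory contacts.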

    A unified mechanism for innate and learned visual landmark guidance in the insect central complex

    Insects can navigate efficiently in both novel and familiar environments, and this requires flexibility in how they are guided by sensory cues. A prominent landmark, for example, can elicit strong innate behaviours (attraction or menotaxis) but can also be used, after learning, as a specific directional cue as part of a navigation memory. However, the mechanisms that allow both pathways to co-exist, interact or override each other are largely unknown. Here we propose a model for the behavioural integration of innate and learned guidance based on the neuroanatomy of the central complex (CX), adapted to control landmark-guided behaviours. We consider a reward signal, provided either by an innate attraction to landmarks or by a long-term visual memory in the mushroom bodies (MB), that modulates the formation of a local vector memory in the CX. Using an operant strategy for a simulated agent exploring a simple world containing a single visual cue, we show how the generated short-term memory can support both innate and learned steering behaviour. In addition, we show how this architecture is consistent with the observed effects of unilateral MB lesions in ants, which cause a reversion to innate behaviour. We suggest that the formation of a directional memory in the CX can be interpreted as transforming rewarding (positive or negative) sensory signals into a mapping of the environment that describes its geometrical attractiveness (or repulsiveness). We discuss how this scheme might represent an ideal way to combine multisensory information gathered during the exploration of an environment and support optimal cue integration.
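The core operant idea, a reward signal (innate landmark attraction or a learned MB valuation) gating the formation of a directional short-term memory, can be caricatured as a reward-weighted vector sum of experienced headings. The accumulator below is a minimal sketch under that assumption, not the CX synaptic model itself; the class and method names are invented for illustration.

```python
import math

class VectorMemory:
    """Reward-gated directional memory: headings experienced while the
    reward signal is high dominate the stored vector."""
    def __init__(self):
        self.x = 0.0
        self.y = 0.0

    def update(self, heading, reward):
        # The reward can come from innate landmark attraction or from a
        # learned mushroom-body valuation; both gate the same memory.
        self.x += reward * math.cos(heading)
        self.y += reward * math.sin(heading)

    def preferred_direction(self):
        """Direction of the accumulated vector (radians)."""
        return math.atan2(self.y, self.x)

mem = VectorMemory()
# Three visited headings; the one at pi/4 coincides with high reward.
for heading, reward in [(0.0, 0.1), (math.pi / 4, 1.0), (math.pi, 0.05)]:
    mem.update(heading, reward)
# The strongly rewarded heading dominates the stored direction.
```

A negative reward would subtract from the vector, which is one way to read the abstract's "geometrical attractiveness (or repulsiveness)" mapping.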

    A motion compensation treadmill for untethered wood ants (Formica rufa): evidence for transfer of orientation memories from free-walking training

    The natural scale of insect navigation during foraging makes it challenging to study under controlled conditions. Virtual reality and trackball setups have offered experimental control over visual environments while studying tethered insects, but the potential limitations and confounds introduced by tethering motivate the development of untethered alternatives. In this paper, we validate the use of a motion compensator (or ‘treadmill’) to study visually driven behaviour of freely moving wood ants (Formica rufa). We show how this setup allows naturalistic walking behaviour and preserves foraging motivation over long time frames. Furthermore, we show that ants are able to transfer associative and navigational memories from classical maze and arena contexts to our treadmill. Thus, we demonstrate the possibility of studying navigational behaviour over ecologically relevant durations (and virtual distances) in precisely controlled environments, bridging the gap between natural and highly controlled laboratory experiments.

    Mushroom bodies are required for learnt visual navigation, but not for innate visual behaviour, in ants

    Visual navigation in ants has long been a focus of experimental study [1, 2, 3], but only recently have explicit hypotheses about the underlying neural circuitry been proposed [4]. Indirect evidence suggests the mushroom bodies (MBs) may be the substrate for visual memory in navigation tasks [5, 6, 7], while computational modelling shows that MB neural architecture could support this function [8, 9]. There is, however, no direct evidence that ants require MBs for visual navigation. Here we show that lesions of the MB calyces impair ants’ visual navigation to a remembered food location yet leave their innate responses to visual cues unaffected. Wood ants are innately attracted to large visual cues, but we trained them to locate a food source at a specific angle away from such a cue. Subsequent lesioning of the MB calyces by procaine hydrochloride injection caused ants to revert toward their innate cue attraction, whereas handling and saline-injection control ants still approached the feeder. Path straightness of lesioned and control ants did not differ from each other but was lower than during training. Reversion toward the cue direction occurred irrespective of whether the visual cue was ipsi- or contralateral to the lesion site, showing that this is not due simply to an induced motor bias. Monocular occlusion did not diminish ants’ ability to locate the feeder, suggesting that MB lesions are not merely interrupting visual input to the calyx. The demonstrated dissociation between innate and learned visual responses provides direct evidence for a specific role of the MB in navigational memory.

    Modeling visual-based pitch, lift and speed control strategies in hoverflies

    To avoid crashing onto the floor, a free-falling fly needs to trigger its wingbeats quickly and control the orientation of its thrust accurately and swiftly to stabilize its pitch and hence its speed. Behavioural data have suggested that the vertical optic flow produced by the fall and crossing the visual field plays a key role in this anti-crash response. Free-fall behaviour analyses have also suggested that flying insects may not rely on graviception to stabilize their flight. Based on these two assumptions, we have developed a model which accounts for hoverflies' position and pitch orientation recorded in 3D with a fast stereo camera during experimental free falls. Our dynamic model shows that optic-flow-based control combined with closed-loop control of the pitch suffices to stabilize the flight properly. In addition, our model sheds new light on the visual-based feedback control of the fly's pitch, lift and thrust. Since graviceptive cues are possibly not used by flying insects, the use of a vertical reference to control the pitch is discussed, based on the results obtained with a complete dynamic model of a virtual fly falling in a textured corridor. This model provides a useful tool for understanding more clearly how insects may or may not estimate their absolute attitude.
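The flavour of optic-flow-based flight control can be illustrated with a one-dimensional toy simulation: ventral optic flow is ground speed divided by height, and lift is adjusted to hold that flow at a setpoint. The dynamics, gains and setpoint below are illustrative assumptions, not the paper's fitted hoverfly model, which also includes pitch and thrust-orientation control.

```python
# Minimal 1-D sketch of optic-flow-regulated lift during a fall.
# All parameter values are illustrative assumptions.
g, m, dt = 9.81, 1.0, 0.001   # gravity, unit mass, time step (s)
h, v = 2.0, 0.0               # height (m), descent speed (m/s, down +)
omega_set, gain = 1.0, 20.0   # optic-flow setpoint (rad/s) and gain

for _ in range(5000):         # 5 s of simulated descent
    omega = v / h             # ventral optic flow: speed over height
    lift = m * g + gain * (omega - omega_set)  # regulate flow error
    v += (g - lift / m) * dt  # Newton: downward accel = g - lift/m
    h = max(h - v * dt, 0.01) # keep height strictly positive

# Holding omega at the setpoint ties descent speed to height
# (v ~ omega_set * h), so the fly slows as the ground approaches
# and the free fall is counteracted rather than ending in a crash.
```

Note that the controller never measures height or gravity directly; the optic-flow ratio alone couples speed to height, which is why such schemes are candidates for insects lacking graviception in flight.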

    Visuo-inertial stabilization in flies : looking for an accelerometer

    Despite limited neuronal resources and a visual system with low spatial resolution, flying insects display extraordinary flight capabilities; some species can hover for long periods in front of flowers or conspecifics. This contrast between limited resources and complex behaviour makes insects ideal models for understanding sensorimotor transformations in neuronal systems, and in flies it is the sheer speed of the sensorimotor processes controlling flight that has drawn scientists' attention. Many studies have demonstrated the ability of Diptera to perceive movement, through vision or the halteres (organs acting as a three-axis gyrometer), and to compensate for the many perturbations that could destabilize their flight. These compensatory mechanisms, however, are subject to an accumulation of errors over time that could lead to a crash. In this thesis we therefore asked: do Diptera possess a vertical reference that informs them of their absolute orientation in space and contributes to flight control? First, by analogy with the vertebrate inner ear, we explored the hypothesis of an inertial sense in flying insects, developing a free-fall setup adapted to small insects to test whether they can detect the free-fall state inertially, as an accelerometer or inner ear would. We then studied in detail the visual mechanisms contributing to flight stabilization, first evaluating a strategy based solely on the perception of visual motion: optic flow, which has been extensively studied in insects. A model of cruising-flight control showed that the well-known optic-flow regulation mechanism described in bees tends to counteract free fall. Finally, we looked for static cues to the vertical in the visual environment, among which the horizon is a prime candidate, and showed the importance of the distribution of light in the environment for ensuring flight stability in hoverflies.

    Model_dataset


    Central complex EPG-PFL3 modulation model simulation data

    Central complex model simulation data. This dataset groups together the different experiments conducted on an EPG-PFL3 synaptic modulation model of the insect central complex to explain landmark-guidance behaviour. Each folder includes all the XY-theta data and the neuron activity for the different conditions tested. Goulard, Roman. (2021). Central complex EPG-PFL3 modulation model simulation data, [dataset]. University of Edinburgh. School of Informatics. Institute of Perception, Action & Behaviour. https://doi.org/10.7488/ds/2999
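A trajectory file from such a dataset could be parsed along the following lines. The column order (x, y, theta) and comma delimiter are assumptions for illustration; the dataset description only states that each folder contains XY-theta data, so the real file layout may differ.

```python
import csv
import io

def load_xy_theta(fileobj):
    """Parse XY-theta trajectory rows from a file-like object.

    Assumes comma-delimited columns in the order x, y, theta; the
    dataset's actual layout may differ.
    """
    xs, ys, thetas = [], [], []
    for row in csv.reader(fileobj):
        if not row:
            continue  # skip blank lines
        x, y, theta = map(float, row[:3])
        xs.append(x)
        ys.append(y)
        thetas.append(theta)
    return xs, ys, thetas

# Synthetic stand-in for one of the dataset's trajectory files:
fake = io.StringIO("0.0,0.0,0.0\n1.0,0.5,0.1\n2.0,1.0,0.2\n")
x, y, theta = load_xy_theta(fake)
```

To read a real file, one would pass `open(path)` instead of the `StringIO` stand-in.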

    To crash or not to crash: how do hoverflies cope with free-fall situations and weightlessness?

    Insects' aptitude to perform hovering, automatic landing and tracking tasks involves accurately controlling their head and body roll and pitch movements, but how this attitude control depends on an internal estimate of gravity orientation is still an open question. Gravity perception in flying insects has mainly been studied in terms of grounded animals' tactile orientation responses, and it has not yet been established whether hoverflies use gravity perception cues to detect a nearly weightless state at an early stage. Ground-based microgravity simulators provide biologists with useful tools for studying the effects of changes in gravity; however, in view of the cost and complexity of these set-ups, an alternative Earth-based free-fall procedure was developed with which flying insects can be briefly exposed to microgravity under various visual conditions. Hoverflies frequently initiated wingbeats in response to an imposed free fall in all the conditions tested, but managed to avoid crashing only in variably structured visual environments, and only episodically in darkness. The crash-avoidance performance of these insects across visual environments suggests the existence of a multisensory control system based mainly on vision rather than gravity perception.