
    The influence of visual landscape on the free flight behavior of the fruit fly Drosophila melanogaster

    To study the visual cues that control steering behavior in the fruit fly Drosophila melanogaster, we reconstructed three-dimensional trajectories from images taken by stereo infrared video cameras during free flight within structured visual landscapes. Flies move through their environment using a series of straight flight segments separated by rapid turns, termed saccades, during which the fly alters course by approximately 90° in less than 100 ms. Altering the amount of background visual contrast caused significant changes in the fly’s translational velocity and saccade frequency. Between saccades, asymmetries in the estimates of optic flow induce gradual turns away from the side experiencing a greater motion stimulus, a behavior opposite to that predicted by a flight control model based upon optomotor equilibrium. To determine which features of visual motion trigger saccades, we reconstructed the visual environment from the fly’s perspective for each position in the flight trajectory. From these reconstructions, we modeled the fly’s estimation of optic flow on the basis of a two-dimensional array of Hassenstein–Reichardt elementary motion detectors and, through spatial summation, the large-field motion stimuli experienced by the fly during the course of its flight. Event-triggered averages of the large-field motion preceding each saccade suggest that image expansion is the signal that triggers each saccade. The asymmetry in output of the local motion detector array prior to each saccade influences the direction (left versus right) but not the magnitude of the rapid turn. Once initiated, visual feedback does not appear to influence saccade kinematics further. The total expansion experienced before a saccade was similar for flight within both uniform and visually textured backgrounds. In summary, our data suggest that complex behavioral patterns seen during free flight emerge from interactions between the flight control system and the visual environment.
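    The elementary motion detector named in the abstract can be sketched compactly. Below is a minimal single Hassenstein–Reichardt correlator in Python; the time constant, sampling step, and function names are illustrative assumptions, not the authors' implementation, which used a full two-dimensional detector array with spatial summation.

        import numpy as np

        def hassenstein_reichardt(left, right, tau=35e-3, dt=1e-3):
            # Minimal Hassenstein-Reichardt correlator for two neighboring
            # photoreceptor brightness signals sampled at interval dt.
            # A first-order low-pass filter (time constant tau) serves as
            # the delay stage of each arm.
            alpha = dt / (tau + dt)
            delayed_l = np.zeros_like(left)
            delayed_r = np.zeros_like(right)
            for t in range(1, len(left)):
                delayed_l[t] = delayed_l[t-1] + alpha * (left[t] - delayed_l[t-1])
                delayed_r[t] = delayed_r[t-1] + alpha * (right[t] - delayed_r[t-1])
            # Correlate each delayed signal with the neighbor's direct signal
            # and subtract the mirror-symmetric product: the sign of the
            # result encodes the direction of local motion.
            return delayed_l * right - delayed_r * left

        # A sinusoid drifting from 'left' toward 'right' gives a positive mean:
        t = np.arange(0.0, 1.0, 1e-3)
        resp = hassenstein_reichardt(np.sin(2 * np.pi * 5 * t),
                                     np.sin(2 * np.pi * 5 * t - 0.2))
        print(resp.mean() > 0)   # True for this direction of motion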

    Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action

    Egelhaaf M, Boeddeker N, Kern R, Kurtz R, Lindemann JP. Spatial vision in insects is facilitated by shaping the dynamics of visual input through behavioral action. Frontiers in Neural Circuits. 2012;6:108.

    Insects such as flies or bees, with their miniature brains, are able to control highly aerobatic flight maneuvers and to solve spatial vision tasks, such as avoiding collisions with obstacles, landing on objects, or even localizing a previously learnt inconspicuous goal on the basis of environmental cues. With regard to solving such spatial tasks, these insects still outperform man-made autonomous flying systems. To accomplish their extraordinary performance, flies and bees have been shown to actively shape the dynamics of the image flow on their eyes ("optic flow") through characteristic behavioral actions. The neural processing of information about the spatial layout of the environment is greatly facilitated by segregating the rotational from the translational optic flow component through a saccadic flight and gaze strategy. This active vision strategy thus enables the nervous system to solve apparently complex spatial vision tasks in a particularly efficient and parsimonious way. The key idea of this review is that biological agents, such as flies or bees, acquire at least part of their strength as autonomous systems through active interactions with their environment and not by simply processing passively gained information about the world. These agent-environment interactions lead to adaptive behavior in surroundings of a wide range of complexity. Animals with even tiny brains, such as insects, are capable of performing extraordinarily well in their behavioral contexts by making optimal use of the closed action-perception loop. Model simulations and robotic implementations show that the smart biological mechanisms of motion computation and visually-guided flight control might be helpful in finding technical solutions, for example, when designing micro air vehicles carrying a miniaturized, low-weight on-board processor.
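    The segregation the review describes exploits a basic geometric fact: rotational image flow is independent of distance, while translational flow scales with nearness. A minimal sketch of that fact, assuming a planar world and a single horizontal viewing direction (all symbols and numbers are illustrative, not the review's model):

        import numpy as np

        def flow_magnitude(azimuth, v, omega, distance):
            # Horizontal optic flow at a given azimuth (rad) for forward
            # speed v (m/s), yaw rate omega (rad/s), and distance to the
            # viewed point (m). The rotational term is distance-independent;
            # only the translational term carries information about nearness.
            translational = (v / distance) * np.sin(azimuth)
            rotational = omega
            return translational + rotational

        # During a pure-translation intersaccadic segment (omega = 0),
        # halving the distance doubles the flow, so flow encodes nearness:
        print(flow_magnitude(np.pi / 2, v=0.5, omega=0.0, distance=1.0))  # 0.5
        print(flow_magnitude(np.pi / 2, v=0.5, omega=0.0, distance=0.5))  # 1.0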

    Generalized Regressive Motion: a Visual Cue to Collision

    Brains and sensory systems evolved to guide motion. Central to this task is controlling the approach to stationary obstacles and detecting moving organisms. Looming has been proposed as the main monocular visual cue for detecting the approach of other animals and avoiding collisions with stationary obstacles. Elegant neural mechanisms for looming detection have been found in the brain of insects and vertebrates. However, looming has not been analyzed in the context of collisions between two moving animals. We propose an alternative strategy, Generalized Regressive Motion (GRM), which is consistent with recently observed behavior in fruit flies. Geometric analysis proves that GRM is a reliable cue to collision among conspecifics, whereas agent-based modeling suggests that GRM is a better cue than looming as a means to detect approach, prevent collisions and maintain mobility.
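    A minimal sketch of the cue, under the simplifying assumption of planar motion and a pointlike target: a target whose retinal image drifts against the direction of the observer's self-induced, front-to-back background flow is flagged as regressive. The paper's geometric analysis is more general, and the predicate name below is ours, not the authors'.

        import numpy as np

        def self_flow_sign(azimuth, v=1.0):
            # Sign of the translation-induced background flow at a given
            # azimuth (rad, 0 = straight ahead); positive means the image
            # drifts front-to-back during forward flight.
            return np.sign(v * np.sin(azimuth))

        def is_generalized_regressive(azimuth, target_angular_velocity, v=1.0):
            # Simplified reading of the GRM cue: the target's image moves
            # opposite to the self-induced flow at its current azimuth.
            background = self_flow_sign(azimuth, v)
            return background != 0 and np.sign(target_angular_velocity) == -background

        # A target 60 degrees to the side drifting front-to-back matches the
        # background flow (progressive); drifting back-to-front is regressive:
        az = np.deg2rad(60)
        print(is_generalized_regressive(az, +0.2))  # False (progressive)
        print(is_generalized_regressive(az, -0.2))  # True  (regressive)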

    Binocular interactions underlying the classic optomotor responses of flying flies.

    In response to imposed course deviations, the optomotor reactions of animals reduce motion blur and facilitate the maintenance of stable body posture. In flies, many anatomical and electrophysiological studies suggest that disparate motion cues stimulating the left and right eyes are not processed in isolation but rather are integrated in the brain to produce a cohesive panoramic percept. To investigate the strength of such inter-ocular interactions and their role in compensatory sensory-motor transformations, we utilize a virtual reality flight simulator to record wing and head optomotor reactions by tethered flying flies in response to imposed binocular rotation and monocular front-to-back and back-to-front motion. Within a narrow range of stimulus parameters that generates large, contrast-insensitive optomotor responses to binocular rotation, we find that responses to monocular front-to-back motion are larger than those to panoramic rotation, but are contrast sensitive. Conversely, responses to monocular back-to-front motion are slower than those to rotation and peak at the lowest tested contrast. Together, our results suggest that optomotor responses to binocular rotation result from the influence of non-additive contralateral inhibitory as well as excitatory circuit interactions that serve to confer contrast insensitivity to flight behaviors influenced by rotatory optic flow.
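    As one toy illustration of how a non-additive contralateral interaction can compress contrast dependence, consider divisively inhibiting each eye's signal by the magnitude of the other's. This is only a schematic, with made-up gains; the abstract does not specify the circuit, and this is not the authors' model.

        def monocular_response(contrast, direction, gain=1.0):
            # Toy monocular motion signal: magnitude grows with contrast,
            # sign encodes front-to-back (+1) vs back-to-front (-1) motion.
            return gain * contrast * direction

        def binocular_turn(left, right, inhibition=3.0):
            # Hypothetical non-additive combination: each eye's signal is
            # divisively inhibited by the other's magnitude, so the response
            # to coherent rotation grows much more slowly with contrast than
            # a purely additive sum would.
            l = left / (1 + inhibition * abs(right))
            r = right / (1 + inhibition * abs(left))
            return l - r

        # Binocular rotation: front-to-back on one eye, back-to-front on the other.
        for c in (0.1, 0.5, 1.0):
            additive = monocular_response(c, +1) - monocular_response(c, -1)
            combined = binocular_turn(monocular_response(c, +1),
                                      monocular_response(c, -1))
            print(c, additive, round(combined, 3))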

    Visual control of flight speed in Drosophila melanogaster

    Flight control in insects depends on self-induced image motion (optic flow), which the visual system must process to generate appropriate corrective steering maneuvers. Classic experiments in tethered insects applied rigorous system identification techniques for the analysis of turning reactions in the presence of rotating pattern stimuli delivered in open-loop. However, the functional relevance of these measurements for visual free-flight control remains equivocal due to the largely unknown effects of the highly constrained experimental conditions. To perform a systems analysis of the visual flight speed response under free-flight conditions, we implemented a 'one-parameter open-loop' paradigm using 'TrackFly' in a wind tunnel equipped with real-time tracking and virtual reality display technology. Upwind flying flies were stimulated with sine gratings of varying temporal and spatial frequencies, and the resulting flight speed reactions were measured. To control flight speed, the visual system of the fruit fly extracts linear pattern velocity robustly over a broad range of spatio-temporal frequencies. The speed signal is used for a proportional control of flight speed within locomotor limits. The extraction of pattern velocity over a broad spatio-temporal frequency range may require more sophisticated motion processing mechanisms than those identified in flies so far. In Drosophila, the neuromotor pathways underlying flight speed control may be suitably explored by applying advanced genetic techniques, for which our data can serve as a baseline. Finally, the high-level control principles identified in the fly can be meaningfully transferred into a robotic context, such as for the robust and efficient control of autonomous flying micro air vehicles.
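    The proportional control law the abstract describes can be sketched in a few lines. The set point, gain, and locomotor limit below are illustrative numbers, not measured parameters, and the assumption that perceived pattern velocity scales with flight speed holds only for fixed surroundings.

        def simulate_speed_control(flow_setpoint=0.3, gain=0.5, steps=100,
                                   v0=0.0, v_max=1.0, dt=0.1):
            # Closed-loop sketch: speed is corrected in proportion to the
            # error between the flow set point and the perceived pattern
            # velocity, clamped to locomotor limits.
            v = v0
            for _ in range(steps):
                perceived = v                  # flow ~ speed for fixed geometry
                error = flow_setpoint - perceived
                v = min(max(v + gain * error * dt, 0.0), v_max)
            return v

        print(simulate_speed_control())        # converges toward ~0.3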

    Answer Set Programming Modulo 'Space-Time'

    We present ASP Modulo 'Space-Time', a declarative representational and computational framework to perform commonsense reasoning about regions with both spatial and temporal components. Supported are capabilities for mixed qualitative-quantitative reasoning, consistency checking, and inferring compositions of space-time relations; these capabilities combine and synergise for applications in a range of AI application areas where the processing and interpretation of spatio-temporal data is crucial. The framework and the resulting system together constitute the only general KR-based method for declaratively reasoning about the dynamics of 'space-time' regions as first-class objects. We present an empirical evaluation (with scalability and robustness results), and include diverse application examples involving interpretation and control tasks.
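    To make "inferring compositions of space-time relations" concrete, here is a toy composition-table lookup over a small fragment of Allen's interval relations. The actual system encodes such knowledge declaratively in ASP; this Python snippet is only a schematic analogue, and the table covers just a handful of entries.

        # Composing r1(A, B) with r2(B, C) constrains the set of possible
        # relations between A and C.
        COMPOSE = {
            ("before", "before"): {"before"},
            ("before", "meets"):  {"before"},
            ("meets",  "before"): {"before"},
            ("meets",  "meets"):  {"before"},
            ("during", "during"): {"during"},
        }

        def compose(r1, r2):
            # Possible relations between A and C given r1(A, B) and r2(B, C).
            return COMPOSE.get((r1, r2), {"unknown"})

        # If A meets B and B occurs before C, then A occurs before C:
        print(compose("meets", "before"))      # {'before'}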

    Learning recurrent representations for hierarchical behavior modeling

    We propose a framework for detecting action patterns from motion sequences and modeling the sensory-motor relationship of animals, using a generative recurrent neural network. The network has a discriminative part (classifying actions) and a generative part (predicting motion), whose recurrent cells are laterally connected, allowing higher levels of the network to represent high level phenomena. We test our framework on two types of data, fruit fly behavior and online handwriting. Our results show that 1) taking advantage of unlabeled sequences, by predicting future motion, significantly improves action detection performance when training labels are scarce, 2) the network learns to represent high level phenomena such as writer identity and fly gender, without supervision, and 3) simulated motion trajectories, generated by treating motion prediction as input to the network, look realistic and may be used to qualitatively evaluate whether the model has learnt generative control rules.
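    A minimal sketch of the two heads over a shared recurrent state, in plain NumPy. All dimensions, names, and the single-cell structure are illustrative; the paper's network stacks laterally connected recurrent cells and is trained, whereas this untrained skeleton only shows the data flow, including the feed-predictions-back simulation loop.

        import numpy as np

        rng = np.random.default_rng(0)

        class BehaviorRNN:
            def __init__(self, d_motion=4, d_hidden=32, n_actions=8):
                s = 1.0 / np.sqrt(d_hidden)
                self.W_in = rng.normal(0, s, (d_hidden, d_motion))
                self.W_h = rng.normal(0, s, (d_hidden, d_hidden))
                self.W_act = rng.normal(0, s, (n_actions, d_hidden))  # discriminative head
                self.W_mot = rng.normal(0, s, (d_motion, d_hidden))   # generative head
                self.h = np.zeros(d_hidden)

            def step(self, motion):
                # One recurrent update; both heads read the shared state.
                self.h = np.tanh(self.W_in @ motion + self.W_h @ self.h)
                action_scores = self.W_act @ self.h   # classify current action
                next_motion = self.W_mot @ self.h     # predict upcoming motion
                return action_scores, next_motion

            def simulate(self, seed_motion, steps=10):
                # Generate a trajectory by feeding predictions back as input,
                # as in the paper's qualitative evaluation of learnt rules.
                motion, path = seed_motion, []
                for _ in range(steps):
                    _, motion = self.step(motion)
                    path.append(motion)
                return np.stack(path)

        net = BehaviorRNN()
        print(net.simulate(rng.normal(size=4)).shape)   # (10, 4)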

    Collective behaviour without collective order in wild swarms of midges

    Collective behaviour is a widespread phenomenon in biology, cutting through a huge span of scales, from cell colonies up to bird flocks and fish schools. The most prominent trait of collective behaviour is the emergence of global order: individuals synchronize their states, giving the stunning impression that the group behaves as one. In many biological systems, though, it is unclear whether global order is present. A paradigmatic case is that of insect swarms, whose erratic movements seem to suggest that group formation is a mere epiphenomenon of the independent interaction of each individual with an external landmark. In these cases, whether or not the group behaves truly collectively is debated. Here, we experimentally study swarms of midges in the field and measure how much the change of direction of one midge affects that of other individuals. We discover that, despite the lack of collective order, swarms display very strong correlations, totally incompatible with models of noninteracting particles. We find that correlation increases sharply with the swarm's density, indicating that the interaction between midges is based on a metric perception mechanism. By means of numerical simulations we demonstrate that such growing correlation is typical of a system close to an ordering transition. Our findings suggest that correlation, rather than order, is the true hallmark of collective behaviour in biological systems.
    Comment: The original version has been split into two parts. This first part focuses on order vs. correlation. The second part, about finite-size scaling, will be included in a separate paper. 15 pages, 6 figures, 1 table, 5 videos.
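    The distinction between order and correlation can be made computationally concrete: the polarization order parameter can be near zero while the connected correlation of velocity fluctuations remains large. A minimal sketch, with an illustrative distance bin and toy data (the paper's estimators on real trajectories are more careful):

        import numpy as np

        def polarization(velocities):
            # Global order parameter: length of the mean flight direction,
            # near 0 for a disordered swarm and near 1 for an aligned flock.
            unit = velocities / np.linalg.norm(velocities, axis=1, keepdims=True)
            return np.linalg.norm(unit.mean(axis=0))

        def fluctuation_correlation(positions, velocities, r, dr=0.1):
            # Connected correlation of velocity fluctuations between pairs at
            # mutual distance ~r; strong values despite low polarization are
            # the paper's signature of collectivity without order.
            dv = velocities - velocities.mean(axis=0)
            dists = np.linalg.norm(positions[:, None] - positions[None, :], axis=2)
            mask = (np.abs(dists - r) < dr) & ~np.eye(len(positions), dtype=bool)
            if not mask.any():
                return np.nan
            return (dv @ dv.T)[mask].mean() / (dv * dv).sum(axis=1).mean()

        # Independent random velocities give low polarization AND near-zero
        # correlation; real midge swarms pair low polarization with strong
        # correlation, which is the point of the paper.
        rng = np.random.default_rng(1)
        pos = rng.uniform(0.0, 1.0, (200, 3))
        vel = rng.normal(0.0, 1.0, (200, 3))
        print(polarization(vel), fluctuation_correlation(pos, vel, 0.3))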