
    Contrast sensitivity of insect motion detectors to natural images

    How do animals regulate self-movement despite large variation in the luminance contrast of the environment? Insects are capable of regulating flight speed based on the velocity of image motion, but the mechanisms for this are unclear. The Hassenstein–Reichardt correlator model and its elaborations can accurately predict the responses of motion-detecting neurons under many conditions but fail to explain the apparent lack of spatial-pattern and contrast dependence observed in freely flying bees and flies. To investigate this apparent discrepancy, we recorded intracellularly from horizontal-sensitive (HS) motion-detecting neurons in the hoverfly while displaying moving images of natural environments. Contrary to results obtained with grating patterns, we show that these neurons encode the velocity of natural images largely independently of the particular image used, despite a threefold range of contrast. This invariance in response to natural images is observed in both strongly and minimally motion-adapted neurons but is sensitive to artificial manipulations of contrast. Current models of these cells account for some, but not all, of the observed insensitivity to image contrast. We conclude that fly visual processing may be matched to commonalities between natural scenes, enabling accurate estimates of velocity largely independent of the particular scene.
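    The contrast dependence at issue can be seen directly in the correlator model. The sketch below is a minimal Hassenstein–Reichardt detector in plain NumPy (the time constant and all parameters are illustrative, not fitted to fly data); its mean output scales with the product of the two input amplitudes, i.e. with contrast squared, which is why the contrast invariance reported above is not predicted by the basic model.

    ```python
    import numpy as np

    def reichardt_correlator(signal_a, signal_b, dt, tau=0.035):
        """Basic Hassenstein-Reichardt correlator: each photoreceptor
        signal is low-pass filtered (delayed) and multiplied with the
        undelayed signal from the neighboring receptor; the two
        mirror-symmetric half-detectors are then subtracted."""
        alpha = dt / (tau + dt)          # first-order low-pass coefficient
        lp_a = np.zeros_like(signal_a)
        lp_b = np.zeros_like(signal_b)
        for t in range(1, len(signal_a)):
            lp_a[t] = lp_a[t - 1] + alpha * (signal_a[t] - lp_a[t - 1])
            lp_b[t] = lp_b[t - 1] + alpha * (signal_b[t] - lp_b[t - 1])
        # positive mean output for motion from receptor A toward receptor B
        return lp_a * signal_b - lp_b * signal_a
    ```

    Doubling the amplitude of both inputs quadruples the mean output, so a raw correlator is strongly contrast dependent, unlike the HS responses described above.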

    Vision Egg: an Open-Source Library for Realtime Visual Stimulus Generation

    Modern computer hardware makes it possible to produce visual stimuli in ways not previously possible. Arbitrary scenes, from traditional sinusoidal gratings to naturalistic 3D scenes, can now be specified on a frame-by-frame basis in realtime. Here we describe the Vision Egg, a programming library that aims to make it easy to take advantage of these innovations. The Vision Egg is a free, open-source library making use of OpenGL and written in the high-level language Python with extensions in C. Careful attention has been paid to the issues of luminance and temporal calibration, and several techniques for interfacing with input devices such as mice, movement-tracking systems, and digital triggers are discussed. Together, these features make the Vision Egg suitable for many psychophysical, electrophysiological, and behavioral experiments. This software is available for free download at visionegg.org.
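    As a rough illustration of the kind of frame-by-frame, luminance-calibrated computation such a library performs, the sketch below generates one frame of a drifting sinusoidal grating in plain NumPy. This is not the Vision Egg API itself, and the gamma value is a placeholder for a measured display calibration.

    ```python
    import numpy as np

    def grating_frame(width, height, cycles_across, temporal_freq_hz, t,
                      contrast=1.0, gamma=2.2):
        """One frame of a horizontally drifting sinusoidal grating.
        Luminance is specified linearly in [0, 1] and then inverse-gamma
        corrected so it displays linearly on a calibrated monitor."""
        x = np.linspace(0.0, 1.0, width, endpoint=False)    # normalized x
        phase = 2.0 * np.pi * (cycles_across * x - temporal_freq_hz * t)
        luminance = 0.5 + 0.5 * contrast * np.sin(phase)    # linear luminance
        pixels = luminance ** (1.0 / gamma)                 # inverse gamma
        return (255.0 * np.tile(pixels, (height, 1))).astype(np.uint8)
    ```

    Calling this once per vertical retrace, with `t` advanced by the frame period, produces smooth apparent motion at the requested temporal frequency.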

    A `bright zone' in male hoverfly (Eristalis tenax) eyes and associated faster motion detection and increased contrast sensitivity

    Eyes of the hoverfly Eristalis tenax are sexually dimorphic such that males have a fronto-dorsal region of large facets. In contrast to other large flies in which large facets are associated with a decreased interommatidial angle to form a dorsal `acute zone' of increased spatial resolution, we show that a dorsal region of large facets in males appears to form a `bright zone' of increased light capture without substantially increased spatial resolution. Theoretically, more light allows for increased performance in tasks such as motion detection. To determine the effect of the bright zone on motion detection, local properties of wide-field motion detecting neurons were investigated using localized sinusoidal gratings. The pattern of local preferred directions of one class of these cells, the HS cells, in Eristalis is similar to that reported for the blowfly Calliphora. The bright zone seems to contribute to local contrast sensitivity; high contrast sensitivity exists in portions of the receptive field served by large-diameter facet lenses of males and is not observed in females. Finally, temporal frequency tuning is also significantly faster in this frontal portion of the visual field, particularly in males, where it overcompensates for the higher spatial-frequency tuning and shifts the predicted local velocity optimum to higher speeds. These results indicate that increased retinal illuminance due to the bright zone of males is used to enhance contrast sensitivity and speed motion detector responses. Additionally, local neural properties vary across the visual world in a way not expected if HS cells serve purely as matched filters to measure yaw-induced visual motion.
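    The shift in the predicted velocity optimum follows from correlator theory: for a grating, the detector responds maximally at a fixed temporal frequency, so the optimal velocity is that temporal frequency divided by the preferred spatial frequency. A small sketch with hypothetical tuning values (not measurements from the paper):

    ```python
    def velocity_optimum(tf_opt_hz, sf_cpd):
        """Predicted velocity optimum (deg/s) of a correlator-type motion
        detector: temporal frequency optimum (Hz) divided by the preferred
        spatial frequency (cycles/deg)."""
        return tf_opt_hz / sf_cpd

    # Hypothetical tuning values: a faster temporal optimum can
    # overcompensate a higher preferred spatial frequency, shifting
    # the local velocity optimum upward.
    frontal = velocity_optimum(12.0, 0.15)   # 80 deg/s
    lateral = velocity_optimum(6.0, 0.10)    # 60 deg/s
    ```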

    Active and Passive Antennal Movements during Visually Guided Steering in Flying Drosophila

    Insects use feedback from a variety of sensory modalities, including mechanoreceptors on their antennae, to stabilize the direction and speed of flight. Like all arthropod appendages, antennae not only supply sensory information but may also be actively positioned by control muscles. However, how flying insects move their antennae during active turns and how such movements might influence steering responses are currently unknown. Here we examined the antennal movements of flying Drosophila during visually induced turns in a tethered flight arena. In response to both rotational and translational patterns of visual motion, Drosophila actively moved their antennae in a direction opposite to that of the visual motion. We also observed two types of passive antennal movements: small tonic deflections of the antenna and rapid oscillations at wing beat frequency. These passive movements are likely the result of wing-induced airflow and increased in magnitude when the angular distance between the wing and the antenna decreased. In response to rotational visual motion, increases in passive antennal movements appear to trigger a reflex that reduces the stroke amplitude of the contralateral wing, thereby enhancing the visually induced turn. Although the active antennal movements significantly increased antennal oscillation by bringing the arista closer to the wings, they did not significantly affect the turning response in our head-fixed, tethered flies. These results are consistent with the hypothesis that flying Drosophila use mechanosensory feedback to detect changes in the wing-induced airflow during visually induced turns and that this feedback plays a role in regulating the magnitude of steering responses.

    Multi-camera Realtime 3D Tracking of Multiple Flying Animals

    Automated tracking of animal movement allows analyses that would not otherwise be possible by providing great quantities of data. The additional capability of tracking in realtime - with minimal latency - opens up the experimental possibility of manipulating sensory feedback, thus allowing detailed explorations of the neural basis for control of behavior. Here we describe a new system capable of tracking the position and body orientation of animals such as flies and birds. The system operates with less than 40 msec latency and can track multiple animals simultaneously. To achieve these results, a multi-target tracking algorithm was developed based on the Extended Kalman Filter and the Nearest Neighbor Standard Filter data association algorithm. In one implementation, an eleven-camera system is capable of tracking three flies simultaneously at 60 frames per second using a gigabit network of nine standard Intel Pentium 4 and Core 2 Duo computers. This manuscript presents the rationale and details of the algorithms employed and shows three implementations of the system. An experiment was performed using the tracking system to measure the effect of visual contrast on the flight speed of Drosophila melanogaster. At low contrasts, speed is more variable and faster on average than at high contrasts. Thus, the system is already a useful tool to study the neurobiology and behavior of freely flying animals. If combined with other techniques, such as `virtual reality'-type computer graphics or genetic manipulation, the tracking system would offer a powerful new way to investigate the biology of flying animals.
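    A minimal sketch of the two ingredients named above: a constant-velocity Kalman filter (the linear special case of the Extended Kalman Filter) and a greedy nearest-neighbor association step. The state layout, noise values, and single greedy pass are simplifications for illustration, not the system's actual implementation.

    ```python
    import numpy as np

    def kf_predict(x, P, F, Q):
        """Prediction step (the linear case of the EKF)."""
        return F @ x, F @ P @ F.T + Q

    def kf_update(x, P, z, H, R):
        """Measurement update with Kalman gain K."""
        S = H @ P @ H.T + R                   # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
        x = x + K @ (z - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        return x, P

    def nearest_neighbor(predicted, detections):
        """Greedy nearest-neighbor association: each track claims the
        closest unused detection (a simplification of the Nearest
        Neighbor Standard Filter)."""
        used, assignment = set(), []
        for p in predicted:
            dists = [np.inf if j in used else np.linalg.norm(p - z)
                     for j, z in enumerate(detections)]
            j = int(np.argmin(dists))
            used.add(j)
            assignment.append(j)
        return assignment
    ```

    Each camera frame, every track is predicted forward one frame interval, matched to a detection by nearest-neighbor association, and then corrected by the measurement update.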

    Visual control of flight speed in Drosophila melanogaster

    Flight control in insects depends on self-induced image motion (optic flow), which the visual system must process to generate appropriate corrective steering maneuvers. Classic experiments in tethered insects applied rigorous system identification techniques for the analysis of turning reactions in the presence of rotating pattern stimuli delivered in open-loop. However, the functional relevance of these measurements for visual free-flight control remains equivocal due to the largely unknown effects of the highly constrained experimental conditions. To perform a systems analysis of the visual flight speed response under free-flight conditions, we implemented a `one-parameter open-loop' paradigm using `TrackFly' in a wind tunnel equipped with real-time tracking and virtual reality display technology. Upwind-flying flies were stimulated with sine gratings of varying temporal and spatial frequencies, and the resulting flight speed responses were measured. To control flight speed, the visual system of the fruit fly extracts linear pattern velocity robustly over a broad range of spatio-temporal frequencies. The speed signal is used for a proportional control of flight speed within locomotor limits. The extraction of pattern velocity over a broad spatio-temporal frequency range may require more sophisticated motion processing mechanisms than those identified in flies so far. In Drosophila, the neuromotor pathways underlying flight speed control may be suitably explored by applying advanced genetic techniques, for which our data can serve as a baseline. Finally, the high-level control principles identified in the fly can be meaningfully transferred into a robotic context, such as for the robust and efficient control of autonomous flying micro air vehicles.
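    The proportional-control finding can be summarized in a few lines: retinal slip (pattern velocity relative to the fly) drives a proportional speed adjustment, saturated at locomotor limits. The gain, limits, and sign convention below are illustrative, not fitted values from the study.

    ```python
    def flight_speed_step(speed, slip_velocity, gain=0.5,
                          v_min=0.0, v_max=2.0):
        """One step of proportional flight-speed control: positive retinal
        slip (the pattern sweeping backward faster than expected) slows
        the fly proportionally; the command is clipped to locomotor
        limits. All parameter values are illustrative placeholders."""
        return min(max(speed - gain * slip_velocity, v_min), v_max)
    ```

    Because the controller acts on estimated pattern velocity rather than on contrast or temporal frequency alone, its behavior is invariant across the spatio-temporal frequencies of the gratings, matching the abstract's claim.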

    Visual Control of Altitude in Flying Drosophila

    Unlike creatures that walk, flying animals need to control their horizontal motion as well as their height above the ground. Research on insects, the first animals to evolve flight, has revealed several visual reflexes that are used to govern horizontal course. For example, insects orient toward prominent vertical features in their environment [1], [2], [3], [4] and [5] and generate compensatory reactions to both rotations [6] and [7] and translations [1], [8], [9], [10] and [11] of the visual world. Insects also avoid impending collisions by veering away from visual expansion [9], [12], [13] and [14]. In contrast to this extensive understanding of the visual reflexes that regulate horizontal course, the sensory-motor mechanisms that animals use to control altitude are poorly understood. Using a 3D virtual reality environment, we found that Drosophila utilize three reflexes—edge tracking, wide-field stabilization, and expansion avoidance—to control altitude. By implementing a dynamic visual clamp, we found that flies do not regulate altitude by maintaining a fixed value of optic flow beneath them, as suggested by a recent model [15]. The results identify a means by which insects determine their absolute height above the ground and uncover a remarkable correspondence between the sensory-motor algorithms used to regulate motion in the horizontal and vertical domains.

    Biologically Inspired Feedback Design for Drosophila Flight

    We use a biologically motivated model of Drosophila flight mechanics and sensor processing to design a feedback control scheme to regulate forward flight. The model used for insect flight is the grand unified fly (GUF) [3] simulation consisting of rigid body kinematics, aerodynamic forces and moments, sensory systems, and a 3D environment model. We seek to design a control algorithm that will convert the sensory signals into proper wing beat commands to regulate forward flight. Modulating the wing beat frequency and mean stroke angle produces changes in the flight envelope. The sensory signals consist of estimates of rotational velocity from the haltere organs and translational velocity estimates from visual elementary motion detectors (EMDs) and matched retinal velocity filters. The controller is designed based on a longitudinal model of the flight dynamics. Feedforward commands are generated based on a desired forward velocity. The dynamics are linearized around this operating point, and a feedback controller is designed to correct deviations from the operating point. The control algorithm is implemented in the GUF simulator, achieves the desired tracking of the forward reference velocities, and exhibits biologically realistic responses.
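    The linearize-then-feedback design can be sketched abstractly: deviations dv from the operating point evolve as dv' = A*dv + B*du, and the feedback du = -K*dv makes the closed loop dv' = (A - B*K)*dv stable. The numbers below are placeholders for illustration, not GUF parameters.

    ```python
    # Illustrative linearized forward-speed dynamics around an operating
    # point (dv = v - v_op, du = u - u_ff); values are placeholders.
    A, B = -0.2, 1.0      # open-loop damping and control effectiveness
    K = 1.0               # feedback gain, e.g. chosen by pole placement

    def control_deviation(dv):
        """Feedback correction applied on top of the feedforward
        command: du = -K * dv."""
        return -K * dv

    def simulate(dv0, steps=1000, dt=0.01):
        """Euler integration of the closed-loop deviation dynamics
        dv' = A*dv + B*du; a stable gain drives dv toward zero."""
        dv = dv0
        for _ in range(steps):
            dv += dt * (A * dv + B * control_deviation(dv))
        return dv
    ```

    With these values the closed-loop pole is A - B*K = -1.2, so an initial velocity error decays toward zero, which is the tracking behavior the controller is designed to achieve.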