130 research outputs found

    Vision Egg: an Open-Source Library for Realtime Visual Stimulus Generation

    Modern computer hardware makes it possible to produce visual stimuli in ways not previously possible. Arbitrary scenes, from traditional sinusoidal gratings to naturalistic 3D scenes, can now be specified on a frame-by-frame basis in realtime. We present a programming library, the Vision Egg, that aims to make it easy to take advantage of these innovations. The Vision Egg is a free, open-source library that makes use of OpenGL and is written in the high-level language Python with extensions in C. Careful attention has been paid to the issues of luminance and temporal calibration, and several techniques for interfacing with input devices such as mice, movement-tracking systems, and digital triggers are discussed. Together, these features make the Vision Egg suitable for many psychophysical, electrophysiological, and behavioral experiments. This software is available for free download at visionegg.org.
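    The frame-by-frame approach the abstract describes can be sketched in plain Python. The function below is a hypothetical illustration using NumPy, not the Vision Egg's actual API (the real library wraps OpenGL for hardware-accelerated drawing): each frame of a drifting sinusoidal grating is simply a function of time.

```python
import numpy as np

def grating_frame(t, width=640, height=480, spatial_freq=0.02,
                  temporal_freq=2.0, contrast=1.0):
    """Compute one frame of a drifting sinusoidal grating.

    spatial_freq is in cycles/pixel, temporal_freq in Hz, t in seconds.
    Returns luminance values in [0, 1], one per pixel.
    (Illustrative sketch only; not the Vision Egg API.)
    """
    x = np.arange(width)
    phase = 2 * np.pi * (spatial_freq * x - temporal_freq * t)
    row = 0.5 + 0.5 * contrast * np.sin(phase)
    # Every row is identical for a vertical grating, so tile one row.
    return np.tile(row, (height, 1))

# At a 60 Hz refresh rate, frame i would be computed at t = i / 60.0
# and uploaded to the graphics card (e.g. as an OpenGL texture).
frame = grating_frame(t=0.25)
```

    Luminance calibration, which the abstract emphasizes, would then map these linear values in [0, 1] through the display's measured gamma curve before drawing.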

    Ohio Northern Serves Community Both Home and Abroad


    Contrast sensitivity of insect motion detectors to natural images

    How do animals regulate self-movement despite large variation in the luminance contrast of the environment? Insects are capable of regulating flight speed based on the velocity of image motion, but the mechanisms for this are unclear. The Hassenstein–Reichardt correlator model and its elaborations can accurately predict responses of motion-detecting neurons under many conditions but fail to explain the apparent lack of spatial pattern and contrast dependence observed in freely flying bees and flies. To investigate this apparent discrepancy, we recorded intracellularly from horizontal-sensitive (HS) motion-detecting neurons in the hoverfly while displaying moving images of natural environments. Contrary to results obtained with grating patterns, we show these neurons encode the velocity of natural images largely independently of the particular image used, despite a threefold range of contrast. This invariance in response to natural images is observed in both strongly and minimally motion-adapted neurons but is sensitive to artificial manipulations of contrast. Current models of these cells account for some, but not all, of the observed insensitivity to image contrast. We conclude that fly visual processing may be matched to commonalities between natural scenes, enabling accurate estimates of velocity largely independent of the particular scene.
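    The Hassenstein–Reichardt correlator mentioned above has a very compact structure: each of two mirror-symmetric arms multiplies a delayed photoreceptor signal with its undelayed neighbour, and the arms are subtracted to give a direction-selective output. A minimal discrete-time sketch (function names and parameter values are illustrative, not from the paper):

```python
import numpy as np

def hassenstein_reichardt(left, right, delay=1):
    """Minimal discrete-time Hassenstein-Reichardt correlator.

    left, right: luminance time series from two neighbouring photoreceptors.
    delay: temporal delay (in samples) applied in each arm.
    Output > 0 indicates motion from left to right.
    """
    delayed_left = np.roll(left, delay)
    delayed_right = np.roll(right, delay)
    delayed_left[:delay] = 0   # discard wrapped-around samples
    delayed_right[:delay] = 0
    # Each arm correlates a delayed signal with the undelayed neighbour;
    # subtracting the mirror arm yields direction selectivity.
    return delayed_left * right - left * delayed_right

# A pattern moving left-to-right reaches the right receptor one sample later:
t = np.linspace(0, 4 * np.pi, 200)
left = np.sin(t)
right = np.roll(left, 1)   # same signal, delayed by one sample
response = hassenstein_reichardt(left, right, delay=1)
```

    Because the output is a product of two luminance signals, the raw model's response grows with the square of pattern contrast, which is exactly the kind of contrast dependence the recorded HS neurons did not show for natural images.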

    A 'bright zone' in male hoverfly (Eristalis tenax) eyes and associated faster motion detection and increased contrast sensitivity

    Eyes of the hoverfly Eristalis tenax are sexually dimorphic such that males have a fronto-dorsal region of large facets. In contrast to other large flies, in which large facets are associated with a decreased interommatidial angle to form a dorsal 'acute zone' of increased spatial resolution, we show that a dorsal region of large facets in males appears to form a 'bright zone' of increased light capture without substantially increased spatial resolution. Theoretically, more light allows for increased performance in tasks such as motion detection. To determine the effect of the bright zone on motion detection, local properties of wide-field motion-detecting neurons were investigated using localized sinusoidal gratings. The pattern of local preferred directions of one class of these cells, the HS cells, in Eristalis is similar to that reported for the blowfly Calliphora. The bright zone seems to contribute to local contrast sensitivity; high contrast sensitivity exists in portions of the receptive field served by the large-diameter facet lenses of males and is not observed in females. Finally, temporal frequency tuning is also significantly faster in this frontal portion of the world, particularly in males, where it overcompensates for the higher spatial-frequency tuning and shifts the predicted local velocity optimum to higher speeds. These results indicate that increased retinal illuminance due to the bright zone of males is used to enhance contrast sensitivity and speed motion detector responses. Additionally, local neural properties vary across the visual world in a way not expected if HS cells serve purely as matched filters to measure yaw-induced visual motion.
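    The 'bright zone' argument rests on simple optics: the photon catch of an ommatidium grows with the area of its facet lens, so modestly larger facets capture substantially more light. A back-of-the-envelope sketch (the facet diameters below are illustrative values, not measurements from the paper):

```python
import math

def relative_light_capture(facet_diameter_um):
    """Photon catch of an ommatidium scales with lens area (~ D^2),
    all else being equal. This is a simplification of full eye-optics
    sensitivity models, used only to illustrate the scaling."""
    return math.pi * (facet_diameter_um / 2) ** 2

# Hypothetical diameters: a large 'bright zone' facet versus a
# typical facet elsewhere in the eye.
bright = relative_light_capture(32.0)
typical = relative_light_capture(20.0)
ratio = bright / typical   # (32/20)^2 = 2.56-fold more light
```

    With the interommatidial angle roughly unchanged, as the abstract reports, this extra light can be spent on contrast sensitivity and faster temporal tuning rather than on spatial resolution.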

    Active and Passive Antennal Movements during Visually Guided Steering in Flying Drosophila

    Insects use feedback from a variety of sensory modalities, including mechanoreceptors on their antennae, to stabilize the direction and speed of flight. Like all arthropod appendages, antennae not only supply sensory information but may also be actively positioned by control muscles. However, how flying insects move their antennae during active turns and how such movements might influence steering responses are currently unknown. Here we examined the antennal movements of flying Drosophila during visually induced turns in a tethered flight arena. In response to both rotational and translational patterns of visual motion, Drosophila actively moved their antennae in a direction opposite to that of the visual motion. We also observed two types of passive antennal movements: small tonic deflections of the antenna and rapid oscillations at wing beat frequency. These passive movements are likely the result of wing-induced airflow and increased in magnitude when the angular distance between the wing and the antenna decreased. In response to rotational visual motion, increases in passive antennal movements appear to trigger a reflex that reduces the stroke amplitude of the contralateral wing, thereby enhancing the visually induced turn. Although the active antennal movements significantly increased antennal oscillation by bringing the arista closer to the wings, they did not significantly affect the turning response in our head-fixed, tethered flies. These results are consistent with the hypothesis that flying Drosophila use mechanosensory feedback to detect changes in the wing-induced airflow during visually induced turns and that this feedback plays a role in regulating the magnitude of steering responses.

    Multi-camera Realtime 3D Tracking of Multiple Flying Animals

    Automated tracking of animal movement allows analyses that would not otherwise be possible by providing great quantities of data. The additional capability of tracking in realtime - with minimal latency - opens up the experimental possibility of manipulating sensory feedback, thus allowing detailed explorations of the neural basis for control of behavior. Here we describe a new system capable of tracking the position and body orientation of animals such as flies and birds. The system operates with less than 40 msec latency and can track multiple animals simultaneously. To achieve these results, a multi-target tracking algorithm was developed based on the Extended Kalman Filter and the Nearest Neighbor Standard Filter data association algorithm. In one implementation, an eleven-camera system is capable of tracking three flies simultaneously at 60 frames per second using a gigabit network of nine standard Intel Pentium 4 and Core 2 Duo computers. This manuscript presents the rationale and details of the algorithms employed and shows three implementations of the system. An experiment was performed using the tracking system to measure the effect of visual contrast on the flight speed of Drosophila melanogaster. At low contrasts, speed is more variable and faster on average than at high contrasts. Thus, the system is already a useful tool to study the neurobiology and behavior of freely flying animals. If combined with other techniques, such as 'virtual reality'-type computer graphics or genetic manipulation, the tracking system would offer a powerful new way to investigate the biology of flying animals.
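    The tracking core described above combines per-target Kalman filtering with data association. A simplified single-frame sketch is given below: with a linear constant-velocity motion model the Extended Kalman Filter update reduces to a standard Kalman filter, and the greedy nearest-neighbour assignment is a stand-in for the Nearest Neighbor Standard Filter. All matrices, noise values, and positions are assumed for illustration, not taken from the paper.

```python
import numpy as np

DT = 1.0 / 60.0  # frame interval at 60 fps

# Constant-velocity state [x, y, vx, vy]; cameras observe position only.
F = np.eye(4)
F[0, 2] = F[1, 3] = DT                          # state transition
H = np.array([[1.0, 0, 0, 0], [0, 1.0, 0, 0]])  # measurement model
Q = 1e-4 * np.eye(4)                            # process noise (assumed)
R = 1e-2 * np.eye(2)                            # measurement noise (assumed)

def predict(x, P):
    """Propagate state and covariance one frame forward."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z):
    """Standard Kalman measurement update with observation z."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

def nearest_neighbor(tracks, detections):
    """Greedy nearest-neighbour association: each track claims the
    closest unclaimed detection (simplified stand-in for the NNSF)."""
    assignments, taken = {}, set()
    for i, (x, _) in enumerate(tracks):
        dists = [(np.linalg.norm(H @ x - z), j)
                 for j, z in enumerate(detections) if j not in taken]
        if dists:
            _, j = min(dists)
            assignments[i] = j
            taken.add(j)
    return assignments

# Two tracks and two detections for one frame: predict, associate, update.
tracks = [(np.array([0.0, 0.0, 1.0, 0.0]), np.eye(4)),
          (np.array([5.0, 5.0, 0.0, 1.0]), np.eye(4))]
tracks = [predict(x, P) for x, P in tracks]
detections = [np.array([5.0, 5.0 + DT]), np.array([DT, 0.0])]
assign = nearest_neighbor(tracks, detections)
tracks = [update(x, P, detections[assign[i]])
          for i, (x, P) in enumerate(tracks)]
```

    In the real system this loop would run per camera frame across the gigabit network, with the 3D positions triangulated from the eleven camera views before the filter update.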