
    The mapping of visual space by identified large second-order neurons in the dragonfly median ocellus

    In adult dragonflies, the compound eyes are augmented by three simple eyes known as the dorsal ocelli. The outputs of ocellar photoreceptors converge on relatively few second-order neurons with large axonal diameters (L-neurons). We determine L-neuron morphology by iontophoretic dye injection combined with three-dimensional reconstructions. Using intracellular recording and white noise analysis, we also determine the physiological receptive fields of the L-neurons, in order to identify the extent to which they preserve spatial information. We find a total of 11 median ocellar L-neurons, consisting of five symmetrical pairs and one unpaired neuron. L-neurons are distinguishable by the extent and location of their terminations within the ocellar plexus and brain. In the horizontal dimension, L-neurons project to different regions of the ocellar plexus, in close correlation with their receptive fields. In the vertical dimension, dendritic arborizations overlap widely, paralleled by receptive fields that are narrow and do not differ between neurons. These results provide the first evidence for the preservation of spatial information by the second-order neurons of any dorsal ocellus. The system essentially forms a one-dimensional image of the equator over a wide azimuthal area, possibly forming an internal representation of the horizon. Potential behavioural roles for the system are discussed.
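
    As a rough illustration of the white-noise (reverse-correlation) approach mentioned above, the sketch below estimates a one-dimensional linear receptive field by cross-correlating a white-noise stimulus with a synthetic graded response. All sizes, names, and the simulated "neuron" are illustrative assumptions, not values from the study.

```python
# Minimal sketch: estimating a spatial receptive field from a white-noise
# stimulus by cross-correlation (first-order Wiener kernel). Everything
# here is synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(0)

n_frames = 20000          # stimulus frames
n_pixels = 40             # 1-D stimulus strip (e.g. azimuthal positions)

# White-noise stimulus: zero-mean contrast at each position and frame.
stimulus = rng.standard_normal((n_frames, n_pixels))

# Hypothetical linear "L-neuron": responds most strongly near one azimuth.
true_rf = np.exp(-0.5 * ((np.arange(n_pixels) - 12) / 3.0) ** 2)
response = stimulus @ true_rf + 0.5 * rng.standard_normal(n_frames)

# Reverse correlation: for a white stimulus, the stimulus-response
# cross-correlation recovers the linear receptive field (up to scale).
estimated_rf = stimulus.T @ response / n_frames

print("peak of estimated receptive field at pixel", estimated_rf.argmax())
```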

    Grasshopper DCMD: an undergraduate electrophysiology lab for investigating single-unit responses to behaviorally-relevant stimuli

    Author Posting. © Faculty for Undergraduate Neuroscience, 2017. This article is posted here by permission of Faculty for Undergraduate Neuroscience for personal use, not for redistribution. The definitive version was published in Journal of Undergraduate Neuroscience Education 15 (2017): A162-A173. Avoiding capture by a fast-approaching predator is an important survival skill shared by many animals. Investigating the neural circuits that give rise to this escape behavior can provide a tractable demonstration of systems-level neuroscience research for undergraduate laboratories. In this paper, we describe three related hands-on exercises using the grasshopper and affordable technology to bring neurophysiology, neuroethology, and neural computation to life and enhance student understanding and interest. We simplified a looming stimulus procedure using the Backyard Brains SpikerBox bioamplifier, an open-source and low-cost electrophysiology rig, to extracellularly record activity of the descending contralateral movement detector (DCMD) neuron from the grasshopper’s neck. The DCMD activity underlies the grasshopper's motor responses to looming monocular visual cues and can easily be recorded and analyzed in an open-source iOS oscilloscope app, Spike Recorder. Visual stimuli are presented to the grasshopper by this same mobile application, allowing synchronized recording of stimuli and neural activity. An in-app spike-sorting algorithm is described that gives students a quick way to record, sort, and analyze their data at the bench. We also describe a way for students to export these data to other analysis tools. With the protocol described, students will be able to prepare the grasshopper, find and record from the DCMD neuron, and visualize the DCMD responses to quantitatively investigate the escape system by adjusting the speed and size of simulated approaching objects. We describe the results from 22 grasshoppers, in which 50 of the 57 recording sessions (87.7%) had a reliable DCMD response. Finally, we field-tested our experiment in an undergraduate neuroscience laboratory and found that a majority of students (67%) could perform this exercise in one two-hour lab session and reported increased interest in studying the neural systems that drive behavior. Funding for this project was provided by the National Institute of Mental Health Small Business Innovation Research grant #2R44MH093334: “Backyard Brains: Bringing Neurophysiology into Secondary Schools.”
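
    For readers setting up such a looming protocol, the sketch below works through the underlying geometry: the angular size an approaching object subtends at the eye as a function of time to collision, and the size/speed ratio l/|v| commonly used to parameterize looming stimuli. The parameter values are illustrative assumptions, not the Spike Recorder app's defaults.

```python
# Minimal sketch of the geometry behind a looming ("approaching object")
# stimulus. Parameter values are illustrative assumptions.
import numpy as np

half_size = 0.03      # half-width of the simulated object (m)
speed = 1.0           # approach speed (m/s)

# Time before collision (s); the object "hits" the eye at t = 0.
t = np.linspace(-2.0, -0.01, 500)
distance = -t * speed

# Full angular size subtended at the eye, in degrees.
theta = 2 * np.degrees(np.arctan(half_size / distance))

# Angular expansion rate (deg/s); DCMD firing is often characterized
# against this quantity and the size/speed ratio l/|v|.
theta_rate = np.gradient(theta, t)

print(f"l/|v| = {half_size / speed * 1000:.0f} ms")
print(f"angular size about 100 ms before collision: "
      f"{theta[np.searchsorted(t, -0.1)]:.1f} deg")
print(f"peak angular expansion rate: {theta_rate.max():.0f} deg/s")
```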

    The Killer Fly Hunger Games: Target Size and Speed Predict Decision to Pursuit.

    Predatory animals have evolved to optimally detect their prey using exquisite sensory systems such as vision, olfaction and hearing. It may not be so surprising that vertebrates, with large central nervous systems, excel at predatory behaviors. More striking is the fact that many tiny insects, with their minuscule brains and scaled-down nerve cords, are also ferocious, highly successful predators. For predation, it is important to determine whether a prey is suitable before initiating pursuit. This is paramount since pursuing a prey that is too large to capture, subdue or dispatch will generate a substantial metabolic cost (in the form of muscle output) without any chance of metabolic gain (in the form of food). In addition, during all pursuits, the predator breaks its potential camouflage and thus runs the risk of becoming prey itself. Many insects use their eyes to initially detect and subsequently pursue prey. Dragonflies, which are extremely efficient predators, therefore have huge eyes with relatively high spatial resolution that allow efficient prey size estimation before initiating pursuit. However, much smaller insects, such as killer flies, also visualize and successfully pursue prey. This is an impressive behavior since the small size of the killer fly naturally limits the neural capacity and also the spatial resolution provided by the compound eye. Despite this, here we show that killer flies efficiently pursue natural (Drosophila melanogaster) and artificial (beads) prey. The natural pursuits are initiated at a distance of 7.9 ± 2.9 cm, which we show is too far away to allow for distance estimation using binocular disparities. Moreover, we show that rather than estimating absolute prey size prior to launching the attack, as dragonflies do, killer flies attack with high probability when the ratio of the prey's subtended retinal velocity to its retinal size is 0.37. We also show that killer flies will respond to a stimulus of an angular size that is smaller than the photoreceptor acceptance angle, and that the predatory response is strongly modulated by the metabolic state. Our data thus provide an exciting example of a matched filter loosely tuned to Drosophila, but one that will still generate successful pursuits of other suitable prey. This work was funded by the Air Force Office of Scientific Research (FA9550-10-0472 to Prof. Robert Olberg), an Isaac Newton Trust / Wellcome Trust ISSF / University of Cambridge Joint Research Grant to Gonzalez-Bellido, BBSRC funding to Trevor Wardill, the Swedish Research Council (2012-4740) to Nordström, and a Shared Equipment Grant from the School of Biological Sciences (U. of Cambridge). This is the final version of the article. It first appeared from Karger via http://dx.doi.org/10.1159/00043594
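
    To make the attack criterion concrete, the sketch below shows how a prey item's retinal (angular) size and angular velocity follow from its physical size, distance, and crossing speed. The specific numbers are illustrative assumptions (a roughly Drosophila-sized target within the reported initiation range), not measurements from the paper.

```python
# Minimal sketch: retinal size and retinal velocity of a small target,
# as seen from the predator's viewpoint. Numbers are illustrative.
import numpy as np

prey_diameter = 0.003   # m, roughly a fruit fly (assumed)
distance = 0.08         # m, within the reported ~7.9 cm initiation range
crossing_speed = 0.3    # m/s, transverse to the line of sight (assumed)

# Retinal size: full angle subtended by the prey (degrees).
angular_size = 2 * np.degrees(np.arctan(prey_diameter / 2 / distance))

# Retinal velocity for motion perpendicular to the line of sight (deg/s).
angular_velocity = np.degrees(crossing_speed / distance)

print(f"angular size    : {angular_size:.2f} deg")
print(f"angular velocity: {angular_velocity:.1f} deg/s")
print(f"velocity / size : {angular_velocity / angular_size:.1f}")
```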

    Design, construction, and testing of a flying prey simulator

    The goal of this research project is to investigate the neuronal control of flying prey interception in dragonflies …

    A lightweight, inexpensive robotic system for insect vision

    Designing hardware for miniaturized robots that mimic the capabilities of flying insects is of interest because the two share similar constraints (i.e. small size, low weight, and low energy consumption). Research in this area aims to enable robots with similarly efficient flight and cognitive abilities. Visual processing is important to flying insects' impressive flight capabilities, but currently, embodiment of insect-like visual systems is limited by the hardware systems available. Suitable hardware is either prohibitively expensive, difficult to reproduce, unable to accurately simulate insect vision characteristics, and/or too heavy for small robotic platforms. These limitations hamper the development of platforms for embodiment, which in turn slows progress in understanding how biological systems fundamentally work. To address this gap, this paper proposes an inexpensive, lightweight robotic system for modelling insect vision. The system is mounted and tested on a robotic platform for mobile applications, and then the camera and insect vision models are evaluated. We analyse the potential of the system for embodying higher-level visual processes (e.g. motion detection) and for developing vision-based navigation for robotics in general. Optic flow computed from sample camera data is compared to a perfect, simulated bee world, showing an excellent resemblance.
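
    The optic-flow comparison described above can be prototyped with a standard dense flow estimator; the minimal sketch below uses OpenCV's Farnebäck method on two synthetic frames with a known shift. The paper's camera model and any insect-eye remapping are not reproduced here, and the frames are invented for illustration.

```python
# Minimal sketch: dense optic flow between two synthetic frames
# (a noise pattern shifted by a known amount) using OpenCV.
import numpy as np
import cv2

rng = np.random.default_rng(0)
frame1 = (rng.random((240, 320)) * 255).astype(np.uint8)
frame1 = cv2.GaussianBlur(frame1, (7, 7), 0)

# Second frame: the same pattern translated 3 px right, 1 px down.
frame2 = np.roll(np.roll(frame1, 3, axis=1), 1, axis=0)

flow = cv2.calcOpticalFlowFarneback(
    frame1, frame2, None,
    pyr_scale=0.5, levels=3, winsize=15,
    iterations=3, poly_n=5, poly_sigma=1.2, flags=0)

# Mean estimated displacement should be close to the (3, 1) px ground truth.
print("mean flow (x, y):", flow[..., 0].mean(), flow[..., 1].mean())
```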

    Binocular Encoding in the Damselfly Pre-motor Target Tracking System.

    Like all damselflies, Calopteryx (family Calopterygidae), commonly known as jewel wings or demoiselles, possess dichoptic (separated) eyes with overlapping visual fields of view. In contrast, many dragonfly species possess holoptic (dorsally fused) eyes with limited binocular overlap. Here we compare the neuronal correlates of target tracking between damselfly and dragonfly sister lineages and link these differences in visual overlap to pre-motor neural adaptations. Although dragonflies attack prey dorsally, we show that demoiselles attack prey frontally. We identify demoiselle target-selective descending neurons (TSDNs) with matching frontal visual receptive fields, anatomically and functionally homologous to the dorsally positioned dragonfly TSDNs. By manipulating visual input using eyepatches and prisms, we show that moving-target information at the pre-motor level depends on binocular summation in demoiselles. Consequently, demoiselles encode directional information in a binocularly fused frame of reference, such that information about a target moving toward the midline in the left eye is fused with information about the target moving away from the midline in the right eye. This contrasts with dragonfly TSDNs, where receptive fields possess a sharp midline boundary, confining responses to a single visual hemifield in a sagittal frame of reference (i.e., relative to the midline). Our results indicate that, although TSDNs are conserved across Odonata, their neural inputs, and thus the upstream organization of the target tracking system, differ significantly and match divergence in eye design and predatory strategies.
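
    As a toy illustration of the contrast drawn above (binocular summation versus hemifield-confined responses), the sketch below combines two assumed monocular sensitivity profiles in the two ways described. It is not the paper's model; all profiles and numbers are invented for illustration.

```python
# Toy illustration: a "demoiselle-like" unit sums signals from both eyes,
# whereas a "dragonfly-like" unit responds only within one hemifield.
import numpy as np

azimuths = np.linspace(-90, 90, 181)   # deg; negative = left of midline
target_direction = +1                   # target drifting rightward

# Assumed monocular sensitivity profiles, one per eye (illustrative Gaussians).
left_eye = np.exp(-0.5 * ((azimuths + 20) / 25) ** 2)
right_eye = np.exp(-0.5 * ((azimuths - 20) / 25) ** 2)

# Binocular summation: both eyes contribute wherever the target is seen,
# so the fused response has no discontinuity at the midline.
demoiselle_like = target_direction * (left_eye + right_eye)

# Hemifield-confined unit: responds only right of the midline.
dragonfly_like = target_direction * np.where(azimuths > 0, right_eye, 0.0)

print("response at midline (fused)     :", demoiselle_like[90].round(2))
print("response at midline (hemifield) :", dragonfly_like[90].round(2))
```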

    The Herschel-Heterodyne Instrument for the Far-Infrared (HIFI): instrument and pre-launch testing

    This paper describes the Heterodyne Instrument for the Far-Infrared (HIFI), to be launched on board ESA's Herschel Space Observatory by 2008. It includes the first results from the instrument-level tests. The instrument is designed to be electronically tuneable over a wide and continuous frequency range in the far infrared, with velocity resolutions better than 0.1 km/s and high sensitivity. This will enable detailed investigations of a wide variety of astronomical sources, ranging from solar system objects and star formation regions to the nuclei of galaxies. The instrument comprises 5 frequency bands covering 480-1150 GHz with SIS mixers and a sixth dual-frequency band, for the 1410-1910 GHz range, with Hot Electron Bolometer (HEB) mixers. The Local Oscillator (LO) subsystem consists of a dedicated Ka-band synthesizer followed by 7 × 2 chains of frequency multipliers, 2 chains for each frequency band. A pair of auto-correlators and a pair of acousto-optic spectrometers process the two IF signals from the dual-polarization front-ends to provide instantaneous frequency coverage of 4 GHz, with a set of resolutions (140 kHz to 1 MHz) corresponding to better than 0.1 km/s. After a successful qualification program, the flight instrument was delivered and entered the testing phase at satellite level. We also report on the pre-flight test and calibration results, together with the expected in-flight performance.
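
    The quoted channel widths translate into velocity resolution through Δv = c·Δν/ν; the short sketch below checks a few combinations at the band edges given in the abstract. The pairing of resolution mode to observing band is only illustrative.

```python
# Quick check of the velocity resolutions quoted above: dv = c * dnu / nu.
C_KM_S = 299_792.458  # speed of light, km/s

def velocity_resolution(delta_nu_hz, nu_hz):
    """Velocity resolution (km/s) for channel width delta_nu at frequency nu."""
    return C_KM_S * delta_nu_hz / nu_hz

# 140 kHz channels at the band edges (480 GHz and 1910 GHz):
print(f"{velocity_resolution(140e3, 480e9):.3f} km/s at 480 GHz")
print(f"{velocity_resolution(140e3, 1910e9):.3f} km/s at 1910 GHz")

# 1 MHz channels at 480 GHz (coarsest mode at the lowest frequency):
print(f"{velocity_resolution(1e6, 480e9):.3f} km/s at 480 GHz")
```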

    Wireless neural/EMG telemetry systems for small freely moving animals

    We have developed a miniature telemetry system that captures neural, EMG, and acceleration signals from a freely moving insect and transmits the data wirelessly to a remote digital receiver. The system is based on a custom low-power integrated circuit that amplifies and digitizes four biopotential signals as well as three acceleration signals from an off-chip MEMS accelerometer, and transmits this information over a wireless 920-MHz telemetry link. The unit weighs 0.79 g and runs for two hours on two small batteries. We have used this system to monitor neural and EMG signals in jumping and flying locusts.
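
    A back-of-the-envelope sketch of the radio throughput such a telemetry unit needs is given below. The channel counts come from the abstract, but the sample rates and bit depth are assumptions made purely for illustration, not the chip's actual specifications.

```python
# Back-of-the-envelope sketch of required radio throughput.
biopotential_channels = 4       # neural/EMG inputs (from the abstract)
accel_channels = 3              # MEMS accelerometer axes (from the abstract)
biopotential_rate = 10_000      # samples/s per channel (assumed)
accel_rate = 1_000              # samples/s per axis (assumed)
bits_per_sample = 10            # ADC resolution (assumed)

payload_bps = (biopotential_channels * biopotential_rate
               + accel_channels * accel_rate) * bits_per_sample
print(f"raw payload: {payload_bps / 1e3:.0f} kbit/s before framing/overhead")
```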