207 research outputs found

    Local motion adaptation enhances the representation of spatial structure at EMD arrays

    Get PDF
    Li J, Lindemann JP, Egelhaaf M. Local motion adaptation enhances the representation of spatial structure at EMD arrays. PLOS Computational Biology. 2017;13(12): e1005919.
    Neuronal representation and extraction of spatial information are essential for behavioral control. For flying insects, a plausible way to gain spatial information is to exploit distance-dependent optic flow that is generated during translational self-motion. Optic flow is computed by arrays of local motion detectors retinotopically arranged in the second neuropile layer of the insect visual system. These motion detectors have adaptive response characteristics, i.e. their responses to motion with a constant or only slowly changing velocity decrease, while their sensitivity to rapid velocity changes is maintained or even increases. We analyzed by a modeling approach how motion adaptation affects signal representation at the output of arrays of motion detectors during simulated flight in artificial and natural 3D environments. We focused on translational flight, because spatial information is only contained in the optic flow induced by translational locomotion. Indeed, flies, bees and other insects segregate their flight into relatively long intersaccadic translational flight sections interspersed with brief and rapid saccadic turns, presumably to maximize periods of translation (80% of the flight). With a novel adaptive model of the insect visual motion pathway we could show that the motion detector responses to background structures of cluttered environments are largely attenuated as a consequence of motion adaptation, while responses to foreground objects stay constant or even increase. This conclusion even holds under the dynamic flight conditions of insects.
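    The attenuation described above can be illustrated with a minimal sketch of divisive motion adaptation: a slow estimate of recent detector activity scales down sustained responses while transients pass through. The time constant, floor value, and response units are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def adaptive_emd_response(raw, tau=50.0, eps=0.1):
    """Attenuate sustained motion responses while preserving transients.

    raw : 1D array of rectified EMD outputs over time (arbitrary units).
    tau : adaptation time constant in samples (assumed value).
    eps : floor preventing division by zero.
    """
    adapted = np.empty_like(raw, dtype=float)
    state = 0.0  # slow, leaky estimate of recent activity
    for t, r in enumerate(raw):
        adapted[t] = r / (eps + state)   # divisive gain control
        state += (abs(r) - state) / tau  # leaky integration of activity
    return adapted

# Constant-velocity motion (constant raw response) is progressively attenuated.
sustained = np.ones(200)
out = adaptive_emd_response(sustained)
# out starts high and decays as the adaptation state builds up
```

    A nearby object that suddenly changes the local image velocity resets the ratio of input to adaptation state, which is why foreground responses survive the attenuation.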

    Depth information in natural environments derived from optic flow by insect motion detection system: a model analysis

    Get PDF
    Knowing the depth structure of the environment is crucial for moving animals in many behavioral contexts, such as collision avoidance, targeting objects, or spatial navigation. An important source of depth information is motion parallax. This powerful cue is generated on the eyes during translatory self-motion, with the retinal images of nearby objects moving faster than those of distant ones. To investigate how the visual motion pathway represents motion-based depth information we analyzed its responses to image sequences recorded in natural cluttered environments with a wide range of depth structures. The analysis was done on the basis of an experimentally validated model of the visual motion pathway of insects, with its core elements being correlation-type elementary motion detectors (EMDs). It is the key result of our analysis that the absolute EMD responses, i.e. the motion energy profile, represent the contrast-weighted nearness of environmental structures during translatory self-motion at a roughly constant velocity. In other words, the output of the EMD array highlights contours of nearby objects. This conclusion is largely independent of the scale over which EMDs are spatially pooled and was corroborated by scrutinizing the motion energy profile after eliminating the depth structure from the natural image sequences. Hence, the well-established dependence of correlation-type EMDs on both velocity and textural properties of motion stimuli appears to be advantageous for representing behaviorally relevant information about the environment in a computationally parsimonious way.
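    The correlation-type EMD at the core of this model is the classical Reichardt detector: each input is delayed and correlated with the neighbouring undelayed input, and the two mirror-symmetric products are subtracted. A minimal sketch, using a simple sample delay in place of a low-pass filter (an assumed simplification):

```python
import numpy as np

def reichardt_emd(left, right, delay=3):
    """Correlation-type elementary motion detector (Reichardt sketch).

    left, right : luminance time series from two neighbouring photoreceptors.
    delay       : temporal delay in samples, standing in for a low-pass filter.
    Output is positive for preferred-direction (left-to-right) motion.
    """
    d_left = np.roll(left, delay)    # delayed left input
    d_right = np.roll(right, delay)  # delayed right input
    d_left[:delay] = 0.0             # discard wrapped-around samples
    d_right[:delay] = 0.0
    return d_left * right - d_right * left  # mirror-symmetric subtraction

# A drifting grating reaches the right receptor `delay` samples after the left one.
t = np.arange(100)
left = np.sin(2 * np.pi * t / 20)
right = np.roll(left, 3)  # same signal arriving later: left-to-right motion
response = reichardt_emd(left, right, delay=3)
```

    Because the products depend on local contrast as well as velocity, the time-averaged output is contrast-weighted, which is exactly the property the abstract argues is advantageous for nearness encoding.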

    Motion as a source of environmental information: a fresh view on biological motion computation by insect brains

    Get PDF
    Egelhaaf M, Kern R, Lindemann JP. Motion as a source of environmental information: a fresh view on biological motion computation by insect brains. Frontiers in Neural Circuits. 2014;8:127.
    Despite their miniature brains insects, such as flies, bees and wasps, are able to navigate by highly aerobatic flight maneuvers in cluttered environments. They rely on spatial information that is contained in the retinal motion patterns induced on the eyes while moving around (“optic flow”) to accomplish their extraordinary performance. Thereby, they employ an active flight and gaze strategy that separates rapid saccade-like turns from translatory flight phases where the gaze direction is kept largely constant. This behavioral strategy facilitates the processing of environmental information, because information about the distance of the animal to objects in the environment is only contained in the optic flow generated by translatory motion. However, motion detectors of the kind widespread in biological systems do not veridically represent the velocity of the optic flow vectors, but also reflect textural information about the environment. This characteristic has often been regarded as a limitation of a biological motion detection mechanism. In contrast, we conclude from analyses challenging insect movement detectors with image flow as generated during translatory locomotion through cluttered natural environments that this mechanism represents the contours of nearby objects. Contrast borders are a main carrier of functionally relevant object information in artificial and natural sceneries. The motion detection system thus segregates in a computationally parsimonious way the environment into behaviorally relevant nearby objects and—in many behavioral contexts—less relevant distant structures. Hence, by making use of an active flight and gaze strategy, insects are capable of performing extraordinarily well even with a computationally simple motion detection mechanism.

    A neurobiological and computational analysis of target discrimination in visual clutter by the insect visual system.

    Get PDF
    Some insects have the capability to detect and track small moving objects, often against cluttered moving backgrounds. Determining how this task is performed is an intriguing challenge, both from a physiological and computational perspective. Previous research has characterized higher-order neurons within the fly brain known as 'small target motion detectors' (STMDs) that respond selectively to targets, even within complex moving surrounds. Interestingly, these cells still respond robustly when the velocity of the target is matched to the velocity of the background (i.e. with no relative motion cues). We performed intracellular recordings from intermediate-order neurons in the fly visual system (the medulla). These full-wave rectifying, transient cells (RTCs) reveal independent adaptation to luminance changes of opposite signs (suggesting separate 'on' and 'off' channels) and fast adaptive temporal mechanisms (as seen in some previously described cell types). We show, via electrophysiological experiments, that the RTC is temporally responsive to rapidly changing stimuli and is well suited to serving an important function in a proposed target-detecting pathway. To model this target discrimination, we use high dynamic range (HDR) natural images to represent 'real-world' luminance values that serve as inputs to a biomimetic representation of photoreceptor processing. Adaptive spatiotemporal high-pass filtering (1st-order interneurons) shapes the transient 'edge-like' responses, useful for feature discrimination. Following this, a model for the RTC implements a nonlinear facilitation between the rapidly adapting, and independent polarity contrast channels, each with centre-surround antagonism. The recombination of the channels results in increased discrimination of small targets, of approximately the size of a single pixel, without the need for relative motion cues.
This method of feature discrimination contrasts with traditional target and background motion-field computations. We show that our RTC-based target detection model is well matched to properties described for the higher-order STMD neurons, such as contrast sensitivity, height tuning and velocity tuning. The model output shows that the spatiotemporal profile of small targets is sufficiently rare within natural scene imagery to allow our highly nonlinear 'matched filter' to successfully detect many targets from the background. The model produces robust target discrimination across a biologically plausible range of target sizes and a range of velocities. We show that the model for small target motion detection is highly correlated to the velocity of the stimulus but not other background statistics, such as local brightness or local contrast, which normally influence target detection tasks. From an engineering perspective, we examine model elaborations for improved target discrimination via inhibitory interactions from correlation-type motion detectors, using a form of antagonism between our feature correlator and the more typical motion correlator. We also observe that a changing optimal threshold is highly correlated to the value of observer ego-motion. We present an elaborated target detection model that allows for implementation of a static optimal threshold, by scaling the target discrimination mechanism with a model-derived velocity estimation of ego-motion. Finally, we investigate the physiological relevance of this target discrimination model. We show that via very subtle image manipulation of the visual stimulus, our model accurately predicts dramatic changes in observed electrophysiological responses from STMD neurons.
Thesis (Ph.D.) - University of Adelaide, School of Molecular and Biomedical Science, 200
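    The core RTC idea — independent, rectified ON and OFF contrast channels whose coincidence is facilitated multiplicatively — can be sketched very coarsely. The exact channel dynamics, the centre-surround stage, and the delayed facilitation are simplified away here; the one-pixel spatial offset and the multiplicative pairing are illustrative assumptions, not the thesis model.

```python
import numpy as np

def rtc_facilitation(frame_prev, frame_curr, gain=1.0):
    """Very coarse sketch of ON/OFF facilitation for small dark targets.

    A moving dark target produces an OFF signal at its leading edge and an
    ON signal one pixel behind at its trailing edge; multiplying the two
    channels passes that paired signature and rejects lone edges.
    """
    diff = frame_curr.astype(float) - frame_prev.astype(float)
    on = np.maximum(diff, 0.0)    # luminance increments (trailing edge)
    off = np.maximum(-diff, 0.0)  # luminance decrements (leading edge)
    # One-pixel offset stands in for the model's delayed facilitation.
    return gain * off * np.roll(on, 1, axis=-1)

# A single-pixel dark target moving one pixel on a uniform bright background:
prev = np.full((1, 10), 100.0); prev[0, 3] = 10.0
curr = np.full((1, 10), 100.0); curr[0, 4] = 10.0
resp = rtc_facilitation(prev, curr)
# the response is confined to the target's new location; an isolated edge,
# which drives only one channel, yields zero after the multiplication
```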

    Information processing in visual systems

    No full text
    One of the goals of neuroscience is to understand how animals perceive sensory information. This thesis focuses on visual systems, to unravel how neuronal structures process aspects of the visual environment. To characterise the receptive field of a neuron, we developed spike-triggered independent component analysis. Alongside characterising the receptive field of a neuron, this method provides an insight into its underlying network structure. When applied to recordings from the H1 neuron of blowflies, it accurately recovered the sub-structure of the neuron. This sub-structure was studied further by recording H1's response to plaid stimuli. Based on the response, H1 can be classified as a component cell. We then fitted an anatomically inspired model to the response, and found the critical component to explain H1's response to be a sigmoid non-linearity at the output of elementary movement detectors. The simpler blowfly visual system can help us understand elementary sensory information processing mechanisms. How does the more complex mammalian cortex implement these principles in its network? To study this, we used multi-electrode arrays to characterise the receptive field properties of neurons in the visual cortex of anaesthetised mice. Based on these recordings, we estimated the cortical limits on the performance of a visual task; the behavioural performance observed by Prusky and Douglas (2004) is within these limits. Our recordings were carried out in anaesthetised animals. During anaesthesia, cortical UP states are considered "fragments of wakefulness", and from simultaneous whole-cell and extracellular recordings we found these states to be revealed in the phase of local field potentials. This finding was used to develop a method of detecting cortical state based on extracellular recordings, which allows us to explore information processing during different cortical states.
Across this thesis, we have developed, tested and applied methods that help improve our understanding of information processing in visual systems.
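    Spike-triggered methods like the one described start from the same first step: collecting the stimulus history that preceded each spike. A minimal sketch of that step, with illustrative names and dimensions; the spike-triggered average is the mean over rows, and running ICA on the rows (e.g. scikit-learn's FastICA) would correspond to the spike-triggered ICA developed in the thesis.

```python
import numpy as np

def spike_triggered_ensemble(stimulus, spike_times, window=10):
    """Collect the stimulus segments preceding each spike.

    stimulus    : (T, n_pixels) array of stimulus frames over time.
    spike_times : frame indices at which spikes occurred.
    window      : number of frames preceding each spike to keep.
    Returns an (n_spikes, window * n_pixels) matrix of flattened histories.
    """
    segments = [stimulus[t - window:t].ravel()
                for t in spike_times if t >= window]  # skip early spikes
    return np.asarray(segments)

rng = np.random.default_rng(0)
stim = rng.standard_normal((500, 8))  # 500 frames of an 8-pixel stimulus
spikes = [50, 120, 300, 499]          # hypothetical spike times
ensemble = spike_triggered_ensemble(stim, spikes, window=10)
# each row holds the 10 frames (10 * 8 = 80 values) preceding one spike
```

    The design choice of flattening time and space into one vector per spike is what lets a standard linear decomposition (PCA, ICA) operate directly on the ensemble.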

    Bio-Inspired Optic Flow Sensors for Artificial Compound Eyes.

    Full text link
    Compound eyes in flying insects have been studied to reveal the vision-based flight mechanisms of the smallest flying creatures in nature. In particular, researchers in robotics have made efforts to transfer these findings into their less than palm-sized unmanned air vehicles, micro-air-vehicles (MAVs). The miniaturized artificial compound eye is one of the key components in this system to provide visual information for navigation. Multi-directional sensing and motion estimation capabilities can give wide field-of-view (FoV) optic flow covering up to a full 360° panorama. By deciphering the wide-FoV optic flow, relevant information on the self-status of flight is parsed and utilized for flight command generation. In this work, we realize wide-field optic flow sensing in a pseudo-hemispherical configuration by mounting a number of 2D array optic flow sensors on a flexible PCB module. The flexible PCBs can be bent to form a compound eye shape by origami packaging. In this scheme, the multiple 2D optic flow sensors provide a modular, expandable configuration to meet low power constraints. The 2D optic flow sensors satisfy the low power constraint by employing a novel bio-inspired algorithm. We have modified the conventional elementary motion detector (EMD), which is known to be a basic operational unit in the insect’s visual pathways, and implemented a bio-inspired time-stamp-based algorithm in mixed-mode circuits for robust operation. By optimally partitioning the analog and digital signal domains, we can realize the algorithm mostly in the digital domain in column-parallel circuits. Only the feature extraction algorithm is incorporated inside each pixel in analog circuits. In addition, the sensors integrate digital peripheral circuits to provide modular expandability.
The on-chip data compressor can reduce the data rate by a factor of 8, so that a total of 25 optic flow sensors can be connected on a 4-wire Serial Peripheral Interface (SPI) bus. The packaged compound eye can transmit full-resolution optic flow data through the single 3 MB/sec SPI bus. The fabricated 2D optic flow prototype sensor achieves a power consumption of 243.3 pJ/pixel and a maximum detectable optic flow of 1.96 rad/sec at 120 fps and 60° FoV.
Ph.D., Electrical Engineering, University of Michigan, Horace H. Rackham School of Graduate Studies. http://deepblue.lib.umich.edu/bitstream/2027.42/108841/1/sssjpark_1.pd
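    The quoted bus figures can be sanity-checked with simple arithmetic. The even per-sensor split is an assumption; the abstract states only the totals (25 sensors, 8x compression, 3 MB/sec shared SPI bus).

```python
def spi_link_budget(n_sensors=25, bus_mb_per_s=3.0, compression=8):
    """Back-of-envelope check of the shared-SPI figures quoted above.

    Assumes the bus bandwidth is divided evenly across sensors.
    Returns (compressed, raw) per-sensor data rates in MB/sec.
    """
    per_sensor_compressed = bus_mb_per_s / n_sensors   # share of the bus
    per_sensor_raw = per_sensor_compressed * compression  # before 8x compression
    return per_sensor_compressed, per_sensor_raw

compressed, raw = spi_link_budget()
# 3 MB/sec shared by 25 sensors leaves 0.12 MB/sec each on the bus,
# corresponding to 0.96 MB/sec of raw optic-flow data per sensor
```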
